The key to ranking well in Google is optimizing the visible keywords on a page. As I mentioned in Part 1 of this series, some webmasters back in the early days of SEO thought that stuffing all available areas full of keywords was the way to rank highly in the search engines. These early black hat SEO techniques weren't completely successful because the keywords often didn't match the actual content on the page, misleading both search engines and users. Still, those early black hat techniques were almost on the right path: keyword optimization is important in helping your site rank well in search engine queries.
In this installment, you'll learn the white hat SEO top-left-down method of keyword optimization, which helps you choose and optimize the right keywords for your site. You'll also get information on other white hat SEO strategies and learn how to address infrastructure concerns to improve your site's attractiveness to search engines.
You only need to take two steps for a successful keyword strategy:
- Keyword selection: Determine what your pages offer. Then determine which words your potential audience might use to search for your page and create keywords based on those words.
- Keyword optimization: Apply these keywords to the appropriate pages (3 to 5 keywords per page is the recommended amount) and optimize from the top left, and then down. Often this means the first 200 words on your page -- title tag, headings, abstract, and such.
Basically the closer to the top left your keywords are placed, the more weight Google gives to them. I call this the top-left-down strategy of keyword optimization. Users will initially view your Web site the same way the spider does, so emphasizing the keywords from the top-left-down is a good Web design practice as well (see Eye tracking and search behavior).
To succeed with your keyword strategy, your best bet is to place your keywords near the top of your page. Many other factors come into play for the entire search engine ranking, including -- but not limited to -- your links in and out, your use of redirects, and other infrastructure concerns. But the first step of SEO is choosing your keywords.
The most important task of keyword optimization is to determine whether your pages are optimized for and rank highly in search engines for keywords that people actually want to find. If no one is searching for your targeted keywords, it won't matter how highly the site is ranked in the search engines. This is actually the first part of any keyword strategy -- and a step that is often overlooked.
What if your pages are already built?
You've probably overlooked this first step, and are now looking at your live Web pages and wondering if it's too late to select your keywords. It's not too late. You can determine your keywords before or after the page is live, although I recommend doing it beforehand so you don't need to rewrite the text on your page.
If your Web pages are already live, you have keywords. But they might not be the right ones. Or they may be the right ones but aren't fully optimized. You can still use a thorough keyword selection process to make sure you optimize the right keywords.
Also, you can run live pages through keyword analysis tools to get a better idea of how the search engines see the pages. (See Resources for a list of page analysis tools.)
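To make this concrete, here's a minimal sketch of what such an analysis tool does at its core: strip the markup from a page and measure how often a keyword appears in the visible text. The sample markup and figures are invented for illustration, and real analysis tools weigh title tags and headings more heavily than plain body text.

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible text from an HTML page, skipping script/style."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def keyword_density(html, keyword):
    """Return (occurrences, total_words, density %) for a one-word keyword."""
    parser = TextExtractor()
    parser.feed(html)
    words = " ".join(parser.chunks).lower().split()
    hits = words.count(keyword.lower())
    return hits, len(words), round(100.0 * hits / max(len(words), 1), 1)

page = "<html><head><title>Linux roadmap</title></head><body><h1>Linux roadmap</h1><p>Move to Linux today.</p></body></html>"
print(keyword_density(page, "linux"))  # → (3, 8, 37.5)
```

A count like this only gathers raw information; as noted above, don't base an entire SEO campaign on tool output alone.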
If you've already built your pages and think you have already chosen appropriate keywords, you can skip ahead to Rank checking to see if your pages are fully optimized.
Many specialized SEO tools can help you determine the popularity and the competitiveness of your possible keywords (see more on SEO keyword tools). The main ideas to keep in mind as you create your keyword list are:
- Popularity: Do people search for your keywords?
- Competitiveness: How many other pages target your keywords? Should you have more specific keywords?
Let's look at the developerWorks Web site as an example of keyword selection. The main page (www.ibm.com/developerworks/) of the Web site is the number-one result in Google for a search on "developerWorks." That's not bad news for us, yet it isn't an SEO success story, either: a user searching for developerWorks would already know where to go. Our target readers for the developerWorks main page are developers looking for resources on the many technologies and brands that IBM supports. The keywords we target are "IBM resource developers" because this broad page matches the broader searches of our potential audience.
To reach the audience that is "foraging for information" (see Jakob Nielsen in Part 1) through search engine queries, you need to determine who isn't finding your page, but should be. You want to optimize your page for the foragers.
Another concern for keyword selection is to determine whether the keyword is too popular or competitive. If too many pages compete for high rankings using your keywords, you might need to select more specific keywords. This is true also for keywords that may have several meanings. You must examine how users may search for your page. What specific question does the content of your page answer? Refine your keywords to answer these questions.
If thousands of pages surface in the SERPs for the keyword that you think describes your page the best, you need to think of how the searcher might deal with this situation. A searcher doesn't want to click through page after page of SERPs; instead, he or she will probably type in a second word or change the term entirely. You need to figure out what distinguishes your pages from your competitors' -- and what makes them searchable. You can always start with more general keywords if your top-level pages provide overviews for varied types of content, and then be more specific on more specialized second-level pages deeper in the directory.
For example, a search for "java" in Google retrieves everything from coffee to the geographical location. But a search for "java technology tutorials" returns the developerWorks Java technical library (www.ibm.com/developerworks/views/java/library.jsp), which is chock-full of links to Java™ tutorials. This is why more specific keywords can help you reach a user looking for a specific type of Web page.
More on keyword refinement
When you refine your keywords, keep in mind that a large portion of searches are three words or more. As people search for answers, they often phrase their search terms as questions, describing the problem rather than the solution. To optimize your pages, remember to think like a searcher.
Also remember that one size does not fit all when optimizing pages. If you find that each of your pages has the same keywords, then maybe you need to change your Web pages to make them more specific. Each page needs to reflect how users gather information: often from broader to more specific. More specific pages in your navigation should have more specific keywords.
If you have trouble determining your page's keywords, talk to the people who write the content for the pages on your site and learn more about the pages so you can decide what keywords are suitable. Ask yourself what questions your Web site answers. If you don't know how the nonmarketing world discusses the problems that your Web pages can solve, visit discussion forums or blogs that cover the same topics.
Once you select your keywords, it's time to apply them to your Web pages. The page text is the most important aspect of your page to the search engines. Search engines give more weight to titles, headings, and emphasized text. That's why the first 200 words are so important -- they're the basis of the top-left-down keyword optimization strategy. Beyond those 200 words and the emphasized text on your page, concentrate on making sure your body text contains keywords.
Here's what you should optimize and why:
- Title tag (<title>): All SEO experts agree the title tag is the most important tag on the page. It's the first word or words the spider encounters and it's the title of your page's listing in the SERPs. Use this tag to your advantage; that is, make sure your keywords are listed here.
- Page heading and subheadings: The page heading (also called page title) and subheadings (also called subtitles) are the next most important features on the page. The page headings and subheadings should describe the page in a way that's relevant to users as well as search engines. Some sites use graphics for these important tags -- but spiders can't read graphics, so these crucial tags are wasted if you use graphics instead of text.
- Abstract: Besides the title and page headings, your page should have some text that describes what the page is about. This will be pulled into the listing for your page on the SERPs, so make it keyword-rich in a way that pleases spiders and users. Make the most of this area so you're using the top 200 words of your page to the best advantage.
- Major headings and terms in bold and italics: Words in major heading tags and words in bold and italics will also influence your ranking. These tags tell the user the highlighted terms are important to the page, and the spider sees them the same way. Make sure the emphasized words are keyword rich.
- Body of the text: Don't forget to ensure your keywords are in the text. If you have trouble doing this, you might have chosen the wrong keywords.
- Hyperlinks: If you link to your own page, make the link text (the hot words) describe the destination with your keywords. Don't use a bare URL as the link text.
The first 200 words and the most emphasized words on your page should be keywords. After that, ensure that your body text contains keywords.
Keyword refinement and optimization in practice
I'll use the developerWorks Windows to Linux Roadmap overview page (www.ibm.com/developerworks/linux/library/l-roadmap.html) as an example. This roadmap is for developers who want to migrate from Windows to Linux, and the overview page is optimized broadly for terms that a developer might use in a search. I've highlighted the keywords as they appear in the top tags and top 200 words.
Figure 1. Linux roadmap with keywords highlighted
In the code sample below, I pulled out all of the pertinent coding from the HTML, plus the top 200 words after the title, page heading, and subheading:
<title>Windows-to-Linux roadmap: Overview</title>
<h1>Windows-to-Linux roadmap: Overview</h1>
<em>A roadmap for developers making the transition to Linux</em>
<p>Level: Introductory</p>
<p>Chris Walden (<a href="mailto:email@example.com">firstname.lastname@example.org</a>), e-business Architect, IBM<br /></p>
<p> 11 Nov 2003</p>
<blockquote>IBM e-business architect Chris Walden is your guide through a nine-part developerWorks series on moving your operational skills from a Windows® to a Linux® environment. He covers everything from logging to networking, and from the command-line to help systems -- even compiling packages from available source code.</blockquote>
<p>You're moving from Windows to Linux. You've decided you want the stability, flexibility, and cost savings of Linux, but you have many questions in your head. Isn't Linux like UNIX? Isn't UNIX hard? Where do you begin to make sense of all of this? Is there a map you can follow?</p>
<p>This roadmap is designed to help you take the experience and knowledge that you already have in computing and redirect it to working in Linux. It's not the only reference you'll ever need, but it will help you get past some of your first obstacles and adjust to a new and, I think, exciting approach to computing. As you follow this roadmap, you'll discover many new resources to help you learn, troubleshoot, and manage Linux.</p>
Note: In the preceding code example, the code was split to multiple lines for viewing purposes.
As the Linux roadmap becomes more detailed, the keywords become more specific. This overview page is optimized more broadly because it covers the entire series.
After you refine your keywords and optimize your pages, it's time for the task that makes up the bulk of all SEO work: rank checking. Check whether your pages now rank well for your targeted keywords.
When you do rank checking, keep a record of what keywords you have searched for, which pages are listed, and what the ranking is in the SERPs. If you don't find any of your site's pages in the first three pages of the search engine results (or top 30 results), then consider it a loss and concentrate on what you can do to improve your ranking for these specific keywords.
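One lightweight way to keep that record is a small script that logs each check and flags the keywords that missed the top 30. The sketch below is hypothetical: the rank numbers are invented for illustration, and in practice you'd fill them in from your own searches or a rank-checking tool.

```python
import csv
import io
from datetime import date

def record_rankings(rows):
    """Serialize (keyword, url, rank) results to CSV; rank 0 means 'not in the top 30'."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["date", "keyword", "url", "rank"])
    for keyword, url, rank in rows:
        writer.writerow([date.today().isoformat(), keyword, url, rank])
    return buf.getvalue()

def needs_work(rows, cutoff=30):
    """Keywords for which no page of yours ranks in the top `cutoff` results."""
    ranked = {kw for kw, _, rank in rows if 0 < rank <= cutoff}
    return sorted({kw for kw, _, _ in rows} - ranked)

# Hypothetical rank-check results (the rank figures are made up).
checks = [
    ("java technology tutorials", "www.ibm.com/developerworks/views/java/library.jsp", 1),
    ("ibm resource developers", "www.ibm.com/developerworks/", 45),
    ("linux roadmap", "www.ibm.com/developerworks/linux/library/l-roadmap.html", 0),
]
print(needs_work(checks))  # → ['ibm resource developers', 'linux roadmap']
```

Keeping the dated CSV around lets you compare rankings over time and see whether your refinements are paying off.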
If your pages don't do as well as you thought they would, you might need to refine your keywords or explore other factors that will influence your ranking in search engines.
I admit that organic SEO is more than a good keyword optimization strategy that employs the top-left-down method. However, keyword optimization is a valid white hat SEO practice because determining and focusing on what your pages are about improves the user experience. And once your pages are optimized, you never need to go back and change them as a reaction to algorithm changes in the search engines -- as you would with black hat SEO practices.
The following are just a few other SEO concerns. More detailed SEO improvements to infrastructure will be discussed in Parts 3 and 4 of the series.
Other SEO improvements include the following:
- Employ a good linking strategy.
Link-based ranking improvements in the SERPs are earned on a page-by-page basis. But a link doesn't necessarily help your page unless Google considers it a "good link." To increase Google's rating of a specific page, and therefore its ranking in the SERPs, your page needs to be linked to from other highly ranked pages. You can determine which pages Google ranks highly by their positions in the SERPs for the keywords those pages target. Some SEO experts use the PageRank rating in the Google Toolbar to determine the Google-worth of another page (see more on PageRank).
You can only really control your outbound links, so make sure they are good ones, however you determine the value of the links. And don't encourage spammy sites, such as obvious link farms, to link to your page.
When you link to your own pages, make sure the hot part of the link is the term the page is optimized for. For example, "Find tons more developer resources from developerWorks."
- Check for broken links and correct HTML.
If Google is spidering links to and through your site and encounters a broken link, it will stop the crawl. Make sure your HTML is well-formed and that all of your links work as they should. Site users will appreciate this as well.
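A broken-link check can start with something as small as this sketch, which uses Python's standard-library HTML parser to pull every href out of a page so each one can then be requested and verified. The sample markup is invented, and the network check itself is omitted here.

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects every href on a page so the links can be verified afterward."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

def extract_links(html):
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links

page = '<p>See the <a href="/developerworks/linux/">Linux zone</a> and <a href="l-roadmap.html">roadmap</a>.</p>'
print(extract_links(page))  # → ['/developerworks/linux/', 'l-roadmap.html']
```

Each collected link could then be requested (for example, with urllib) and any response error logged for repair.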
- Use redirects sparingly.
Google's spider prefers to directly access content. Redirects can be interpreted by Google as spam techniques and possibly can be mistaken for a doorway page or cloaking. If the search engine spider has trouble traveling through your site because of improper or excessive redirecting, your rankings in the search engine will suffer -- so avoid a redirect unless it's absolutely necessary.
If you can't avoid using a redirect, try to use a 301 server redirect. A server redirect tells the Google spider that the page has moved permanently and to treat the page at the other end of the 301 as the new URL. The Google spider will choke on a meta refresh redirect, and a 302 redirect can cause duplicate-content penalties. Redirects will be covered in more depth in Parts 3 and 4.
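If your site runs on Apache, for example, a permanent server redirect can be a single mod_alias directive in the server configuration or an .htaccess file. The file names and domain below are hypothetical:

```apache
# Send visitors and spiders from the old URL to the new one with an HTTP 301.
Redirect permanent /old-overview.html http://www.example.com/new-overview.html
```

Other Web servers have equivalent mechanisms; the point is to issue the 301 from the server rather than from a meta refresh in the page.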
- Avoid URL parameters.
Numerous URL parameters are a more common issue with larger sites that use content management systems. The problem with parameters is that the spider might choke on URL characters such as the ampersand (&).
Google advises using no more than one or two parameters. Basically, the longer and more complicated a URL is, the worse it is for search. You can think of it in terms of usability, too: The best URL is one that's easy to remember.
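That rule of thumb is easy to check mechanically. Here's a small sketch using Python's standard urllib.parse; the URLs are invented for illustration.

```python
from urllib.parse import urlsplit, parse_qs

def parameter_count(url):
    """Count the distinct query-string parameters in a URL."""
    return len(parse_qs(urlsplit(url).query, keep_blank_values=True))

def spider_friendly(url, limit=2):
    """Apply the rule of thumb above: one to two parameters at most."""
    return parameter_count(url) <= limit

print(spider_friendly("http://www.example.com/library.jsp?view=java"))            # → True
print(spider_friendly("http://www.example.com/doc?id=7&cat=3&sess=9f2&sort=asc"))  # → False
```

Running a check like this over a sitemap is a quick way to find the CMS-generated URLs most likely to give a spider trouble.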
- Have good navigation.
Navigation is still very useful for spiders and for people who get to your site through searches rather than by browsing through it (the old-fashioned way) -- a pattern also known as "bottom-up search." Once people get into the site, you need to lure them deeper with links and good navigation. A page that is search-friendly but has no links into the rest of your Web site doesn't make it easy for people to go any farther.
- Minimize use of Flash and other graphics.
Using Flash for important aspects of your site, or images instead of text, isn't good for search engines or for your potential users. A screen reader can't read the Flash or the text in images -- and neither can the search engine.
- Avoid anything that makes it hard for the spider to crawl your site.
- Don't overdo it.
Now that you have all of this information, you might be tempted to stuff your top 200 words with keywords and forget the rest. But that isn't the white hat SEO way. Don't overoptimize the site just to get number-one rankings. Your goal is to create pages that bring users in from the search engine -- and once your audience reaches your site, you want them to stay. Create your pages with a good user experience in mind, and your users will thank you -- and the search engines will reward you.
For more white hat SEO best practices, check out Google's requirements for Web masters in Resources. The more technical SEO concerns and fixes will be discussed in Parts 3 and 4 of this series.
SEO sites are filled with lists of strategies and guesses for what will ensure a top ranking in Google (and other search engines). You can check out Resources for a list of just a few. But good writing is the key to surfacing high in the search results and creating a page that is usable and accessible. Creating a well-reasoned keyword strategy is the most important SEO task you can perform to meet both the needs of spiders and your potential audience. You should be able to start optimizing immediately with the information from Parts 1 and 2 in this series.
In the final two parts of this series, you'll learn how to discover whether the search engines can find your site, and how to tell if your site was penalized by the search engines. You'll also receive tips for corporate SEO.
Resources
- Did-it, Enquiro, and Eyetools uncover Search's Golden Triangle: Get all of the details on this eye-tracking study, and scroll down for other good search behavior research, including Gord Hotchkiss's white paper "Inside the Mind of the Searcher."
- Lessons Learned from Eye Tracking Studies: Follow up on the eye tracking study with Chris Sherman's analysis at Search Engine Watch.
- Overture keyword selector tool: Research popular keywords with this tool.
- Advanced Google Operators: Use these commands to inform your keyword strategies.
- Marketleap: With these browser-based tools, analyze your pages and rankings.
- Free browser-based tools: Explore several sites that offer SEO-related tools. As with anything free, you get what you pay for. Tools such as these are good for information gathering, but don't base your entire SEO campaign around information or suggestions from these tools.
- Google's Web master guidelines: Follow these recommendations to create a spider-able and user-friendly site.
- 26 Steps to 15k a Day: Check out WebmasterWorld CEO Brett Tabke's respected guidelines.
- Top Ten List Of Search Ranking Factors: Catch SEO expert Danny Sullivan's list.
- Search Engine Marketing, Inc.: Driving Search Traffic to Your Company's Web Site: Check out Bill Hunt and Mike Moran's book -- the title says it all.
Get products and technologies
- IBM trial software: Build your next development project with software downloads available directly from developerWorks.
- Search Engine Watch forums: See what SEO experts say about SEO tools, PageRank, and other SEO concerns.
- High Rankings forums: Get advice on creating usable, search-engine friendly pages.
L. Jennette Banks has been with IBM developerWorks since 2000 as a Web Editor and worked on organic search engine optimization for developerWorks since 2001. When not optimizing the developerWorks Web site, she enjoys kittens, puppies, and long walks on the beach. Jennette lives with her partner, two ornery cats, and one giant fluffy dog in a small community outside of Research Triangle Park in North Carolina.