
developerWorks > Web architecture >
How to conduct a Web site competitive analysis
Hints and tips for doing it right

Thomas Myer
Co-founder, Triple Dog Dare Media
01 Oct 2002

Conducting a competitive analysis is an important part of the job if you're a usability engineer or information architect. A good competitive analysis not only produces usability metrics but also aids decision makers in their strategic goal-setting and planning. Done right, a good competitive analysis can steer a Web development project in the right direction.

The day will come when you're sitting happily at your desk and someone from marketing or business development will come into your office and ask you to do a competitive analysis for them. The company is launching a news site or portal, and the decision makers want to be sure that their site will stand up to the competition.

Suddenly, you're not just in the world of usability and information architecture -- of theories and deep thinking about cognitive psychology. You're now in the rubber-meets-the-road world of business. Although you'll be doing old-fashioned usability analysis work, you're also expected to guide the team toward increasing return on investment. You're expected to provide baseline readings from which to measure success. And you're expected to help the team snoop out what the competition is doing.

If all this sounds a little out of your league, don't worry, because it isn't. Let's start with the basics.

First things first
The first thing to realize is that a Web site competitive analysis is usually performed for a team of business specialists who know nothing about design, usability, or information architecture. They don't have a clue about labeling systems, search ergonomics, or affordance. All they want to know is what the competition is doing and how they can do it better. Obviously, your expertise is in usability and user experience design, so you'll be evaluating sites along the lines of your domain expertise, but the data you gather must always point toward making a smart business decision.

Your audience will also expect a presentation and a written report. The presentation can stick to the high points, but the report had better contain detail. They expect your findings to be well organized, moving from an executive summary to appendixes loaded with relevant details.

The end result of your analysis is a decision -- a business decision that affects the rollout of design and development. Your recommendations could be as "trivial" as adding search functionality and a look-and-feel upgrade to an already crowded deployment schedule, or they could have more far-reaching ramifications, such as an addition to the content-acquisition budget or a shift in messaging. That's why your conclusions are so important: arriving at them is not just an academic exercise.

Next we'll discuss who and what you'll be analyzing.

Who's the competition?
It's very likely that you'll be given a list of competitors. Every company that has a handle on their market space knows who the competition is. And just about every company has a list of companies on their "target list" -- that special subset of companies that they want to beat soundly in the marketplace.

Regardless, the list you get will likely be incomplete. That's because the people giving you the list will have their "business" hat on, not their "functionality" hat. For example, if the company you're doing the analysis for is in the freight cargo business, you're likely to get a list of sites or portals belonging to companies in the same business. However, it might be smart to add a consumer travel site as well, because such a site contains functionality that might be universal to all transportation applications (e.g., departure and destination points are common to both freight trucks and airline customers).

Along with a list of competitors, you'll likely get a list of items that they want you to focus on, or at least, a list of items they want to do better than the competition. For example, the team might be fixated on the number of content items deployed on their own site. If Competitor X has 500 content items, they'll want to know how many content items Competitor Y and Competitor Z have. The subtext will be, "How fast can we have more content items?"

Resist any impulses to follow subtexts at this point. To follow our example, you might dig deeper and find out that those 500 content items deployed on Competitor X's site are outdated, badly written, and generally not useful to their audience.

If the company you're doing the analysis for doesn't know who the competition is, then you'll need to do some sleuthing. Find out the company's Standard Industrial Classification (SIC) code and then look up other companies in that same category. Try to find out what the company is striving to achieve with their own Web offering and match targets appropriately. Some relevant criteria for determining worthy adversaries are geographic location, total revenues, total profits, and strong branding.

If you're the one drawing up the list, always check with someone at the company who is in the know (usually someone in marketing or business development). This can save you lots of pain and heartache later, and could also save your credibility when you deliver the results of your work.

What to analyze
Now that you have a list of competitors, you need to draw up a list of items to analyze when you visit their sites. I've developed a categorized list of items over the years, which is included below:

  • Home page. How informative is the home page? Does it set the proper context for visitors? Is it just an annoying splash page with multimedia? How fast does it load?
  • Navigation. Is the global navigation consistent from page to page? Do major sections have local navigation? Is it consistent?
  • Site organization. Is the site organization intuitive and easy to understand?
  • Links and labels. Are labels on section headers and content groupings easy to understand? Are links easy to distinguish from each other? Or are they ambiguous and uninformative ("click here" or "white paper")? Are links spread out in documents, or gathered conveniently in sidebars or other groupings?
  • Search and search results. Is the search engine easy to use? Are there basic and advanced search functions? What about search results? Are they organized and easy to understand? Do they give relevance weightings or provide context? Do the search results remind you what you searched for?
  • Readability. Is the font easy to read? Are line lengths acceptable? Is the site easy to scan, with chunked information, or is it just solid blocks of text?
  • Performance. Overall, do pages load slowly or quickly? Are graphics and applications like search and multimedia presentations optimized for easy Web viewing?
  • Content. Is there sufficient depth and breadth of content offerings? Does the content seem to match the mission of the organization and the needs of the audience? Is the site developing its own content or syndicating other sources? Is there a good mix of in-depth material (detailed case studies, articles, and white papers) versus superficial content (press releases, marketing copy)?

I provide a rating for each question on each site visited: 1=bad, 2=poor, 3=fair, 4=good, 5=outstanding. Naturally, you may want to tweak this scale to fit your needs, but it's important to have some kind of scale to make the job of comparison easier. The list of resources contains links to other criteria you can use.
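To make the ratings easy to tally later, it can help to capture them in a structured form as you go. Here's a minimal sketch in Python; the category names follow the list above, but the dict structure and the `record_score` helper are my own assumptions, not anything prescribed by the method:

```python
# The article's 1-5 rating scale.
SCALE = {1: "bad", 2: "poor", 3: "fair", 4: "good", 5: "outstanding"}

def record_score(scores, category, value):
    """Record one rating for a category, rejecting values outside the 1-5 scale."""
    if value not in SCALE:
        raise ValueError(f"score must be 1-5, got {value}")
    scores[category] = value
    return scores

# Hypothetical ratings for one site visit.
site_scores = {}
for category, value in [("Home page", 4), ("Navigation", 3),
                        ("Search and search results", 2)]:
    record_score(site_scores, category, value)

print(site_scores)  # {'Home page': 4, 'Navigation': 3, 'Search and search results': 2}
```

Validating each score as it's recorded keeps a stray typo (a 6, or a 0) from silently skewing the averages you'll compute later.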

Conducting the analysis
Now that you have a list of sites to visit and a list of criteria to compare, start your analysis. Be sure to conduct your analysis with some rigor. Don't be haphazard, and don't do things differently with each site visit. Try to analyze a site without interruption. In other words, do everything you can to reduce bias in your investigation.

Here are some additional guidelines:

  1. Visit one site at a time, and take the same (or at least, similar) paths through each site. Follow the checklist of criteria.
  2. For each criterion, take lots of notes. You'll refer to these notes when you organize and write your report.
  3. Try to give a score for each criterion as you complete them. That way you'll have scores for each major category as well as for each site.
  4. If the company that you're doing the analysis for has an existing site, remember to rate it last. Visiting the competitors first lends your evaluation some objectivity, and the company's own score provides a good measurement comparison for the readers of your report.
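Scores collected per criterion can then be rolled up into the per-category and per-site totals mentioned in step 3. A small sketch, again assuming a hypothetical nested structure (site name mapping to category scores) of my own devising:

```python
from statistics import mean

# Hypothetical ratings: site -> category -> score on the 1-5 scale.
ratings = {
    "CompetitorX.com": {"Home page": 4, "Navigation": 3, "Content": 2},
    "CompetitorY.com": {"Home page": 3, "Navigation": 4, "Content": 5},
}

# Overall score per site: mean of that site's category scores.
site_totals = {site: mean(cats.values()) for site, cats in ratings.items()}

# Score per category across all sites, for side-by-side comparison.
categories = ratings["CompetitorX.com"].keys()
category_totals = {c: mean(r[c] for r in ratings.values()) for c in categories}

print(site_totals)
print(category_totals)
```

The two roll-ups answer different questions: site totals rank the competitors overall, while category totals show where the whole field is weak -- and therefore where a new site could stand out.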

When you're ready, you'll need to do some number crunching. Although a discussion of statistical methods could easily fill several books (and has), there are, at minimum, a handful of important calculations to make for each site:

  • Mean: The mean is derived by adding all values in a set and dividing by the number of items in the set. For example, in a data set comprising scores of 3, 4, 4, 5, 3, 2, and 4, the mean would be 25 / 7, or 3.57.
  • Median: The median is derived by lining up all values in a data set from smallest to largest and picking the one that's right in the middle. To continue our example, in the sorted data set of 2, 3, 3, 4, 4, 4, and 5, the median is 4, the fourth of the seven values. (With an even number of values, average the two middle values.) Some feel that the median is a better representation of an "average" score, but I think that using both the mean and the median gives you a better overall picture.
  • Mode: The mode is the value that occurs most frequently in a data set. In our example data set, the mode would be 4 (there are more 4s than any other value).
  • Maximum, minimum, and spread: The maximum value in a data set is the largest value, and the minimum value is the smallest. The spread is the difference between these two values. To complete our example, the minimum value is 2, the maximum value is 5, and the spread is 3.
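All of these calculations are available in Python's standard statistics module, so the number crunching for each site takes only a few lines. Using the example data set of 2, 3, 3, 4, 4, 4, and 5:

```python
from statistics import mean, median, mode

scores = [2, 3, 3, 4, 4, 4, 5]

print(f"mean:   {mean(scores):.2f}")  # 25 / 7, about 3.57
print(f"median: {median(scores)}")    # middle value of the sorted set: 4
print(f"mode:   {mode(scores)}")      # most frequent value: 4
print(f"min: {min(scores)}, max: {max(scores)}, "
      f"spread: {max(scores) - min(scores)}")
```

One caveat: if two values tie for most frequent, `statistics.mode` on older Python versions raises an error (newer versions return the first mode encountered), so it's worth a glance at the raw counts before reporting a single mode.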

Together, these values (mean, median, mode, maximum value, minimum value, and spread) start to tell a story. They don't tell the whole story, but they certainly illustrate and make plain the results of your work. For example, Web sites that have means and medians that are far apart indicate more weight on extreme ends of the scale (either more 1s or 5s in the established rating system). Mode values that are significantly different from medians and/or means also indicate clumping of values away from the normal, expected curve. Web sites with large spreads between minimum and maximum values might indicate a high level of inconsistency in the different portions of the site; in other words, a site might have poor search functionality but excellent content organization and site navigation.

You must remember one thing: the numbers you assign to any part of a Web site are, as much as you'd hate to admit it, somewhat arbitrary. Although you may be an expert at usability or information architecture, any number of factors can cause bias to enter the process. You might be in a hurry, have a pressing deadline distracting you, or your mind may wander while you're finishing an evaluation. You might be evaluating a Web site belonging to a big competitor, and there may be some tacit pressure to downgrade any scores you give them.

Be as fair as you possibly can, and make it understood that the numbers you assign are subjective scores, not the results of ironclad science. They're assigned and used primarily to have something quantifiable to point to and discuss, instead of just guesses and raw opinion.

You can perform this task of crunching numbers manually or with a spreadsheet. Excel and other spreadsheet tools provide built-in functions for calculating means, medians, modes, and other statistical values.

Writing the report
Eventually, you'll need to take all your notes and all those numbers you've crunched and put them in a report. Most usability engineers and information architects I've met would rather do anything than write, but this is one case where what you write is as important as all the other work you've done.

Why? Because your report will be used by decision makers, and I don't mean as filler in their inbox, either. They'll read it, digest your findings and conclusions, and try to make decisions that affect company strategy -- or at least, Web site deployment strategy.

Writing a report isn't that difficult; in fact, it's about the easiest piece of writing that you'll ever undertake. Why? Because a report is very structured, and the structure can aid your writing. A good report shouldn't contain any surprising twists and turns. In fact, the readers of your report will be expecting something along these lines:

  • An executive summary, which distills the entire report. You'll probably write this section last. Its subsections should summarize why you undertook the analysis, how the sites ranked, and your recommendations for further action.
  • A methods section, in which you explain the methodology you employed for selecting and rating the sites, including what criteria you looked at. This section provides insight into your thinking when you undertook the analysis.
  • A findings section, in which you summarize your findings for each site. Start each subsection with the name of the site, the site's URL, and the overall score for the site. Then go through each part of the site and describe how it ranked, including a site section score. Do this for each site. The findings section will comprise the bulk of your report.
  • A discussion & recommendations section, in which you provide future direction for the team. This is the appropriate section to suggest integrating other sites' best practices into the site being deployed by the company.
  • One or more appendixes, in which you provide detailed information. It's appropriate to list raw data of your findings here.

As for process, the best approach is to create a file in your favorite word processor and fill in all the headers that mark the sections. This sets up an informal outline that you can "fill in" as you go. My advice would be to write the methods section first, as you know what methodology you employed. Writing this section first will loosen you up and get the writing flowing.

Next, write the findings section. This section is the longest of the entire report and will take you at least a day, if not more, of solid work to complete. Once you've finished with the methods and findings sections, knock out your recommendations and then complete the executive summary.

Add the appendixes to the back, and let the report rest for at least a day. Then go through it again, from top to bottom, and clean up the verbiage. Remember that shorter is better. If you can say something in 10 words, find a way to say it in 7 or 8. Cut out as many adverbs and adjectives as possible. Remember that those reading your report will want to get to the heart of the matter and won't appreciate flowery language.

When you're happy with it, give the report to someone else and have them review it. Don't pick a pushover or someone who will return it with hardly any comments, either. Pick someone with a discriminating eye -- someone who will ask lots of questions and nitpick. The more you cover in your report, the less stupid you'll feel when you give your presentation.

Giving the presentation
Giving a presentation strikes more fear into people's hearts than writing does. Usually, this fear stems from nervousness, not knowing the subject matter, or fear of boring the audience. However, the kind of presentation you'll be giving isn't any cause for concern, because all the obstacles have been removed for you:

  • You know the subject matter intimately.
  • Your audience is genuinely interested in what you have to say.
  • The subject matter is bound to captivate the audience.

When you give your presentation, avoid the impulse to talk to the slides. Instead, use the slides as visual confirmation of what you're saying. Speak with an easy, even tone, as though you were telling a group of friends something important.

I personally don't believe in using PowerPoint slides whenever I give a speech, but for this kind of presentation, you'll need a few well-chosen slides that highlight your findings and recommendations. My advice is to create 5-7 slides with bullet points and/or data tables for this purpose.

Start by introducing yourself and then launch into why you performed the competitive analysis. As with writing, providing this information first will loosen you up; after all, you know both of these topics very well.

Next, talk about your methodology, and get on to the findings as soon as you can. Don't give a blow-by-blow of each of your findings -- instead, summarize, and use visuals to punctuate your summaries. For example, instead of talking about each segment of each site, provide a summary of where each site succeeded and failed, and provide that information as a table on a slide.

Finally, follow with your recommendations, and then open up the floor to Q&A. With any luck, the process of analyzing competitors' sites, writing (and polishing) the report, and rehearsing your presentation will mean that you're well prepared for any and all questions. If you do get a question you don't know the answer to, don't squirm, equivocate, or sidestep. Tell the audience that you don't know the answer and that you'll find out. Then follow up appropriately.

You can distribute the report as an email attachment or as hard copy at the meeting, or both. It's my opinion that handing out a hard copy report is good, as this gives the decision makers something tangible to hold. Don't give out copies of the report until the end of the presentation; otherwise, you risk having your audience looking at the report instead of listening to you.

Conducting a competitive analysis is an important part of your job as a usability engineer or information architect. A good analysis and subsequent report can provide the necessary information to influence a decision regarding Web site deployment. Done right, a competitive analysis can steer a team in the right direction, as well as lend credibility to your career and position in the company.


About the author
Thomas Myer is the co-founder of Triple Dog Dare Media, an Austin, Texas-based Web application development firm. For the past seven years, he's been involved in information architecture, Web application development, and interactive content development. He welcomes discussion.
