How to conduct a Web site competitive analysis
Conducting a competitive analysis is an important part of the job if you're a usability engineer or information architect. A good competitive analysis not only produces usability metrics but also aids decision makers in their strategic goal-setting and planning. Done right, it can steer a Web development project in the right direction.
The day will come when you're sitting happily at your desk and someone from marketing or business development will come into your office and ask you to do a competitive analysis for them. The company is launching a news site or portal, and the decision makers want to be sure that their site will stand up to the competition.
Suddenly, you're not just in the world of usability and information architecture -- of theories and deep thinking about cognitive psychology. You're now in the rubber-meets-the-road world of business. Although you'll be doing old-fashioned usability analysis work, you're also expected to guide the team toward increasing return on investment. You're expected to provide baseline readings from which to measure success. And you're expected to help the team sniff out what the competition is doing.
If all this sounds a little out of your league, don't worry -- it isn't. Let's start with the basics.
First things first
Your audience will expect both a presentation and a written report. The presentation can knock the tops off the mountains, but the report had better have some detail in it. They'll expect your findings to be well organized, moving from executive summary to appendixes loaded with relevant details.
The end result of your analysis is a decision -- a business decision that affects the rollout of design and development. Your recommendations could be as "trivial" as adding search functionality and a look-and-feel upgrade to an already crowded deployment schedule, or they could have more far-reaching ramifications, such as an addition to the budget for content acquisition or a shift in messaging. That's why your conclusions are so important. Arriving at them is not just an academic exercise.
Next we'll discuss who and what you'll be analyzing.
Who's the competition?
Whatever list you're given will likely be incomplete. That's because the people supplying it will have their "business" hat on, not their "functionality" hat. For example, if the company you're doing the analysis for is in the freight cargo business, you're likely to get a list of other sites or portals belonging to companies in the same business. However, it might be smart to add sites like travelocity.com, which specializes in consumer travel, because their site contains functionality that might be universal to all transportation applications (e.g., departure and destination points are common to freight trucks and airline customers).
Along with a list of competitors, you'll likely get a list of items that they want you to focus on, or at least, a list of items they want to do better than the competition. For example, the team might be fixated on the number of content items deployed on their own site. If Competitor X has 500 content items, they'll want to know how many content items Competitor Y and Competitor Z have. The subtext will be, "How fast can we have more content items?"
Resist the impulse to follow such subtexts at this point. To continue our example, you might dig deeper and find that those 500 content items deployed on Competitor X's site are outdated, badly written, and generally not useful to its audience.
If the company you're doing the analysis for doesn't know who the competition is, then you'll need to do some sleuthing. Find out the company's Standard Industrial Classification (SIC) code and then look up other companies in that same category. Try to find out what the company is striving to achieve with their own Web offering and match targets appropriately. Some relevant criteria for determining worthy adversaries are geographic location, total revenues, total profits, and strong branding.
If you're the one drawing up the list, always check with someone at the company who is in the know (usually someone in marketing or business development). This can save you lots of pain and heartache later, and could also save your credibility when you deliver the results of your work.
What to analyze
I provide a rating for each question on each site visited: 1=bad, 2=poor, 3=fair, 4=good, 5=outstanding. Naturally, you may want to tweak this scale to fit your needs, but it's important to have some kind of scale to make the job of comparison easier. The list of resources contains links to other criteria you can use.
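To make the scoring concrete, here's a minimal sketch in Python of how such a ratings sheet might be kept. The site names and evaluation questions are hypothetical, not drawn from any particular checklist:

```python
# Hypothetical ratings sheet: one dict per site, mapping each
# evaluation question to a score on the 1-5 scale described above.
RATING_SCALE = {1: "bad", 2: "poor", 3: "fair", 4: "good", 5: "outstanding"}

ratings = {
    "competitor-x.com": {
        "search functionality": 2,
        "content organization": 4,
        "site navigation": 4,
    },
    "competitor-y.com": {
        "search functionality": 5,
        "content organization": 3,
        "site navigation": 2,
    },
}

# Print each score alongside its label so reviewers can scan the sheet.
for site, scores in ratings.items():
    for question, score in scores.items():
        print(f"{site}: {question} = {score} ({RATING_SCALE[score]})")
```

Keeping every score on the same scale, with the same set of questions per site, is what makes the later side-by-side comparison straightforward.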
Conducting the analysis
Here are some additional guidelines:
When you're ready, you'll need to do some number crunching. Although a discussion of statistical methods could easily fill several books (and has), there are, at minimum, a handful of important calculations to make for each site:
Together, these values (mean, median, mode, maximum value, minimum value, and spread) start to tell a story. They don't tell the whole story, but they certainly illustrate and make plain the results of your work. For example, a site whose mean and median are far apart has a skewed distribution of scores -- extreme ratings (more 1s or 5s in the established rating system) are pulling the mean away from the median. A mode that differs significantly from the median and/or mean also indicates clumping of values away from the normal, expected curve. A large spread between minimum and maximum values might indicate a high level of inconsistency across different portions of the site; in other words, a site might have poor search functionality but excellent content organization and site navigation.
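As a sketch of the arithmetic, Python's standard statistics module computes all six values directly. The scores below are invented for illustration:

```python
import statistics

# Hypothetical 1-5 scores for one site, one per evaluation question.
scores = [1, 2, 2, 2, 3, 4, 5, 5, 4, 3]

mean = statistics.mean(scores)      # arithmetic average
median = statistics.median(scores)  # middle value when sorted
mode = statistics.mode(scores)      # most frequent score
lowest, highest = min(scores), max(scores)
spread = highest - lowest           # distance between the extremes

print(f"mean={mean} median={median} mode={mode} "
      f"min={lowest} max={highest} spread={spread}")
```

For this made-up site, the mean (3.1) and median (3) sit close together, so there's no heavy skew; a large gap between them would flag a pile-up of extreme scores worth investigating.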
You must remember one thing: the numbers you assign to any part of a Web site are, as much as you'd hate to admit it, somewhat arbitrary. Although you may be an expert at usability or information architecture, any number of factors can cause bias to enter the process. You might be in a hurry, have a pressing deadline distracting you, or your mind may wander while you're finishing an evaluation. You might be evaluating a Web site belonging to a big competitor, and there may be some tacit pressure to downgrade any scores you give them.
Be as fair as you possibly can, and make it understood that the numbers you assign are subjective scores, not the results of ironclad science. They're assigned and used primarily to have something quantifiable to point to and discuss, instead of just guesses and raw opinion.
You can perform this task of crunching numbers manually or with a spreadsheet. Excel and other spreadsheet tools provide built-in functions for calculating means, medians, modes, and other statistical values.
Writing the report
Your report will be used by decision makers -- and I don't mean as filler in their inbox, either. They'll read it, digest your findings and conclusions, and try to make decisions that affect company strategy -- or at least, Web site deployment strategy.
Writing a report isn't that difficult; in fact, it's about the easiest piece of writing that you'll ever undertake. Why? Because a report is very structured, and the structure can aid your writing. A good report shouldn't contain any surprising twists and turns. In fact, the readers of your report will be expecting something along these lines: an executive summary, a description of your methods, your findings, your recommendations, and appendixes with the supporting details.
As for process, the best approach is to create a file in your favorite word processor and fill in all the headers that mark the sections. This sets up an informal outline that you can "fill in" as you go. My advice would be to write the methods section first, as you know what methodology you employed. Writing this section first will loosen you up and get the writing flowing.
Next, write the findings section. This section is the longest of the entire report and will take you at least a day, if not more, of solid work to complete. Once you've finished with the methods and findings sections, knock out your recommendations and then complete the executive summary.
Add the appendixes to the back, and let the report rest for at least a day. Then go through it again, from top to bottom, and clean up the verbiage. Remember that shorter is better. If you can say something in 10 words, find a way to say it in 7 or 8. Cut out as many adverbs and adjectives as possible. Remember that those reading your report will want to get to the heart of the matter and won't appreciate flowery language.
When you're happy with it, give the report to someone else and have them review it. Don't pick a pushover or someone who will return it with hardly any comments, either. Pick someone with a discriminating eye -- someone who will ask lots of questions and nitpick. The more you cover in your report, the less stupid you'll feel when you give your presentation.
Giving the presentation
I personally don't believe in using PowerPoint slides whenever I give a speech, but for this kind of presentation, you'll need a few well-chosen slides that highlight your findings and recommendations. My advice is to create 5-7 slides with bullet points and/or data tables for this purpose.
When you give your presentation, avoid the impulse to talk to the slides. Instead, use the slides as visual confirmation of what you're saying. Speak with an easy, even tone, as though you were telling a group of friends something important.
Start by introducing yourself and then launch into why you performed the competitive analysis. As with writing, providing this information first will loosen you up; after all, you know both of these topics very well.
Next, talk about your methodology, and get on to the findings as soon as you can. Don't give a blow-by-blow of each of your findings -- instead, summarize, and use visuals to punctuate your summaries. For example, instead of talking about each segment of each site, provide a summary of where each site succeeded and failed, and provide that information as a table on a slide.
Finally, follow with your recommendations, and then open the floor to Q&A. With any luck, the process of analyzing competitors' sites, writing (and polishing) the report, and rehearsing your presentation will mean that you're well prepared for any and all questions. If you do get a question you don't know the answer to, don't squirm, equivocate, or sidestep. Tell the audience that you don't know the answer and that you'll find out. Then follow up appropriately.
You can distribute the report as an email attachment or as hard copy at the meeting, or both. It's my opinion that handing out a hard copy report is good, as this gives the decision makers something tangible to hold. Don't give out copies of the report until the end of the presentation; otherwise, you risk having your audience looking at the report instead of listening to you.