Inspire action. Design research insights result from quality data, strong synthesis, thoughtful analysis, and clearly articulated implications. High-quality insights are never a travelog; they should inform what the team will make next.
An assumption is something you think might be true but needs to be verified. Identifying and addressing assumptions is key to mitigating risk. Start with what you know and what you do not know. It can be easy to get bogged down in the “what ifs” or the “need to know everything” feeling. If that happens, try prioritizing your assumptions and the things you don’t know.
Address and evaluate your assumptions and questions using the Enterprise Design Thinking toolkit.
With a prioritized list in hand, validate your questions by reframing them as testable statements. For example, a broad research objective might be, “How long do users spend looking for content?” Try turning that into a provable statement such as: “Users need a more contextual way to find answers to complete everyday tasks.” Then, you can collect evidence that proves or disproves that statement.
If it is not testable, rewrite the statement until you can make a feasible research protocol. Assumptions need to be proven true or false with evidence to move your team forward with confidence.
Data collection describes the gathering, measurement, and control of information within a research activity. Raw data exists in many mediums: sticky notes, photographs, recordings, objects, survey answers, academic articles, customer complaints, error logs, analytics, and more. Choose the appropriate data collection technique and research method based on the needs of your team.
Keep this adage in mind: garbage in, garbage out. Gather data that reflects honest behaviors and reactions. Remember that the collection techniques used can influence how users respond. Asking users to answer difficult questions over the phone, face-to-face, or in a survey can elicit different answers.
Make an effort to collect replicable and consistent data that encompasses a broad range of real user experiences, but don’t ignore the outliers. While you cannot design for all extreme characteristics, your team can understand the entire spectrum of needs. With the outliers, you may find unexpected opportunities.
Clean your data and reduce human error before using it. For quantitative data, remove inaccuracies from your data sets. You can clean qualitative data by screening participants before inviting them to a study or by removing a participant’s data from the collection before it’s synthesized.
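Cleaning quantitative data can be as simple as dropping readings that fall outside a plausible range. The sketch below is a minimal, hypothetical example, assuming task-completion times measured in seconds; the function name, bounds, and sample values are illustrative, not a prescribed method.

```python
# Hypothetical example: cleaning quantitative task-time data (in seconds).
# Values outside a plausible range are treated as recording errors and dropped.

def clean_task_times(raw_times, min_s=1, max_s=600):
    """Keep only readings within the plausible range [min_s, max_s]."""
    return [t for t in raw_times if min_s <= t <= max_s]

raw = [42, 3, -1, 9999, 120, 0.5, 87]  # -1, 9999, and 0.5 look like errors
clean = clean_task_times(raw)
print(clean)  # [42, 3, 120, 87]
```

Pick the bounds deliberately and document them; what counts as “inaccurate” should come from your research protocol, not a guess made during analysis.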
Data synthesis involves the organization, categorization, and description of a body of data. Try organizing your data in a few different ways—like grouping, sorting, and tallying—to clarify information.
Sometimes, the right synthesis method can depend on the context and collection method used. For example, if you collected data through a series of phone interviews, you may choose affinity diagramming or coding to help you visually map the information in new ways. However, if you collected data through a quantitative method, a descriptive summary can be beneficial.
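The grouping, sorting, and tallying mentioned above can be done with very little tooling. Here is a small, hypothetical sketch that tallies codes already assigned to interview excerpts; the code labels are invented for illustration, and real coding schemes would come from your own data.

```python
# Hypothetical example: tallying codes assigned to interview excerpts.
# The code labels below are invented for illustration.
from collections import Counter

coded_excerpts = [
    "navigation", "search", "navigation", "terminology",
    "search", "navigation", "onboarding",
]

tally = Counter(coded_excerpts)
for code, count in tally.most_common():
    print(f"{code}: {count}")
```

A tally like this is a starting point for synthesis, not a conclusion: it tells you where the conversation concentrated, which can guide where to look next.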
As a design researcher, avoid creating artifacts just for the sake of creating them. For example, how many times have you created a user journey map that was viewed once and then stored away in an email? Instead, use artifacts to help you make sense of the data you collect. Aim to constantly inform your offering’s roadmap. Bring research into the discussion rather than making it an item to complete on the team’s to-do list.
Data analysis seeks to uncover patterns and inferences from synthesized data and should yield quality insights your team can use.
After you organize your data, interpret it in a meaningful, evidence-based way. Explain the data by looking for patterns. Make sure that you address the root cause, not the symptoms. Find the factors that contribute to the main problem. Sometimes, it can be helpful to look for contradictions in your data and explore why they might occur.
There are a variety of tools and resources to help analyze data. One popular method for analyzing qualitative user data is an experience map. For quantitative data, try exploratory data analysis or inferential statistics.
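For quantitative data, a descriptive summary is often the first step of exploratory analysis. The sketch below is a hypothetical example using Python’s standard `statistics` module; the satisfaction scores (on a 1–7 scale) are invented for illustration.

```python
# Hypothetical example: a descriptive summary of quantitative survey data.
# The satisfaction scores (1-7 scale) are invented for illustration.
import statistics

scores = [5, 6, 4, 7, 5, 3, 6, 5, 2, 6]

print("n:     ", len(scores))
print("mean:  ", statistics.mean(scores))
print("median:", statistics.median(scores))
print("stdev: ", round(statistics.stdev(scores), 2))
```

Comparing the mean and median, and looking at the spread, is a quick way to spot skew or outliers before moving on to inferential statistics.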
Get started with the Enterprise Design Thinking toolkit.
Research insights come from your analyzed data. They should be original, non-obvious, and actionable. They often explain the “how” or “why” rather than the “what” or “when”, and they might reveal people’s true motivations beyond just their actions.
You will likely craft many insights from your analyzed data. Prioritize the findings based on impact, and rank them to influence team decisions appropriately. Choose the insights that require the least expense but result in the highest impact. A team is limited in what it can deliver by bandwidth, skills, and resources, and your prioritization might shift as understanding increases.
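One simple way to rank insights by impact relative to expense is a ratio score. This is a hypothetical sketch, not a prescribed method; the insights and 1–5 scores below are invented for illustration, and your team would assign its own values.

```python
# Hypothetical example: ranking insights by impact relative to effort.
# The insights and 1-5 scores below are invented for illustration.

insights = [
    {"insight": "Users abandon search after two failed queries", "impact": 5, "effort": 2},
    {"insight": "Onboarding emails go unread", "impact": 3, "effort": 1},
    {"insight": "Admins want bulk editing", "impact": 4, "effort": 5},
]

# Higher impact and lower effort float to the top.
ranked = sorted(insights, key=lambda i: i["impact"] / i["effort"], reverse=True)
for item in ranked:
    print(f"impact {item['impact']} / effort {item['effort']}: {item['insight']}")
```

Treat the scores as conversation starters: revisit them as understanding increases, since the ranking is only as good as the estimates behind it.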
- What assumptions or questions do I have?
- Are there any opportunities to help my team synthesize?
- Do these insights explain the “how” rather than the “what”?
- Is my team making decisions based on assumptions or questions?
- Could I write any testable statements today?
- What’s the best way of synthesizing this data?
- What patterns do I see?