Practicing What You Preach: Experimenting and Iterating in UX Research

June 17, 2019

In the field of User Experience, design thinking, failing fast, and iterating are popular concepts. When developing new products and features, we need to learn continuously by ideating, experimenting, and refining.

Since joining Factual, a medium-sized startup, as its first UX researcher, I’ve faced a new challenge: building a brand-new research practice. I’ve relied on my core UX principles to help me succeed: constantly trying things out, reflecting on how they’re working, then revising my approach as necessary. During my first ten months at Factual, I’ve experimented with new ideas and approaches, learned from my mistakes, and adjusted my processes along the way. I’ve pivoted everything from my research projects to my recruiting efforts to my reporting techniques, depending on what is or isn’t working.


Research Focus and Projects

At Factual, my goal is to establish its UX-research practice from the ground up. So I needed to teach everyone at the company what UX research is, why it’s valuable, and how they can partner with me. When I joined, I needed to learn about an entirely new organization, industry, and product line, while simultaneously conducting research and demonstrating its value. Needless to say, I had my hands full!

Given Factual’s specialized user base and complex, rapidly expanding product offerings, I thought it would be best to begin with foundational research. I figured that personas and user journeys would help get everyone on the same page and establish a shared internal language. The Product Marketing team had put together some initial personas, but had based them on their industry experience rather than on user interviews. I wanted to do user research to either validate those personas or demonstrate the need for a research-based approach.

However, as I settled in, I realized that, while my colleagues were excited about UX research, they didn’t really know what it was. I wondered whether I should shift my focus from general foundational research for a wide range of stakeholders to concept testing for a few highly engaged stakeholders. This would let me generate quick, concrete research wins that I could use to demonstrate the value of research.

So I decided to try out this approach and conducted a round of concept testing on a somewhat mature product. First, I held a project kick-off meeting with the team to explain the process and timeline, then we worked together to determine the research goals and scope. The team was very engaged and attended my research sessions to experience user research firsthand. Following the sessions, I put together a summary of the findings, and the team implemented the recommended changes. I could point to specific design changes that directly addressed user feedback. I was pleased with my concept-testing sessions and energized by teaching people about UX research by showing rather than telling.

Shortly afterward, I learned that we were about to develop a brand-new product. So I decided to build on my initial success and introduce my colleagues to exploratory research. I conducted some early exploratory research with a wide range of users to determine key elements to include in the minimum-viable product (MVP). This approach delivered a win-win outcome: I received valuable feedback from cross-functional perspectives, while introducing my stakeholders to the research process.

Based on these exploratory interviews and a competitive-analysis exercise, the Design team and I determined the must-have features for our MVP. The Design team put together some initial mockups, then we prepared for a round of concept testing to see whether we were on the right track. After the research, I compiled the key findings and recommendations and worked with the team to determine what changes we should make.

We seemed to be making real progress, so I wanted to share our process and success more widely. I put together a slide deck chronicling our research and design process. It showed the designs before and after research, highlighting the key changes we had made based on user feedback. I presented this case study at a company-wide Friday-lunch talk, which was a great moment. I finally felt that people had really begun to understand research and its value.

I definitely felt that pivoting from general, foundational research to targeted concept testing was a better approach for demonstrating quick, concrete research wins. People needed to have a basic understanding of user research, which I could then slowly build on over time. I needed to dispel some of their initial skepticism and capitalize on their early enthusiasm.

Recruiting

In addition to strategically shifting my research focus and projects, I revised my recruiting efforts based on how they were received. Initially, I had asked Sales, Support, and Product to recommend research participants, but my requests often went unanswered. Since this method wasn’t working, I decided to try a scrappier approach. I had learned that we logged internal call reports, in which associates detailed their calls with clients, so I began scouring them. I created a spreadsheet and recorded promising contacts’ names, as well as the products they used. Then, whenever I needed to do a study, I could search for relevant prospects and ask internal stakeholders whether certain clients would be a good fit.

Additionally, I started running internal pilots for my research studies. By doing this, I can collect valuable feedback, tweak my script and questions, build relationships across teams, and teach people the value of UX research through first-hand experience. Running pilot sessions internally helps me to provide full transparency into the research process, gain the trust of my colleagues, and increase internal awareness of UX research. After participating in pilot studies, internal associates are much more willing to recommend clients to contact about participating in our research.

I also started creating email templates for each study, so internal associates can easily reach out to clients. These templates include the product we’ll focus on, details about what to expect during the research session, and the benefits of participating in a research session. This approach has proven much more successful because internal associates are happy to help once I’ve done most of the legwork, laid out the details for them, and reduced their barriers to action. Over time, with thoughtful reflection and tweaks to my process, I’ve gained the trust of my teammates and greatly improved my recruiting success.

Reporting

Along with iterating on my research topics and recruiting strategies, I soon realized I needed to adapt how I presented my findings. In my previous role, we had compiled a research report and held formal readout sessions to discuss the findings and next steps. But that approach proved too formal and slow for the rapid pace of my new company.

Instead, I decided to try compiling a summary document after each research study, including the key findings and recommendations. This worked much better than a formal report, but I wondered how I could spur more discussion around the findings. I was lucky to have engaged stakeholders, so I decided to hold a quick debrief with the team after each research session. These debriefs proved effective and let us discuss what we had heard from participants while it was still fresh in our minds. Then, after we had completed all of the sessions, I synthesized our debrief discussions and users’ overall feedback, compiling the key issues and action items we needed to address into the summary document. Using Google Docs let me easily share the summary document and enabled real-time commenting and collaboration, which was necessary to sustain our fast pace.

I found that the post-interview debriefs and summary document worked well with highly engaged product teams, but I needed a different process for teams that were stretched thin and couldn’t attend all of my sessions. So I decided to try inputting comments directly into Figma, the design and prototyping tool we were using. This proved to be a great way to gather our notes, findings, and design suggestions in context. Design, Product, and Engineering all had full visibility into Figma and could read and respond to these comments. These in-context comments offered the added benefit of providing reminders, so my findings wouldn’t get lost or be ignored.

By moving from formal reports to a key-findings summary page to gathering in-context insights and recommendations, I experimented with different ideas, gathered feedback on what worked, and revised my approach accordingly.

Conclusion

In UX research, we collaborate with product teams to iterate on product designs and implementations based on user feedback. So I decided to apply the investigative, iterative, and adaptive skills I have learned as a UX researcher to develop a brand-new UX-research practice. To do this, I needed to understand this particular company’s core needs and culture and develop responsive, effective processes and practices. Experimentation, observation, and openness to change have been crucial to my success. But I’m only just beginning. Learning what works best is a constant process, and I’m continually iterating on and improving my user-research practice.

Meghan Wenzel

UX Researcher at Factual

New York, New York, USA

Meghan is starting the UX Research team at Factual, a startup focusing on location data. She’s establishing research standards, processes, and metrics; building partnerships across teams; and leading research efforts across all products. Previously, she was a UX Researcher at ADP, where she conducted a wide range of exploratory, concept-testing, and usability research across products and platforms. She was also involved in ADP’s Come See for Yourself contextual-inquiry program, whose goal was to educate colleagues on the value of UX research and get them out into the field to talk to real users.
