Reckon
Shaping Strategy with Survey Design
Role
Researcher
Timeline
April - July 2024
Project overview
To make informed product decisions at Reckon, we needed more than anecdotal feedback; we needed a structured, data-led understanding of our customers. While we had ongoing customer touchpoints through support and usability testing, we were missing the big picture.
Together with our Head of Research & Design, I led the launch of Reckon’s first annual customer survey: a company-wide initiative aimed at validating assumptions, surfacing pain points, and giving every team actionable insight into what mattered most to our customers.
My contribution
I collaborated closely with the Head of Research & Design to lead and deliver the project. We shared facilitation of the discovery workshops, combining our strengths throughout. I focused heavily on the technical execution: designing the survey, building out logic paths, and ensuring a seamless experience in Typeform, while also contributing to the strategic framing of research goals and the analysis of results.
Reflections
This project was a major milestone in developing my end-to-end research capabilities. It sharpened my skills in survey design, research planning, and workshop facilitation. I gained confidence in crafting research approaches that meet business goals and in analysing and synthesising complex insights. Collaborating so closely with a senior leader was an incredible learning experience, giving me valuable exposure to strategic thinking and high-quality execution. As a bonus, our work was recognised with a Typeform Award for Research Excellence in 2024.
You can read (and watch!) Typeform’s case study on our work here
Approach
Before drafting a single question, we met with 18 leaders across all departments to understand what they hoped to learn from customers. Instead of asking “what should we ask?”, we asked:
What do you want to learn?
Who do you need to hear from?
How will you act on the insights?
This approach ensured that every part of the survey was intentional and reduced the risk of form fatigue.
The workshops resulted in 27 learning objectives, each tied to a hypothesis we intended to prove or disprove. We grouped these into a framework inspired by the customer journey: Entice, Enter, Engage, Exit, Extend. This structure helped us organise the survey logically and make the findings easier to interpret post-analysis.
Designing the survey
With our learning objectives in place, we were ready to start scripting our questions. We spent a significant amount of time ensuring each question was clear, unbiased, and directly tied to a hypothesis. Every question went through multiple rounds of review, with a focus on removing ambiguity, avoiding leading language, and keeping the tone approachable.
We ensured:
Relevance: Each customer saw only the questions that applied to them.
Purpose: Every question had to map to a learning objective. If no one could act on the answer, it didn’t make the cut.
Methods: A blend of rating scales, rankings, and open-text fields gave us both structured and qualitative insight.
Tone: We carefully crafted the survey language to keep it clear, neutral, and free from bias.
I then led the technical build in Typeform, using advanced logic paths and branching to tailor the experience to each respondent’s answers, so every customer saw only the questions that mattered to them.
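For illustration, here is a minimal sketch of the kind of branching model behind that build. It is not Typeform’s actual configuration or API, and the question refs and audience segments are hypothetical; it simply shows how an early answer can gate which later questions a respondent sees.

```python
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Question:
    ref: str                                                   # stable identifier
    text: str
    audiences: set = field(default_factory=lambda: {"all"})    # who should see it


# Hypothetical questions and segments, purely for illustration.
QUESTIONS = [
    Question("product_used", "Which Reckon product do you use most?"),
    Question("payroll_pain", "What slows you down most when running payroll?",
             audiences={"payroll"}),
    Question("invoicing_pain", "What slows you down most when invoicing?",
             audiences={"invoicing"}),
    Question("nps", "How likely are you to recommend Reckon to a colleague?"),
]


def next_question(answers: dict) -> Optional[Question]:
    """Pick the next unanswered question that applies to this respondent,
    based on the product they said they use."""
    segment = answers.get("product_used", "").lower()
    for q in QUESTIONS:
        if q.ref in answers:
            continue                             # already answered
        if "all" in q.audiences or segment in q.audiences:
            return q
    return None                                  # survey complete


# A payroll customer is routed past the invoicing question entirely.
print(next_question({"product_used": "Payroll"}).ref)   # -> payroll_pain
```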
Launching & Monitoring
The survey was launched via a 3-part email campaign targeted at existing customers. We kept the messaging simple and clear, with a soft incentive: respondents were entered into a prize draw after completing the survey.
Throughout the live period, we closely monitored performance:
Tracked drop-off points and time-on-question to identify pain points (a sketch of this kind of analysis follows this list)
Made light edits mid-survey to improve flow
Reviewed incoming feedback regularly to validate our assumptions in real time
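The drop-off and time-on-question checks lend themselves to a very simple analysis. The sketch below is hypothetical: it assumes responses exported as one row per question each respondent reached, with invented column names and figures, and shows how drop-off rates and median time per question could be summarised.

```python
import pandas as pd

# One row per question each respondent reached; names and numbers are made up.
events = pd.DataFrame({
    "respondent": ["r1", "r1", "r1", "r2", "r2", "r3"],
    "question":   ["q1", "q2", "q3", "q1", "q2", "q1"],
    "seconds":    [12,   45,   30,   10,   70,   15],
})

started = events["respondent"].nunique()         # everyone who began the survey

summary = (
    events.groupby("question")
    .agg(reached=("respondent", "nunique"),      # how many got this far
         median_seconds=("seconds", "median"))   # typical time on the question
    .assign(drop_off=lambda d: 1 - d["reached"] / started)
)

# Sharp drop-off or long median times flag questions worth rewording
# or moving earlier in the flow.
print(summary.sort_values("drop_off", ascending=False))
```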
We received 10x more responses than forecast, with high engagement across all major customer segments. The volume and quality of open-text responses were particularly impressive, proving how invested Reckon users are when asked the right questions in the right way.
Analysing & Actioning the Insights
Once the survey closed, we started our analysis. We reviewed each learning objective and assessed whether the data supported or disproved the underlying hypothesis. We affinitised the open-text responses into key themes to uncover patterns, recurring pain points, and areas for improvement. We then compiled a comprehensive report for the business:
Each question was matched with relevant quantitative insights, key themes, and direct quotes
We highlighted where assumptions were proven wrong or right
Department leads were given the results relevant to them and tasked with turning the findings into action
Insights were also shared with the executive team and presented in a company-wide session. The feedback was overwhelmingly positive.
A Lasting Impact
The survey changed the way Reckon approaches research and customer understanding. Key outcomes included:
Support improvements: Features such as live chat and improved ticketing were introduced in response to common pain points
Design efficiency: Feedback now acts as a baseline for new research projects, reducing duplication and sharpening focus
Roadmap validation: Survey results directly influenced feature prioritisation
Executive buy-in: The CEO called it “the most useful customer insight we've seen in years”
It’s now embedded as an annual initiative and continues to influence strategic planning, product development, and marketing. Teams across Reckon refer to it daily to understand customer needs and build stronger, evidence-backed assumptions.