Evaluation + Benchmarking

We helped Country Fire Authority (CFA) understand how their website was performing, and provided them with clear benchmarks and performance frameworks to confidently monitor and assess the impact of future website optimisations.
The brief
We’ve been working with Country Fire Authority on improving the accessibility, usability and readability of the content in the Plan and Prepare section of their website, through a range of content and UX/UI optimisations.
Ahead of going live with these changes, CFA wanted to capture and establish a clear understanding of the website’s current performance so that they could evaluate and assess the impact of this and other optimisation work over time.
What we did
We took a mixed-methods approach to measuring the performance of CFA’s current website experience. We gathered both qualitative and quantitative data, then provided CFA with a detailed guide to that data and to monitoring and evaluating the website into the future.
How we did it
Taking a mixed-methods approach
We adopted a mixed-methods approach to benchmarking the CFA website’s performance, using both qualitative and quantitative methods. The benefit of a mixed approach is that each methodology feeds into, and can validate, the findings of the other.
Quantitative benchmarking uses website data to assess the performance of website content and how it changes over time, or in response to website enhancement activities. It can help identify opportunities for further site improvement, which can then be validated and understood more thoroughly with qualitative data.
Qualitative benchmarking involves gathering in-depth insights through user research, such as interviews and usability tests. While it yields some additional quantitative benchmark scores, its primary purpose is to understand user behaviours, needs, and pain points, and to identify specific areas for improvement.
The findings of qualitative research complement and illuminate quantitative data, providing a richer, more human-centred view of website performance and the overall user experience.
Google Analytics review and setup
Before kicking off quantitative benchmarking, we reviewed CFA’s current Google Analytics setup to confirm it was accurately tracking key performance metrics. This meant checking that Google Analytics was configured to capture critical data such as user behaviour, traffic sources, and key events (like conversions). This pre-work ensured the analytics framework was robust and capable of delivering actionable insights for future evaluation and website improvement efforts.
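Part of a review like this can be scripted. As a minimal, hypothetical sketch (assuming a GA4 property and the google-analytics-admin Python client; the property ID is a placeholder), the following lists the data streams feeding the property and the events registered as conversions:

```python
# Hypothetical GA4 setup audit; the property ID is a placeholder.
from google.analytics.admin_v1beta import AnalyticsAdminServiceClient

PROPERTY = "properties/123456789"  # placeholder GA4 property ID

client = AnalyticsAdminServiceClient()

# Confirm which data streams are feeding the property.
for stream in client.list_data_streams(parent=PROPERTY):
    print("Data stream:", stream.display_name)

# Confirm which events are registered as conversions (key events).
for event in client.list_conversion_events(parent=PROPERTY):
    print("Conversion event:", event.event_name)
```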
Quantitative benchmarking
After reviewing CFA’s Google Analytics setup, we conducted quantitative benchmarking and established a set of key performance metrics for CFA to monitor over time.
Visitor engagement with CFA’s website is heavily affected by seasonality, so we measured each key performance metric across two periods (summer and winter) to establish a relevant benchmark.
The key performance metrics (KPMs) we documented were:
- Engagement rate
- Session engagement
- Page engagement
- Views per session
- Sessions per user
- Weekly average users / monthly average users
(An increase in these figures represents an increase in content performance.)
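As an illustration of how these KPMs can be monitored programmatically, here is a minimal sketch using the GA4 Data API via the google-analytics-data Python client. The property ID and seasonal date windows are placeholders rather than CFA’s actual configuration, and GA4’s built-in metrics (engagementRate, screenPageViewsPerSession, sessionsPerUser, activeUsers) stand in for the measures above:

```python
# Minimal sketch: pull core KPMs for two seasonal benchmark windows.
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import DateRange, Metric, RunReportRequest

client = BetaAnalyticsDataClient()
response = client.run_report(RunReportRequest(
    property="properties/123456789",  # placeholder GA4 property ID
    metrics=[
        Metric(name="engagementRate"),
        Metric(name="screenPageViewsPerSession"),  # views per session
        Metric(name="sessionsPerUser"),
        Metric(name="activeUsers"),
    ],
    # Two southern-hemisphere seasonal windows, reported side by side.
    date_ranges=[
        DateRange(start_date="2023-12-01", end_date="2024-02-29", name="summer"),
        DateRange(start_date="2024-06-01", end_date="2024-08-31", name="winter"),
    ],
))
# With multiple date ranges, GA4 adds a dateRange dimension automatically.
for row in response.rows:
    season = row.dimension_values[0].value
    print(season, [v.value for v in row.metric_values])
```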
We also put together a set of secondary benchmarks and observations to further understand site utilisation:
- Total users
- Entrance rate
- Downloads per session
Regular quantitative assessment will allow CFA to identify further opportunities to improve their website.

Capturing performance under the new site structure
As part of the broader content redevelopment work we’ve done for CFA, we developed a new, more intuitive information architecture (site structure) and tested it with real users. So it was important that any quantitative benchmarking could be aligned to the new IA.
To make life easier for CFA, we provided two dynamic reports pulling data from their Google Analytics account: one for the current site structure and another for the new IA. This means they can easily compare and contrast data from the original setup and the enhanced one once it goes live.
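To sketch the idea behind the second report (the path-to-section mapping and property ID below are hypothetical, and the real reports were delivered as dynamic dashboards rather than scripts), page-level figures can be rolled up into the sections of the new IA like this:

```python
# Minimal sketch: regroup page-level GA4 figures under a new IA.
from collections import defaultdict

from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

NEW_IA = {  # hypothetical mapping from current page paths to new IA sections
    "/plan-prepare/bushfire": "Plan & Prepare / Before a fire",
    "/plan-prepare/grassfire": "Plan & Prepare / Before a fire",
    "/warnings": "Warnings & Incidents",
}

client = BetaAnalyticsDataClient()
response = client.run_report(RunReportRequest(
    property="properties/123456789",  # placeholder GA4 property ID
    dimensions=[Dimension(name="pagePath")],
    metrics=[Metric(name="engagedSessions")],
    date_ranges=[DateRange(start_date="30daysAgo", end_date="today")],
))

# Roll page-level figures up into the sections of the new IA.
by_section = defaultdict(int)
for row in response.rows:
    section = NEW_IA.get(row.dimension_values[0].value, "Unmapped")
    by_section[section] += int(row.metric_values[0].value)

for section, engaged in sorted(by_section.items()):
    print(f"{section}: {engaged} engaged sessions")
```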
Qualitative research - usability testing
After establishing quantitative benchmarks for CFA to use in future website evaluations, we undertook usability testing with real users to validate our initial quantitative findings and to add context to the data: the ‘why’ behind the patterns and trends we observed.
To deeply understand the issues users face with the current website’s content and journeys, we ran a series of usability sessions with members of the public.
We recruited a sample audience that reflected CFA’s user base, including research participants from regional, metropolitan and suburban fringe areas of Victoria, across a range of ages, as well as at least one participant who spoke a language other than English at home. We also asked respondents to rate their fire safety knowledge so we could screen out anyone already highly knowledgeable.
In these sessions, we asked users to complete a set of tasks to observe how they used the website to reach common and/or important user goals.
The benchmarks we recorded during our qualitative usability testing sessions for the CFA website were:
- Task completion - how many participants were able to complete a given task
- Time to page - the time it took a participant to reach the success page (the page on the website where the task’s ‘answer’ lives)
- Time on task - the total time it took a participant to reach the success page and indicate they felt confident they had completed the given task
- Ease rating - a score given to a task based on how easy or difficult participants found it
- Content quality rating - a score given to any content related to a task (including written content and media)
- Pre-task confidence rating - a participant’s confidence in the task goal before attempting the task
- Post-task confidence rating - a participant’s confidence in the task goal after completing the task
- Improvement to confidence - the difference between the pre- and post-task confidence ratings
At the end of each session, we asked users to provide scores on their experience of the website, as well as to complete a System Usability Scale questionnaire.
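For reference, SUS scoring follows a standard formula: ten items rated 1 to 5, where odd-numbered items contribute (score - 1) and even-numbered items contribute (5 - score), and the sum is multiplied by 2.5 to give a 0-100 score. A minimal sketch, with illustrative responses:

```python
def sus_score(responses: list[int]) -> float:
    """Standard System Usability Scale score from ten 1-5 ratings."""
    assert len(responses) == 10
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # odd items vs even items
        for i, r in enumerate(responses)
    )
    return total * 2.5

# One participant's questionnaire (illustrative values only).
print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))  # -> 85.0
```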
This data was then synthesised to arm CFA with insights into user behaviours, needs, and pain points, and to identify specific areas for future improvement across the site.
Benchmarking guidelines
We also provided CFA with key guidance on how benchmarking activities should be planned and conducted as part of an ongoing process of assessment and optimisation. We outlined best practices for quantitative and qualitative benchmarking, including the steps to maintain an effective benchmarking strategy over time.
Having these clear guidelines will help CFA confidently assess the impact of content, design and technical optimisation efforts to the website over time.
Impact of the work
The real value of benchmarking isn’t just knowing whether performance has improved; it’s in identifying opportunities for ongoing performance enhancement.
By establishing an ongoing benchmarking process, CFA’s team will be able to track progress, identify new areas for optimisation, and measure the effectiveness of updates.
This work has provided a roadmap for keeping CFA’s ongoing benchmarking activities effective over the long term, so the CFA website can continue to evolve in response to user needs, delivering a superior experience that is both measurable and sustainable.