
Media Super Content Comprehension Testing

We conducted content comprehension testing of key pages on the Media Super website to identify opportunities to improve the user experience.

The brief

Media Super was looking to improve its pension-related content as part of a broader push to educate members about the benefits of withdrawing their super as a pension income stream when they reached retirement age. To ensure their site content was resonating with the target audience, we recommended undertaking a content comprehension testing exercise to gather deeper feedback on the existing content and identify areas for improvement.

What we did

We ran a series of content testing exercises designed to assess users’ comprehension of the existing pension content. These exercises looked at the comprehensibility of both broader pension concepts and the specific language being used.

To ensure participant feedback was focused on the content alone, we presented the existing site content in a plain document format. Heading structures and hyperlinks were retained, but all extraneous design features and imagery were removed, making it easier for users to focus their assessment purely on the words in front of them.

Testing was conducted with users aged 51–56. Sessions were run remotely via video call, in line with government restrictions at the time.

How we did it

We employed three testing methods throughout each session.

The highlighter test

The first was the Highlighter Test. In this activity, users were asked to read a page of text in their own time, then re-read it and highlight anything they did not understand in red. Once this was complete, they were asked to read the text a third time and highlight any passages they particularly liked in green. As they moved through the document, we asked them questions about the passages they had selected to further interrogate their pain points.

By looking at the results of this activity and comparing each user’s selections, we were able to identify common areas of confusion and patterns in the way users were relating to and interpreting the text. We were also able to identify any key terms or phrases that were unfamiliar to users, or which may have been interpreted incorrectly. This provided us with a clear list of opportunity areas where the content could be improved to aid user understanding.
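As an illustration of how this kind of highlight data can be aggregated, the short Python sketch below tallies red and green selections across participants and surfaces the passages a majority found confusing. The participant and passage identifiers are hypothetical, not Media Super’s actual results.

```python
from collections import Counter

# Hypothetical highlight data: for each participant, the passages they
# marked in red (did not understand) and in green (particularly liked).
highlights = {
    "P1": {"red": {"para-2", "para-5"}, "green": {"para-1"}},
    "P2": {"red": {"para-2"}, "green": {"para-1", "para-4"}},
    "P3": {"red": {"para-2", "para-7"}, "green": {"para-4"}},
}

red_counts = Counter(p for h in highlights.values() for p in h["red"])
green_counts = Counter(p for h in highlights.values() for p in h["green"])

# Passages flagged as confusing by a majority of participants become
# candidate opportunity areas for rewriting.
threshold = len(highlights) / 2
opportunity_areas = [p for p, n in red_counts.most_common() if n > threshold]

print("Common areas of confusion:", opportunity_areas)       # e.g. ['para-2']
print("Best-liked passages:", green_counts.most_common(1))    # e.g. [('para-1', 2)]
```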

The recall test

The second activity asked users to read another page of content in their own time. Once they were finished, we presented them with a blank piece of paper, and asked them to answer a series of questions based on the content they had just read. The participants’ responses to these questions provided us with an indication of both how well they had understood the text, and how well they were able to remember key details.

From these results, we were able to identify where the content was performing well in its explanation of concepts and its presentation of facts, and where key pieces of information might be getting lost.

Card sorting

In the final activity, users were presented with a list of key terms they had encountered in the previous tasks, and asked to match these terms to a series of definitions.

This task assessed the participants’ comprehension of key concepts in the text, and allowed us to identify any terminology that was not well understood, or that was getting mixed up.
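A minimal sketch of how these matches can be scored is shown below. The terms, definition cards, and participant answers are hypothetical and purely illustrative; the point is that per-term accuracy quickly surfaces the terminology that is poorly understood or being confused.

```python
# Hypothetical card-sort data: the correct term-to-definition pairing,
# and each participant's chosen definition card for each key term.
correct = {
    "preservation age": "D1",
    "income stream": "D2",
    "transition to retirement": "D3",
}
responses = {
    "P1": {"preservation age": "D1", "income stream": "D3", "transition to retirement": "D2"},
    "P2": {"preservation age": "D1", "income stream": "D2", "transition to retirement": "D3"},
    "P3": {"preservation age": "D1", "income stream": "D2", "transition to retirement": "D2"},
}

# Terms matched correctly by few participants point to concepts that are
# not well understood or are being mixed up with one another.
for term, definition in correct.items():
    hits = sum(answers[term] == definition for answers in responses.values())
    print(f"{term}: {hits}/{len(responses)} matched correctly")
```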

Additional research

To gain further insight into the performance of the existing content, we ran readability tests against the Media Super website using the Flesch Reading Ease test and the Flesch-Kincaid Reading Grade scale.

The Flesch Reading Ease test analyses the number of sentences, words, and syllables in a given piece of content and gives it a score on a scale of 0–100. Higher scores indicate a higher level of readability, and generally speaking, a score of 60 or higher is considered optimal.

The Flesch-Kincaid Reading Grade scale uses a similar methodology to assign a US grade level to a given piece of content. The grade level can broadly be interpreted as the number of years of education required to understand the content, with scores of 9 or below considered optimal.

Analysing the scores of each content page within the Retirement section allowed us to gain an empirical understanding of content readability, and also gave us a sense of which pages should be prioritised for rewriting.
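For reference, both measures can be computed from simple word, sentence, and syllable counts, as in the Python sketch below. The vowel-group syllable counter is a deliberately rough approximation (published tools use dictionaries and exception rules), and the page titles and text are hypothetical examples used only to show how scores can be ranked to prioritise rewriting.

```python
import re

def count_syllables(word: str) -> int:
    """Rough syllable estimate: count groups of consecutive vowels."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> tuple[float, float]:
    """Return (Flesch Reading Ease, Flesch-Kincaid grade level) for a piece of text."""
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    words_per_sentence = len(words) / sentences
    syllables_per_word = syllables / len(words)
    ease = 206.835 - 1.015 * words_per_sentence - 84.6 * syllables_per_word
    grade = 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59
    return ease, grade

# Rank hypothetical pages from hardest to easiest to read, to help
# decide which pages to rewrite first.
pages = {
    "Account-based pensions": "An account-based pension pays you a regular income "
                              "from your super savings while the rest of your "
                              "balance stays invested.",
    "Getting started": "You can start your pension online. It takes about ten minutes.",
}
for title in sorted(pages, key=lambda t: readability(pages[t])[0]):
    ease, grade = readability(pages[title])
    print(f"{title}: reading ease {ease:.1f}, grade level {grade:.1f}")
```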

Distilling insights into actionable recommendations

Using the feedback gained from the test sessions and the outcomes of the readability audit, we created a series of actionable recommendations for Media Super, to help guide their future content creation efforts.

Some of these insights included:

The impact

The insights gained from this testing activity have broad application across the entirety of Media Super’s content, including print. Using this feedback, we have worked with Media Super to develop a prioritised plan for rewriting the pension content to improve its readability, and have also integrated these same recommendations into ongoing content creation projects for other site sections.
