Case Study
Fixing Diagnostics for Complete Maths TUTOR
Data Analysis • User Research • Iterative Design • Cross-team Collaboration
Using data analysis to identify and fix usability issues in a recent release to our maths education platform, enhancing the functionality's impact with pupils.
Context
At Complete Mathematics we released a landmark new ‘Pupil Diagnostics’ feature on our mathematics learning platform, TUTOR. This asked pupils a series of maths questions — dynamically generated and bespoke to each pupil — which, once complete, delivered a personalised learning pathway building upon the mathematical knowledge they already had secure.
Data Analysis
With the release live, we immediately began tracking and investigating the incoming usage data. I was looking for emerging patterns and trends that could either validate our expectations for the functionality, or provide evidence of usability issues or unexpected behaviour.
Some important observations started to leap out from this analysis, in particular from the visualisation of the distribution of Questions Taken by pupils, split by ‘complete’ and ‘incomplete’ diagnostics.
Analysis Insights
1: Firstly, we find spikes in the ‘incomplete’ diagnostics count at regular questions-taken intervals — highest at 0, and reducing each time.
What does this mean? These spikes indicate large user drop-off (churn) at the very start of the diagnostic and at predictable intervals throughout it, which lowers the overall completion rate and worsens user outcomes.
2: Secondly, the distribution of ‘completed’ diagnostics remains high much further along the questions-taken axis than expected.
What does this mean? This spread shows it is taking pupils many more questions than expected to complete their diagnostics, and so to receive their recommended learning pathway — leading to higher churn and limiting the impact of the functionality overall.
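The spike pattern in (1) can be surfaced programmatically from the raw usage data. A minimal sketch of the approach, using made-up numbers (the `interval=5` checkpoint frequency and all the counts below are hypothetical, not our real figures):

```python
from collections import Counter

def drop_off_spikes(questions_taken, interval):
    """Count incomplete diagnostics by questions taken, then pick out the
    counts at checkpoint-aligned positions (0, interval, 2*interval, ...)."""
    counts = Counter(questions_taken)
    max_q = max(counts)
    return {q: counts.get(q, 0) for q in range(0, max_q + 1, interval)}

# Hypothetical data: questions answered before abandoning, one entry per
# incomplete diagnostic session.
incomplete = ([0] * 40 + [1, 2, 3] * 5 + [5] * 25 + [6, 7] * 4
              + [10] * 15 + [12, 14] * 3 + [15] * 8)

spikes = drop_off_spikes(incomplete, interval=5)
```

With this shape of data, the checkpoint-aligned counts stand well above their neighbours and shrink with each interval, which is the pattern we saw in the real distribution.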
Insight Investigation:
Drop Off Spikes
We immediately noted that the interval of the drop-off spikes matched the built-in frequency of checkpoint screens between question sets in the diagnostic flow.
Investigating further by observing session recordings in Microsoft Clarity, it became clear that user drop-off was not merely facilitated but actively promoted by the prominence of an ‘I’ll do this later’ button throughout the flow, in particular on the initial and checkpoint screens, where the button was active whilst other page content loaded in as part of an animation.
This animation was slow, so for pupils impatient to progress, the only available action was to click the active button, exiting the diagnostic and leaving it incomplete.
Insight Investigation:
Completion Spread
We needed to validate whether the internal mechanics and logic of Diagnostics were working as intended, so we mapped out the data from real users' journeys and tested against it.
Through this we identified some aspects of the algorithm in the live build that needed modifying, and found some additional opportunities to improve the logic for intelligent question selection and learning path recommendation, shortening the process overall.
Design Iteration
We moved quickly to design and test solutions to these issues, and once validated pushed the fixes to the live app.
Button placement & behaviour: We restructured the visual hierarchy of the diagnostic screens and added an intermediate state for the ‘I’ll do this later’ button. This lowered the priority of that action, particularly whilst other elements were still loading. By better identifying the key content and actions for users, we simplified the main flow for pupils to reduce churn, with the biggest impact on the initial and checkpoint pages (where the drop-off spikes occurred).
Loading behaviour: We fixed the loading order and animation speed for page elements to reduce the overall time it takes for a pupil to complete the diagnostic flow, to further reduce churn.
Question selection algorithm: By updating the diagnostics logic and question selection algorithm we reduced the average questions-per-completion, improving the user experience and overall completion rate. In doing so, we gave more pupils access to their bespoke learning pathways, improving user outcomes.
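We can't reproduce the production algorithm here, but the intuition behind the shortened flow (a correct answer on a topic lets us infer that its prerequisite topics are also secure, so those questions need not be asked) can be sketched as follows. The topic names and prerequisite graph are invented purely for illustration:

```python
def run_diagnostic(topics, prereqs, answers):
    """Walk topics from hardest to easiest. A correct answer marks the topic
    and all of its transitive prerequisites as secure, so their questions are
    skipped. Returns the secure set and the number of questions asked."""
    secure = set()
    asked = 0
    for topic in topics:  # assumed ordered hardest -> easiest
        if topic in secure:
            continue  # already inferred secure: no question needed
        asked += 1
        if answers[topic]:  # pupil answered this question correctly
            stack = [topic]
            while stack:
                t = stack.pop()
                if t not in secure:
                    secure.add(t)
                    stack.extend(prereqs.get(t, []))
    return secure, asked

# Hypothetical prerequisite chain: fractions -> decimals -> percentages.
topics = ["percentages", "decimals", "fractions"]
prereqs = {"percentages": ["decimals"], "decimals": ["fractions"]}
answers = {"percentages": False, "decimals": True, "fractions": True}

secure, asked = run_diagnostic(topics, prereqs, answers)
# A correct 'decimals' answer makes asking about 'fractions' unnecessary.
```

Inference of this kind is what drives questions-per-completion down: each correct answer can settle several topics at once rather than one.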
Next Steps, and Takeaways
There is still more to learn about how this new functionality performs in the live school setting. We need to continue tracking how diagnostics is performing and be ready to allocate resources to react to any issues or opportunities.
These changes were an improvement, but a more substantial update should be made to the content loading & animations in diagnostics to better match established web standards for chat functionality. We could not do this here due to time constraints, but it is a worthwhile investment for long-term user experience improvements.
This process reinforced how important it is to plan ahead for data collection and analysis so that you can begin tracking as soon as functionality is released. We were alerted to these specific usability issues, and were able to respond and iterate rapidly to them, because of the work done early during the design and development phase before release.
As a small team, we do not have a dedicated user researcher or data analyst, which limits how much of this work we can do. Having that additional data analysis capability in the team, working consistently and across the whole system, would be a huge benefit to the company, in particular in the development and prioritisation of the product roadmap.
Summary
We released new functionality to provide pupils with a personalised learning pathway based on what mathematics they did and didn't already know. However, early usage data suggested that many pupils weren't benefitting, with high churn and a lower-than-expected completion rate.
By analysing the usage data, we were able to identify and fix a number of key usability issues to increase the completion rate of the diagnostic flow, leading to more pupils receiving their learning recommendations, and so getting greater educational value from their use of the Complete Maths TUTOR platform.