Helping choose a course

OVERVIEW

Background:

  • Open Universities Australia is a not-for-profit online marketplace for higher education.

  • The website features an online quiz (named “Help Me Choose”) that aims to help users find a suitable course based on their preferences.

  • The feature is the highest lead generator on the website.

  • Opportunities were identified to improve the experience.

Opportunity:

  • For users: an improved experience means more accurate course matches and a more positive impression of the Open Universities brand.

  • For the business: improvements to the experience would increase quiz completion, lead submissions, and conversion of new known prospects.

My role:

  • Collaborating within a product team and leading the end-to-end UX process.

  • UX activities: discovery research, user interviews, heuristic review, moderated and unmoderated usability testing, UI design, experiment analysis.


🔎 DISCOVERY

What’s the current situation?

Previous design

Initial review

To understand the current flow, I mapped out the existing screens and noted the logic and impact of each question, alongside comments and suggestions from an initial heuristic evaluation and past research.

Previous screen flow

User interviews

I ran a set of user interviews with prospects and newly enrolled students to understand their experience and perception of the quiz.

💡 Key learnings

  • The tool is most relevant for users who only have a vague idea of what they want to study.

  • The experience needed to be helpful in guiding new prospects, while keeping completion quick and easy.

Highlighted user insights


 

Competitor analysis

I reviewed online quizzes and application tools from various industries.

This helped me identify design best practices, review common patterns, and gather design inspiration.

Snippet of competitor review


✏️ DESIGN & TEST

How might we provide users with helpful guidance in their exploration journey?

Content testing

Working with the Product Manager and Senior Content Designer, we drafted a new set of questions that mapped to more search filters and would therefore produce more targeted results. This came to more than twice as many questions (15) as the current experience (7). I set up unmoderated and moderated testing on the flow, length, and comprehension of the content.

Snippet of tested content

💡 Key learnings:

  • Some terminology was not well understood and needed more explanation.

  • There were concerns that some answers might eliminate suitable courses.

  • Overall, the quiz was seen as helpful. Users were happy to answer more questions to get more accurate results.

I wonder if I select one thing or another thing, does it filter out certain courses that might be suitable?
— Concerns on the filtering impact
I think in this space you need to ask those questions. I don’t think there was any question that shouldn’t have been there and I don’t think anything was too personal
— Positive reaction to the new questions
 

Design stage: 30%

We followed a 30/60/90 design feedback structure.

I shared my Figma designs with the team at multiple checkpoints to gain feedback and build a shared understanding of the project’s progression.

30% design - concepts

At the 30% stage, I proposed the general hierarchy and shared a number of lo-fi concepts for team discussion. Solutions included:

  • Include “back” and “skip” options to give users the flexibility to navigate as they choose.

  • Ensure that contact options to student advisors are easily available.

  • Clearly differentiate between single-selection and multiple-selection answers by using the common UI patterns of radio buttons and checkboxes.

  • Add helpful copy to explain “why are we asking?” for transparency.

 

Design stage: 60%

Taking the team’s feedback on board and using the brand style guide, I iterated on a more refined visual design. I explored concepts of the quiz as either a standalone page or a pop-up modal, which were then taken into the next round of usability testing.

Usability testing

The next round of usability testing was conducted to:

  1. Validate updated content

  2. Validate the visual design direction

  3. Compare the full-page and pop-up modal experiences

💡 Key learnings:

  • Interestingly, more users felt the quiz took longer on desktop than on mobile.

  • Positive responses to the look and feel of the design.

  • No difference was noticed between the full-page and pop-up versions.

This feels quite long, before I see what courses there are. It’s a lot of questions.
— Reaction to the quiz length
I don’t need to go through like a hundred or a thousand degrees to see the 22 that is suited to me. You guys have done that for me already.
— Expectation at the end results

Design stage: 90%

Based on user feedback, we needed to strike a balance: enough questions to be helpful, but not so many that the quiz felt too long. We reduced the number of questions from 15 to 12. We also decided to keep the quiz on a full page, as this was similar to the current experience and there was no clear user benefit to displaying it in a modal.


🚀 EXPERIMENT & LAUNCH

We launched the new design as an A/B test with a 50/50 traffic split.

  • Primary metric = lead submissions at the end of the quiz.

  • Secondary metrics = quiz completion, click-through to product pages, bounce rate, and exit rate.

To our surprise, it did NOT go well straight away: leads and completion rates were down. But we were able to learn and adapt quickly. Thanks to the high traffic, we could make frequent changes and let the data drive our decisions.
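For illustration, here is a minimal sketch (in Python, not our actual analysis pipeline) of how the primary metric could be compared between the two variants of a 50/50 test using a two-proportion z-test. Every figure in it is hypothetical.

```python
from math import sqrt
from statistics import NormalDist

# Hypothetical figures for illustration only (not real experiment data):
# visitors and lead submissions for the control (old quiz) and variant (new quiz).
control_visitors, control_leads = 10_000, 820
variant_visitors, variant_leads = 10_000, 905

p_control = control_leads / control_visitors
p_variant = variant_leads / variant_visitors

# Pooled two-proportion z-test on lead submission rate (the primary metric).
p_pool = (control_leads + variant_leads) / (control_visitors + variant_visitors)
se = sqrt(p_pool * (1 - p_pool) * (1 / control_visitors + 1 / variant_visitors))
z = (p_variant - p_control) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided

print(f"Control rate:    {p_control:.2%}")
print(f"Variant rate:    {p_variant:.2%}")
print(f"Relative uplift: {(p_variant - p_control) / p_control:+.1%}")
print(f"z = {z:.2f}, p = {p_value:.3f}")
```

A pooled z-test like this is a standard way of checking whether a difference in conversion rates between two equally sized groups is likely to be real rather than noise; we also watched the secondary metrics before drawing any conclusions.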

In one month, we launched (and re-launched) the quiz seven times, each time as an A/B test that we closely analysed.

We had daily check-ins as a product team to analyse the results and make changes to turn things around.

💡 Key learnings:

  • Identifying which pages had the highest drop-offs gave us a clear focus on where the key issues were (a minimal sketch of this kind of drop-off calculation follows this list).

  • Reducing the number of options reduces drop-off, as it lowers the user’s cognitive load.

  • Having too few results at the end can cause users to abandon the quiz; we assume users perceive less value when presented with only a few courses.

  • To really understand which factors impacted the results, we had to roll back some of the changes. Any new ideas for improvement were noted as assumptions to be tested in follow-up experiments.

  • It was through user interviews that we had the final a-ha moment, where we learnt that some users are very indecisive (many prospects don’t even know which study area they want to pursue!). In our last test, we added a “Not sure yet” answer option, which sharply increased the completion rate.
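As mentioned in the first learning above, a simple way to spot the pages with the highest drop-offs is to compare how many users reach each step of the quiz. Below is a rough sketch with made-up numbers; the real analysis used our analytics data, not this script.

```python
# Hypothetical counts of users reaching each step of the quiz (illustrative only).
step_counts = [
    ("Question 1", 10_000),
    ("Question 2", 8_600),
    ("Question 3", 8_100),
    ("Question 4", 6_200),
    ("Results page", 5_900),
]

# Step-to-step drop-off: the share of users who reached a step but not the next one.
for (name, reached), (_, reached_next) in zip(step_counts, step_counts[1:]):
    drop_off = 1 - reached_next / reached
    print(f"{name:<12} -> {drop_off:5.1%} of users dropped off before the next step")
```

The step with the largest percentage loss is where we focused the next round of changes.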


📈 RESULTS

Data, learnings, and next steps

A positive uplift in lead submissions and completion rate in the last A/B test gave us the confidence to switch completely to the new experience.

  • Increase in completion rate

  • 16% increase in new known prospects

  • Trending uplift in enrolments

  • Improvements in the quality of conversations between prospects and student advisors

💡 Learnings

  • A key learning from the project was the importance of having a clear launch and rollback plan to mitigate risk.

  • Qualitative insights from customer interviews (a Continuous Discovery habit) are just as important as the quantitative data.

  • Psychological safety is critical to have in a team.

    • During the experiments, we had daily team chats where every single person could express their thoughts and have productive discussions. Most importantly, we still had fun and enjoyed the process - even when numbers were down!

  • It was important to bring the team as well as external stakeholders along our journey.

    • I presented the project and user interview videos at an all-staff showcase. Literally hearing the voice of the customer created excitement and shared understanding.

👍 Feedback regarding the approach and research

"Video snippets really bring the voice of the customer to the table"

"That was really great to hear and awesomely approached! Love what you guys are doing!"

"So good seeing the customer interviews… thanks for sharing this."

"Living for this presentation format Anna - hearing it ‘straight from the horses mouth’ makes all the difference!"

Next steps

  • The project enabled us to run a follow-up experiment immediately afterwards.

  • Further optimisation to be continuously explored and prioritised.

  • The redesigned experience was featured in marketing material, driving additional traffic through social advertising.