Evaluating the experience of subscribing to a streaming service and its premium add-on

TV Streaming Service / UNDER NDA

Role: Lead UX Researcher

Reading Time: 3 mins

 

The Challenge

Identify unmet needs and pain points of users subscribing to the platform and purchasing the premium add-on

A streaming service asked us to evaluate the user experience for both new and existing subscribers, particularly those upgrading to an ad-free, high-quality viewing experience. They were introducing a new premium add-on package with benefits such as 4K streaming and wanted to uncover any unmet needs or pain points in the current upgrade process. Our research aimed to understand users’ perceived value of the premium add-on, provide insights and recommendations to improve the overall subscription experience and boost conversions, and support the integration of the new package.

The result

Facilitated analysis workshop

After stakeholders had viewed the sessions in person, I facilitated a preliminary analysis workshop with them to discuss their observations, prioritise the findings and review the challenges of the testing, before writing and presenting a report.

Presented insights and recommendations to the wider team

Remotely presented prioritised findings and design recommendations to more than 20 people from cross-functional teams across the organisation, and delivered a written report of all findings, including a barrier that prevented users from purchasing the add-on subscription after browsing the streaming service’s content.

The full story

Most of the client’s product teams are siloed: user research is often conducted in isolation, and insights aren’t shared effectively organisation-wide. This has resulted in an inconsistent experience across devices, especially when subscribing to the platform.

When deciding on the usability testing approach during alignment meetings, the key stakeholders and I weighed up the benefits of testing the live journeys versus a prototype. We chose the live journeys so that users wouldn’t be limited to exploring a prototype’s happy path, which would have restricted our findings.

 

Tailoring the discussion guide

Before crafting the testing script, and in addition to alignment meetings with key stakeholders, I ran a short stakeholder workshop to identify the client’s assumptions, questions and any known issues. Stakeholders annotated these onto a Miro board for each screen of the onboarding journey on each device. This allowed me to tailor the discussion guide to the client’s needs and explore key areas.

A shared working board was created in Miro for each project. I drafted the discussion guide in this shared board, which allowed the client to give feedback on each iteration before it was finalised for testing.

 

User testing the live journeys

I conducted in-person usability tests with 15 participants, evaluating the live journeys across four devices, including an Amazon Firestick, an Xbox and a mobile phone. Over three days, multiple stakeholders from various product teams visited our London lab to observe the sessions first-hand or watched remotely via livestream. For some, it was the first time they had worked together directly, which created buy-in to the project and confidence in its outputs.

The livestream also allowed observers, both in person and remote, to send me questions when they wished to dig a little deeper.

Full session notes were documented in a closed Miro board by a supporting consultant, while observers had access to a shared working board with an identical feedback section. This allowed them to note key observations and questions, which could be referenced during daily debriefs and future analysis.

 

Analysing the data

After the sessions were complete, I affinity-mapped the findings in Miro and drew conclusions from the study. I then facilitated a pre-planned preliminary analysis workshop with stakeholders to discuss their observations and the challenges faced during testing, aligning with them and supporting the creation of the insight report.

Unfortunately, the debrief was delayed by a week due to stakeholder commitments, which reduced the clarity and effectiveness with which team members shared their thoughts and observations.

Setting expectations and preparing stakeholders further in advance will improve engagement and the overall quality of the discussions.

DELIVERED

Presented insight report to stakeholders

I delivered a one-hour presentation of prioritised insights and design recommendations to an audience of over 20 stakeholders from various product teams across the organisation, including a Q&A section to clarify findings and gather feedback before delivering the report. I then sent the team the insight and recommendation report along with supporting material such as video highlights.

Unfortunately, the agency hasn’t yet gathered any quantifiable metrics on the impact of this study.

The following insights aimed to help the product teams improve the experience for users and increase subscription conversions, particularly for the new add-on.

Some of the findings included:

 
  • Because we tested the live journeys across different devices rather than prototypes, we identified a critical error that prevented users from purchasing a subscription on their games console without a hard reboot and a reinstall of the app.

  • Copy on the subscription offers page on mobile confused participants: they didn’t understand what content and benefits were included in each package, or why some offers they perceived to include less content were more expensive than others.

  • Most participants were surprised that features like 1080p streaming weren’t included in the base subscription. They felt the premium plan wasn’t good value for money, highlighting the need to enhance its benefits.

 

Key takeaways

Green-light login and payment methods before testing

Prior to testing, the client and I spent half a day creating dummy credentials and payment information for each scenario, allowing users to either subscribe to the platform or upgrade their subscription. Unfortunately, these were sometimes blocked by third-party services due to suspected suspicious activity. This created additional friction for some users, which affected testing and was added as a key consideration in the report.

 

Live journeys come with updates

I raised the concern of live updates during the stakeholder workshop and suggested pausing any updates to the specific journeys to avoid inconsistent findings, which the client agreed to. Unfortunately, during the three days of testing the journeys still received multiple updates. These affected questioning and users’ navigation to tasks, requiring flexibility and quick decision-making during moderation to deliver balanced and consistent findings.