Health Canada’s PRODigy experiment
An experience in learning-by-doing
In our last blog post, we introduced our problem, our research question and our general approach. Since then, we have put pen to paper, conceptually designing and building our experiment.
Designing the experiment
Our incident reporting process runs predominantly through an online reporting form, so it made the most sense to test in that environment. Our goal was to zero in on a small yet feasible part of the process that we could test and from which we could gather meaningful results.
Our focus was our landing page: could simple changes here have an impact on consumer reporting rates? A landing page is often the first interaction a consumer has with our Program, and first impressions are everything. If the page doesn’t meet their needs within seconds, they won’t stick around. A common practice in the private sector to improve this “conversion rate” (that is, getting people to stick around) is to experiment with something called “A/B testing.” This type of testing pits two or more versions of a page against each other, coupled with web analytics, to understand user behaviour as visitors navigate the different versions. The idea is to identify which version has the strongest effect on your desired outcome. For the purpose of this experiment, we opted to reuse the content from the original landing page and test it against a page with changes to the language, ordering of content, and layout.
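To make the mechanics concrete, here is a minimal sketch of how visitors might be split between two versions of a page. It is an illustration only: the visitor IDs, variant names and hashing approach are assumptions for the example, not a description of the tooling actually used on Canada.ca.

```python
import hashlib

# Hypothetical variant names; the real pages and tooling are not described here.
VARIANTS = ["A_original", "B_revised"]

def assign_variant(visitor_id: str) -> str:
    """Deterministically map a visitor to one of the landing-page versions.

    Hashing the visitor ID means a returning visitor always sees the same
    version, while traffic is split roughly 50/50 across all visitors.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return VARIANTS[int(digest, 16) % len(VARIANTS)]

if __name__ == "__main__":
    for vid in ("visitor-001", "visitor-002", "visitor-003"):
        print(vid, "->", assign_variant(vid))
```

Each visitor’s assignment, together with whether they went on to start a report, is the kind of thing the web analytics would record in order to compare the two versions.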
Carrying out real-time randomized A/B testing is not a common practice in the federal government. Our Treasury Board Secretariat (TBS) team members played an instrumental part in making this idea a reality by holding feasibility discussions with their colleagues and specialists.
With the door now open to A/B testing, the next step was to outline the experimental design, which included elements such as a hypothesis, methodology, measurable outcomes, ethical and privacy considerations, and how the results would be used. It was important for us to design an experiment that could be replicated in other settings. To accomplish this, we created a project team composed of:
- Subject matter experts (SME) from our Program;
- IM/IT specialists from our Program as well as our departmental IM/IT Services Directorate;
- Advisors from Digital Communications in our departmental Communications Branch; and
- Our Experimentation Works (EW) experts.
Our EW experts provided extensive support by advising us on such points as keeping the experiment simple and small, having clear and measurable metrics, and the importance of a sufficient sample size.
As we planned out the experiment, including designing a behaviourally informed intervention, we ran into a few early challenges:
- Sample size: our Web Analytics team provided us with baseline data to use in a sample size calculation (a rough sketch of this kind of calculation follows this list). Our calculations showed that we needed to run the experiment for longer than expected. It wasn’t a game changer, just something that influenced our proposed timelines. When we approached our senior management to describe the scenario, they were supportive of the extended timelines if it meant we were going to get meaningful results.
- Two platforms: our current, live landing page and our incident report form sit on two different platforms (Adobe Experience Manager and Healthy Canadians), which means they have different functionalities and data tracking capabilities. Given this separation, we came up with a creative IM/IT solution to make sure we could run the features we needed and collect the necessary analytics.
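For readers curious about the sample size point above, here is a minimal sketch of the standard calculation for comparing two proportions (in our case, the share of visitors who go on to report). The baseline and target rates below are illustrative assumptions only, not the Program’s actual figures.

```python
from statistics import NormalDist

def sample_size_per_variant(p_baseline: float, p_expected: float,
                            alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate visitors needed per landing-page version to detect a
    change from p_baseline to p_expected with a two-proportion z-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # two-sided significance
    z_beta = NormalDist().inv_cdf(power)            # desired statistical power
    p_bar = (p_baseline + p_expected) / 2
    numerator = (z_alpha * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_beta * (p_baseline * (1 - p_baseline)
                             + p_expected * (1 - p_expected)) ** 0.5) ** 2
    return int(numerator / (p_expected - p_baseline) ** 2) + 1

# Illustrative numbers only: detecting a rise from 2% to 3% of visitors reporting.
n = sample_size_per_variant(p_baseline=0.02, p_expected=0.03)
print(f"~{n} visitors needed per version")

# Dividing the visitors needed per version by average daily traffic gives a
# rough number of days the experiment must run, which is how a
# longer-than-expected timeline can emerge.
```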
Building the experiment
The experiment project plan was completed, and the intervention was drafted based on web best practices and the Canada.ca Content Style Guide. The intervention landing page was then approved and translated, and we passed the baton to our technical colleagues to build it.
As with any IT-enabled project, there are multiple steps and interdependencies, along with some internal bureaucratic hurdles. In addition, the language needed to communicate with our IT colleagues (programmers) was new to some of us in the Program, which meant information was sometimes lost in translation between the SMEs and the programmers.
We’ve faced multiple technical challenges along the way, some expected and some not, but all requiring creative solutions.
- While we were extremely lucky to have strong, experienced employees to help us navigate this unfamiliar territory, those employees were housed in different directorates and branches, each working on their respective piece of the puzzle but occasionally without a broad overview of the entire experiment.
- Furthermore, with some digital services being centralized, it has been a challenge to position ourselves as a priority given the other work happening at the same time. It was also difficult to anticipate when each stage would be completed, since timelines would shift whenever issues were identified at any of the multiple stages of testing.
Early Lessons Learned
If any innovative project is going to succeed, it needs to survive the handoff from an innovation team/SME to an execution team, particularly in the IM/IT world. We had not anticipated the number of steps needed to put this experiment together, nor some of the issues that arose; in hindsight, we question whether any of this could have been anticipated.
All the background research we did primed us to think this could be done quickly; however, with so many variables at hand (different servers, various departments, etc.), it was difficult to predict how long it would actually take.
Suggestions and Takeaways
- In an ideal world, you would know from the outset what skillsets were key to your project team. But in reality you may not find out what and whom you need until you are deeper into the project, and that is normal. Innovation is all about learning by doing, and adapting as you go.
- Regular team meetings throughout the duration of the project can help not only to clarify roles, but also to identify challenges before they become problems. They also foster teamwork, collaboration and sustain optimism and focus!
- Enlisting the help of a user experience (UX) designer may add value by bridging the gap between subject matter experts, programmers and communications. With several departments, specialties and technical languages involved throughout the project cycle, a UX designer could be a valuable support in this type of project.
- Continue to build the culture of experimentation in government. After all, “the best data-driven companies don’t just passively store and analyze data; they actively generate actionable data by running experiments” (Jenkins, 2014). In order to get better at something, you need to practice.
- Manage expectations and prepare to be flexible. This is an experiment and not a panacea. Luckily we had significant senior management buy-in and support, which gave us the latitude to make various project management decisions such as those around timelines.
There is no success or failure in this journey that isn’t worth learning from. We hope that, at the end of this, we will know with confidence whether simple changes to our landing page can affect consumer reporting rates, and we will be able to back up our advice with data rather than assumptions. Our experiment can also inform future innovations to be tested with other audiences, and help us discover other ideas for experiments to evaluate our impact.
References:
Jenkins, W. (2014). A/B Testing and the Benefits of an Experimentation Culture. Harvard Business Review. Retrieved November 21, 2018, from https://hbr.org/2014/02/ab-testing-and-the-benefits-of-an-experimentation-culture
Post by Health Canada’s EW Team
This article is also available in French here: https://medium.com/@exp_oeuvre