From Experiment to Results

Experimentation Works
4 min read · Jan 30, 2019


Lessons from a Department of Canadian Heritage Micro-Grant Experiment

This is the second in a series detailing the results of the Department of Canadian Heritage’s youth micro-grant experiment. This post describes our final results; for the first set of results, please go here.

Throughout 2017 and 2018, the Department of Canadian Heritage decided to try something different: what would happen if we adapted the Paul Yuzyk Lifetime Achievement Award for Multiculturalism as a youth micro-grant? Could we do better than the first federal youth micro-grant, which had received 7 applications in total? We had no promotional budget to work with, but with a creative use of stakeholder lists and a simple but effective outreach effort, we ended up with 70 applications! From these, we funded 33 separate youth-led projects totaling $30,000.

Experiment — Background and Context

The Paul Yuzyk Award for Multiculturalism began in 2009 as a lifetime contribution award and was re-purposed as a youth micro-grant ‘experiment’ in 2018 to support the Government’s youth engagement efforts. Young Canadians were invited to apply for grants of $250, $500 or $1,000 to support community projects that advanced diversity and inclusion.

In July and August, we gathered feedback on the Initiative’s impact and potential for scalability through an online survey of successful applicants and follow-up phone interviews to gather in-depth qualitative data. We also tried to trace the social media footprint of this year’s initiative by tracking the hashtags #YouthandDiversity and #JeunesEtDiversite, which the participants were required to use.

Lessons Learned

It is difficult to determine the impact of the initiatives we funded

We chose to focus our efforts on measuring the level of youth engagement with the Initiative. Our baseline was the traditional Paul Yuzyk Award for Multiculturalism, which provided $10,000 each to three community members, usually in mid or late career. Our vision for the experiment was that the micro-grant format would engage youth. If we built it, would they come? How would we validate such a hypothesis?

Despite the difficulties of comparing across two different versions of the initiative, the results bore out our hypothesis. Our micro-grant received 70 applications from youth in all the provinces and one territory. This was far above our initial expectations in terms of interest and geographic coverage. The diversity of projects was also beyond what we expected, as was the number of participants: 33 funded projects reached hundreds of young people across a spectrum of activities, and 16 projects received no other sources of funding (i.e., they would not have existed otherwise). The micro-grant format proved a natural fit for engaging youth in all parts of Canada.

However, we were not able to accurately assess the impact of these projects or their exact contribution to fighting racism and discrimination. Complex problems require sophisticated data to establish causal relationships: at a minimum, baseline data and longitudinal follow-up to show change over time, which was simply not feasible for a small one-time project with limited reporting details.

Talking to our recipients was well worth the investment

Our post-program surveys and follow-up interviews were essential to understanding the end-user experience. For example, 95% of respondents in our post-program survey (21 of 22) strongly agreed that they would recommend the Initiative to a friend, and 86% of respondents (20 of 22) were very satisfied with the Initiative overall. Thus, in terms of client satisfaction, the micro-grant proved a definite success. However, hearing from our recipients also highlighted areas for improvement.

Behavioural analysis: the key to good reporting requirements

As a condition of funding, we required our applicants to complete the post-program survey and post a photo to social media. We were surprised that taking a ‘mandatory’ approach did not result in 100% compliance.

  • For the survey, our return rate was 25 of 33, or 76%. This is nearly three times the normal completion rate for such surveys (which typically hover at 30% or below), but still lower than expected, given that we paid out 100% of funds.
  • For the social media posts, our return rate was roughly 45%. This points to the difficulties in verifying project completion when using social media as the sole reporting platform.

Clearly, making something mandatory is not enough. There are opportunities to study ways of increasing compliance with the post-program reporting requirements, including offering more ways to report. Both of these areas need to be addressed before “scaling up” this experiment with larger budgets.

Conclusion

Our lessons learned and areas for improvement point to several options for future experiments:

  • Of the 200+ individuals who had downloaded our application form, only 70 completed it. Could we decrease the drop-off rate with a more youth-friendly application form? Randomly assigning applicants one of two versions of the form (current and improved) and comparing their return rates would produce data showing the extent to which the application process itself is a turn-on or turn-off (a sketch of what such a comparison might look like follows this list).
  • How do we increase compliance with reporting requirements, online posting or otherwise? One approach could be to offer applicants the choice of reporting back by either emailing a photo or posting to social media, and then comparing the response rates against this year’s version of the Initiative.
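As a rough illustration of the first option above, here is a minimal sketch, in Python, of how the completion rates of two form versions could be compared once a randomized trial has run. The function name and the counts are made-up placeholders for illustration only, not program data or an actual departmental analysis.

```python
import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test for the difference between two completion rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)  # pooled rate under the null hypothesis
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Normal CDF via erf; two-sided p-value
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical numbers: 100 downloads of each form version,
# 35 completions of the current form versus 50 of the improved one.
rate_current, rate_improved, z, p = two_proportion_ztest(35, 100, 50, 100)
print(f"current form: {rate_current:.0%}, improved form: {rate_improved:.0%}, "
      f"z = {z:.2f}, p = {p:.3f}")
```

With roughly 100 downloads per version, a gap of that size would register as statistically meaningful; with much smaller groups, an exact test (for example, Fisher’s exact test) would be the safer choice.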

The youth micro-grant experiment showed us that young people are willing to engage with the Government on important and complex issues, even on small-scale projects. However, we need to address several administrative hurdles in order to make such engagements more youth-friendly, cost-effective and transparent in terms of results.

Post by the Department of Canadian Heritage’s Paul Yuzyk Youth Initiative for Multiculturalism EW Team: Chantelle Komm, Maria Belen, Sai Prithvi Mynampati, Johnny El-Alam

This article is also available in French here: https://medium.com/@exp_oeuvre
