Justice Partnership and Innovation Program Evaluation, Final Report
Appendix C: Sampling Strategy
Successful Applicants Sampling Strategy
The sampling frame for successful applicants was drawn from administrative data for fiscal years 2007-08 to 2010-11.
In total, 30 successful organizations (some of which were successful more than once) were sampled. The evaluators worked with the data to identify various applicant categories and selected a representative number of applicants in each to ensure a minimum number of respondents for each category. These categories included:
- Part of file review sample (approximately half the sample); and
- Category of application (e.g., Nunavut, general contribution, initiatives related to missing or murdered Aboriginal women, PLEI (core), and Named Grants).
Other characteristics which were considered but not used included:
- Region. Random sampling should have addressed regional distribution, and no population-level information was available against which to assess bias.
- Type of organization applying for the funding. There was insufficient information to classify organizations by type.
- Amount of funding. This was not factored in, since it was already indirectly taken into account through the selection of respondents by funding category, and the random selection should avoid systematic biases.
The list of survey respondents (successful applicants) was drawn from administrative databases maintained by Justice Canada. The sampling for successful applicants occurred in two steps. First, a random selection of 15 successful applicants included in the file review was conducted, ensuring they represented different types of applications. Then, 15 successful applicants who were not included in the file review were selected. Within each step, a quota for representation in each stratum was established (e.g., number of respondents by category of application). Each unique applicant in each category for the fiscal years 2007-08 to 2010-11 was then assigned a number, and a random-number-generator program was used to select which numbers (organizations) would be part of the sample. In cases where the number of successful applicants in a category was small (e.g., fewer than three), all respondents in the group were selected.
In most cases, backups for the selected categories were also chosen. The only exception was categories with no backups available (i.e., the whole population had already been selected because there were fewer than three organizations in the category). This way, if the contact information in the files was no longer valid or respondents refused to participate, randomly selected backups were already identified.
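For illustration, the stratified random selection with backups described above could be sketched as follows. This is a minimal sketch only; the category names, quotas, and data structures are assumptions for illustration and do not reflect the actual administrative data.

```python
import random

def draw_sample(applicants_by_category, quotas, n_backups=2, seed=None):
    """Randomly select a quota of applicants per category, plus pre-identified backups.

    applicants_by_category: dict mapping category of application to the list of
    unique applicants drawn from the 2007-08 to 2010-11 administrative data.
    quotas: dict mapping category to the number of respondents required.
    """
    rng = random.Random(seed)
    sample, backups = {}, {}
    for category, quota in quotas.items():
        applicants = list(applicants_by_category[category])
        if len(applicants) < 3:
            # Very small groups: the whole population is selected, so no backups remain.
            sample[category] = applicants
            backups[category] = []
            continue
        rng.shuffle(applicants)                                    # random-number-based ordering
        sample[category] = applicants[:quota]                      # primary respondents
        backups[category] = applicants[quota:quota + n_backups]    # backups for refusals/invalid contacts
    return sample, backups

# Hypothetical usage (illustrative organization names and quotas only).
applicants_by_category = {
    "Nunavut": ["Org A", "Org B"],  # fewer than three: whole population selected
    "PLEI (core)": ["Org C", "Org D", "Org E", "Org F", "Org G"],
}
quotas = {"Nunavut": 2, "PLEI (core)": 3}
sample, backups = draw_sample(applicants_by_category, quotas, seed=1)
```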
Unsuccessful Applicants Sampling Strategy
Although names of unsuccessful applicants were available going back as far as 2007-08, we sampled only from the last two fiscal years (2009-10 and 2010-11) to avoid problems with follow-up (e.g., staff turnover) and recall issues for applicants who had not had more recent application experience.
The list of survey participants was drawn from administrative databases maintained by Justice Canada. The sampling for unsuccessful applicants proceeded as follows. First, each unique applicant was assigned a number. A random-number-generator program was used to select which numbers (organizations) would be part of the sample.
Once the sampling frame was set, Justice Canada sent each organization – those in the frame and backups – an official letter explaining the purpose of the survey. The letter:
- stated that the consultant team was conducting the survey and for whom the information was being collected (i.e., Justice Canada);
- stated why the information was being collected;
- clarified that individual views or statements would not be made available to any Justice Canada personnel except in a statistical summary or as anonymous comments;
- informed the individual that participation in the evaluation was voluntary; and
- invited them to participate.
An email was then sent to organizations in the sampling frame inviting them to participate in the survey and indicating that the evaluators would be writing or calling to set up a convenient time for the interview. The first two telephone surveys were conducted by the survey line-of-evidence coordinator as pre-tests. A few changes were made to the final instrument to ensure questions were clear and probes were relevant.
To maximize the survey response rate, two reminder emails were sent and two telephone calls were made to reach respondents.
Survey Response Rates
The response rates for the survey, and how they were calculated, are presented below in Tables 1-3.
Table 1: Survey Outcomes: Successful Applicant Survey Response Rate
| Survey Outcome | Number | Percentage |
|---|---|---|
| Invalid contact | 2 | |
| Contact away during time of study | 4 | |
| Refused | ||
| Completed survey | 18 | |
| Total participants contacted | 25 | 72% |

| Response Rate Calculation | Number | Percentage |
|---|---|---|
| Total contacts (contacted - invalid contacts) | 23 | |
| Cooperative contacts (completed + away) | 22 | |
| Response rate (cooperative contacts / total contacts) | | 96% |
Table 2: Survey Outcomes: Unsuccessful Applicant Survey Response Rate
| Survey Outcome | Number | Percentage |
|---|---|---|
| Invalid contact | 8 | |
| Contact away during time of study | 1 | |
| Refused | 2 | |
| Completed survey | 8 | |
| Total participants contacted | 20 | 40% |

| Response Rate Calculation | Number | Percentage |
|---|---|---|
| Total contacts (contacted - invalid contacts) | 12 | |
| Cooperative contacts (completed + away) | 9 | |
| Response rate (cooperative contacts / total contacts) | | 75% |
Table 3: Survey Outcomes: Overall Survey Response Rate
| Survey Outcome | Number | Percentage |
|---|---|---|
| Invalid contact | 10 | |
| Contact away during time of study | 5 | |
| Refused | 2 | |
| Completed survey | 26 | |
| Total participants contacted | 45 | 58% |

| Response Rate Calculation | Number | Percentage |
|---|---|---|
| Total contacts (contacted - invalid contacts) | 35 | |
| Cooperative contacts (completed + away) | 31 | |
| Response rate (cooperative contacts / total contacts) | | 89% |
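As a check on the figures above, the calculation used in Tables 1-3 can be reproduced in a few lines (a minimal sketch; the function name is illustrative, and the numbers are taken from Table 3):

```python
def response_rate(contacted, invalid, away, completed):
    """Response rate as defined in Tables 1-3: cooperative contacts / total contacts."""
    total_contacts = contacted - invalid   # 45 - 10 = 35
    cooperative = completed + away         # 26 + 5 = 31
    return cooperative / total_contacts    # 31 / 35 ≈ 0.886

print(f"{response_rate(45, 10, 5, 26):.0%}")  # prints 89%
```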
All survey data were analyzed from a Statistical Package for the Social Sciences (SPSS) data file, which included variable names and value labels. This file also contained the verbatim responses to all open-ended questions. The data were analyzed using frequency tables and cross-tabulations based on application status (i.e., successful vs. unsuccessful), and the two groups were compared using chi-square tests.
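The analysis itself was carried out in SPSS; purely for illustration, an equivalent cross-tabulation and chi-square comparison could be produced as follows (a sketch with hypothetical variable names and made-up data, not the actual survey file):

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical survey extract: application status and one yes/no survey item.
df = pd.DataFrame({
    "status": ["successful"] * 18 + ["unsuccessful"] * 8,
    "aware_of_program": ["yes"] * 15 + ["no"] * 3 + ["yes"] * 4 + ["no"] * 4,
})

# Cross-tabulation (frequency table) by application status.
crosstab = pd.crosstab(df["status"], df["aware_of_program"])
print(crosstab)

# Chi-square test comparing successful and unsuccessful respondents.
chi2, p_value, dof, expected = chi2_contingency(crosstab)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
```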