Evaluation of the Youth Justice Initiative Funding Components

3. EVALUATION METHODOLOGY

The evaluation employed the methodologies outlined below.

3.1. Document Review

The document review was conducted to inform the development of data collection instruments, as well as to address particular evaluation questions. Documents reviewed included: background information on the funding components; agreements and reporting templates; financial information on planned and actual funding by fiscal year; YJF terms and conditions; and eligibility criteria. Government and departmental documentation (e.g., Speech from the Throne, Departmental Performance Reports [DPR]) was reviewed to assess the components’ relationship to the Department’s strategic objectives and federal priorities.

3.2. Canadian Centre for Justice Statistics Data and Review of Literature

Academic journal articles and youth criminal justice statistics from the Canadian Centre for Justice Statistics (CCJS) were used to compare crime, court services, and custody use over time in order to provide context for the report and respond to certain evaluation questions.

3.3. Key Informant Interviews

Key informants were identified by the EAC and provinces/territories, and interview guides were developed in consultation with the EAC.[9] Prior to key informant interviews with provincial/territorial partners, a consultation was held with volunteer provincial/territorial representatives from four jurisdictions to validate the provincial/territorial interview guide; specifically, to ensure that questions were clear and relevant and that the guide covered all necessary issues. The consultation was conducted as a small group interview and took place by teleconference on August 28, 2009. Following the session, the interview guide was revised based on the input received.

The Department notified key informants of the evaluation and requested their participation. From November 2009 to February 2010, 54 interviews were conducted with 72 interviewees. Several interviews involved more than one key informant. Key informants each addressed one or more of the funding components.

Telephone interviews of approximately two hours were conducted with nine Department personnel and 33 provincial/territorial partners, and interviews of approximately one hour were conducted with eight other federal representatives and 22 YJF recipients. The interview guides are located in Appendix F.

3.4. Case Studies of Youth Justice Fund Projects

Case studies were conducted on 12 projects funded through the YJF during the fiscal years covered by the evaluation. The sampling strategy for YJF case studies was determined by the EAC and designed to focus on best practices and the impact of funding. The breakdown of case study projects by funding stream is Core (8), YJADS (3), and GGD (1).

A review of project files and one to two interviews were conducted for each case study. The interview guide was developed in consultation with the EAC. A total of 13 interviews were conducted with 16 interviewees. The interview guide is located in Appendix F.

3.5. Survey of Youth Justice Fund Recipients

A mail survey was conducted with YJF recipients from November 19, 2009 to January 5, 2010. A total of 54 YJF recipients completed the survey out of a sample of 115, for a response rate of approximately 47%. Most surveys (93%) were completed in English; the balance in French. Almost all respondents (98%) completed the survey by mail, with the remainder (2%) completing the survey by phone during follow-up calling. The survey questionnaire, developed in consultation with the EAC, is located in Appendix G, and the sampling approach is shown in Figure 1, Appendix B. 
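The reported response rate follows directly from the completion and sample counts above; as a quick arithmetic check (the counts are from the report, the script itself is only illustrative):

```python
# Reproducing the survey response-rate arithmetic reported in Section 3.5.
completed = 54   # YJF recipients who completed the survey
sampled = 115    # total recipients in the survey sample

response_rate = completed / sampled * 100
print(f"Response rate: {response_rate:.0f}%")  # prints "Response rate: 47%"
```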

Survey data were analyzed using the Statistical Package for the Social Sciences (SPSS). A comparison of Grants and Contributions Information Management System (GCIMS) and survey data found that survey respondents were generally representative of projects funded under the YJF for the fiscal years covered by the evaluation in terms of funding stream, component, project type, jurisdiction, and amount approved.[10] Where not otherwise specified throughout the remainder of the report, percentages from the survey are calculated out of the full base (n=54).

3.6. File Review

YJSFP and IRCS files for all jurisdictions were reviewed using file review templates developed in consultation with the EAC. The EAC selected a total of 19 YJF files for review. Documents reviewed included annual plans, reports, and claims, as well as Face Sheets submitted for each IRCS sentence.[11] Data are presented in counts instead of percentages because of the small number of files reviewed. The file review templates are located in Appendix H.

3.7. Analysis of Youth Justice Fund Grants and Contributions Information Management System Data

Data from the GCIMS database were analyzed to enable an assessment of some of the basic characteristics of all projects funded over the fiscal years of the evaluation. The total number of applications as shown in GCIMS includes only those projects that submitted complete proposals. Projects that were eliminated at earlier stages in the vetting process (e.g., upon initial call to the YJF or submission of a letter of intent) are not represented.

3.8. Methodological Challenges

The focus of the present study is on the effectiveness of the funding components in achieving their stated outcomes. However, limited outcome information was available to the evaluation. This was an issue particularly for the YJSFP and IRCS Program, as provincial/territorial reporting on programs and services did not capture information that would facilitate the assessment of outcomes. The amount and types of information submitted varied substantially by jurisdiction, making it difficult to extract comparable data. For the YJF, limitations stemmed from GCIMS not capturing project outcome information, as well as from variation in reporting across projects prior to the implementation of standard forms. Consequently, outcome information is based heavily, though not exclusively, on qualitative data from key informant interviews, particularly for certain evaluation issues.

A further challenge for the evaluation was separating the impacts of the funding components from the effects of the YCJA. Because the YCJA shares expected outcomes with the funding components, it was not always possible to differentiate the impacts of the funding components from the legislation they support, particularly in the case of the YJSFP.