Aboriginal Affairs Portfolio Evaluation

3. Methodology

The AAP evaluation drew on multiple lines of evidence: a literature review, a document review, a review of iCase data, a survey of legal counsel, key informant interviews, a file review, and case studies.

The evaluation matrix (which links the evaluation questions, indicators, and lines of evidence) and the data collection instruments were developed with the input of the Evaluation Working Group. The evaluation matrix is included in Appendix A and the data collection instruments in Appendix B.

Each of the evaluation methods is described more fully below. This section also includes a brief discussion of methodological challenges.

3.1. Literature Review

The literature review explored major developments (e.g., trends in the scope, nature and complexity of issues) that have occurred since the 1970s in the areas of Aboriginal law, Aboriginal legal policy and northern development legal issues in Canada. It informed the evaluation by providing critical historical context and contributed to the development of case study options for the evaluation.

3.2. AAP Document Review

The purpose of the document review was both to inform the development of data collection instruments and to address the majority of the evaluation questions. The document review contributed to a better understanding of how the AAP was administered, managed and monitored over the course of the evaluation period. Key AAP files, reports and documents were reviewed to explore the AAP’s contextual, managerial and operational frameworks and to gain insight into the AAP’s mandate and business processes.

Documents reviewed included administrative and publicly available information, such as financial data, business plans, Reports on Plans and Priorities, Departmental Performance Reports, information on training, Justice Client Feedback Survey results, Public Service Employee Survey results, Budget Speeches and Speeches from the Throne.

3.3. Review of iCase Data

iCase is an information management tool that is used by the Department of Justice Canada for case management, timekeeping and billing, document management and reporting. The iCase data were used to examine trends over time in the demand for AAP legal services, level of legal risk and complexity, and level of counsel assigned to case files.
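To illustrate the type of trend analysis described above, the following is a minimal sketch of how an iCase extract could be summarized by fiscal year. It is a hypothetical Python example only; the column names and sample values are assumptions for illustration and do not reflect the actual iCase schema or data.

    import pandas as pd

    # Hypothetical iCase extract; columns and values are illustrative only.
    icase = pd.DataFrame({
        "fiscal_year":   ["2008-09", "2008-09", "2009-10", "2009-10", "2010-11"],
        "file_type":     ["Litigation", "Advisory", "Litigation", "Advisory", "Litigation"],
        "legal_risk":    ["High", "NE", "Medium", "NE", "High"],
        "counsel_level": ["LA-2B", "LA-1", "LA-3A", "LA-2A", "LA-2B"],
        "hours":         [120.5, 30.0, 240.0, 12.5, 310.0],
    })

    # Demand for AAP legal services over time: file counts and total hours per year.
    demand = icase.groupby("fiscal_year").agg(files=("file_type", "size"),
                                              hours=("hours", "sum"))

    # Distribution of legal risk ratings by year, including the "NE" (not estimable) category.
    risk_trend = icase.groupby(["fiscal_year", "legal_risk"]).size().unstack(fill_value=0)

    print(demand)
    print(risk_trend)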

3.4. Legal Counsel Survey

A web-based survey was used to gather information about the performance of the AAP from the perspectives of legal counsel across the Portfolio. The survey was online for nine weeks and included 38 questions.

Table 1 provides a profile of the legal counsel from across the AAP who completed the online survey. In total, 145 of the 296 legal counsel invited to participate completed the online survey, a response rate of 49%.

Table 1: Profile of AAP Legal Counsel Survey Respondents
AAP Unit  Number Distributed  Number of Completions  Response Rate
ADAG 3 2 66.7%
ALC 14 11 78.6%
Atlantic Regional Office 9 2 22.2%
British Columbia Regional Office 43 14 32.6%
Aboriginal Affairs and Northern Development Canada Legal Services Unit 88 44 50.0%
Northern Regional Office 10 7 70.0%
Ontario Regional Office 26 23 88.5%
Prairie Regional Office 81 29 35.8%
Quebec Regional Office 22 13 59.1%
Total 296 145 49.0%
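As an illustrative check rather than part of the evaluation methodology, the response rates in Table 1 can be reproduced by dividing the number of completions by the number of questionnaires distributed for each unit. A minimal Python sketch, using only the figures from Table 1, follows.

    # Illustrative only: reproducing the Table 1 response rates.
    survey_counts = {
        # unit: (questionnaires distributed, completions)
        "ADAG": (3, 2),
        "ALC": (14, 11),
        "Atlantic Regional Office": (9, 2),
        "British Columbia Regional Office": (43, 14),
        "AANDC Legal Services Unit": (88, 44),
        "Northern Regional Office": (10, 7),
        "Ontario Regional Office": (26, 23),
        "Prairie Regional Office": (81, 29),
        "Quebec Regional Office": (22, 13),
    }

    for unit, (distributed, completed) in survey_counts.items():
        print(f"{unit}: {completed / distributed:.1%}")

    total_distributed = sum(d for d, _ in survey_counts.values())
    total_completed = sum(c for _, c in survey_counts.values())
    print(f"Overall response rate: {total_completed / total_distributed:.1%}")  # 145/296 = 49.0%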

3.5. Key Informant Interviews

Key informant interviews were conducted to gather in-depth information about the performance of the AAP from the perspectives of various individuals associated with the Portfolio. Interview guides were tailored to each key informant group and were developed in consultation with the Evaluation Working Group. A total of 58 interviews were completed, representing Justice officials (n=6), clients (n=18), AAP Management/Senior Counsel (n=17), AAP Legal Counsel/other AAP professionals (n=15), and external partners/stakeholders (n=2).

3.6. File Review

Fifty-one legal files were selected for review to represent the range of services provided by the AAP. The review was conducted to allow for a more in-depth understanding of the life of a file, the types of requests made for AAP services, and the associated complexities. This method allowed the evaluation to explore the extent to which the information obtained from key informants about how the AAP conducted its work was in evidence in the files. The case studies were chosen from the 51 files reviewed. Preference was given to legal files, drawn from AANDC and the AAP’s other client departments, where the work was completed (i.e., the file was ‘closed’) during the five-year evaluation period under study; however, several files that were closed outside that window were also included.

The selection of files for review included litigation cases of various risk and complexity levels. Advisory files were selected to include a range of requests and topics, including assertion of rights and title, environmental agreements, and duty to consult. The sample of files was chosen with the input of the Evaluation Working Group and was considered to provide a good selection of the broad spectrum of the Portfolio’s work. As the files were not chosen randomly, and as the sample is not large, the file review sample cannot be construed as representative. Rather, the file review was intended to be illustrative of the AAP’s approach to its work.

In order to protect confidential information and solicitor-client privilege, the files were reviewed by Department of Justice officials. The file review data collection template was used to ensure comparable information was collected.

Table 2 presents the breakdown of the files that were reviewed by region and service type. Court fora included the Supreme Court of Canada, the Federal Court, the Federal Court of Appeal, the British Columbia Court of Appeal, the Supreme Court of British Columbia, Alberta’s Court of Queen’s Bench, provincial superior courts (trial and appeal divisions), as well as tribunals.

Table 2: AAP Legal File Review Breakdown
AAP Unit Number of Advisory Files Number of Litigation Files Number of Aboriginal Legal Policy (General) Files Total
BC Regional Office (Aboriginal Law Section; Business and Regulatory) 5 3 - 8
Northern 4 4 - 8
Prairies – Alberta 0 6 - 6
Prairies – Saskatchewan 3 6 - 9
Prairies – Manitoba 3 4 - 7
National Capital Region (including ALC, AANDC LSU, ADAG/ Consultation Secretariat) 7 1 5 13
Total Files 22 24 5 51

3.7. Case Studies

The AAP evaluation included five case studies for the purpose of providing a more nuanced analysis of the legal services being offered by the AAP, as well as their associated complexities. Due to solicitor-client privilege, the Evaluation Division was also responsible for extracting pertinent information from the selected case study files. While the case studies focused on activities taking place from 2008-09 to 2012-13, a few files were initiated prior to this time frame because of the length of time required to complete them. Each case study involved three to five interviews with participants closely associated with the files. Interviewees included Portfolio counsel, counsel from other Justice sections, and client representatives. A total of 20 case study interviews were completed. Where feasible and appropriate, small group discussions were conducted.

3.8. Limitations

A few limitations associated with the evaluation were noted. Key methodological limitations are listed below by line of evidence.

3.8.1. Literature Review

The literature review focused on landmark court cases, key Aboriginal legal policy decisions and salient northern development legal issues that significantly changed the face of Aboriginal law and legal policy in Canada. Locating detailed information on less notable court cases proved difficult, as there were few publicly available sources and the information that was available across different sources was inconsistent. Pinpointing the beginning and end dates of key cases was also challenging at times.

3.8.2. AAP Document Review

It is important to note that many changes to the AAP’s organizational structure and related business processes were implemented during the evaluation period. As a result, critical documents, reporting mechanisms and records of performance monitoring activities, for example, were not consistently available for each year of the five-year evaluation period, making it difficult to methodically track changes in certain areas or on certain issues. AAP documents were therefore used to provide snapshots or samples of the types of documents, reporting and mechanisms that were being employed between 2008-09 and 2012-13.

3.8.3. iCase

Overall, iCase was a useful source of information for the evaluation. There were, however, some limitations. Minimal legal risk information was available concerning advisory files, as the Department did not require counsel to assess legal risk on all advisory files.Footnote 7 This finding is supported by the Department of Justice's 2008 report, Legal Risk Management in the Department of Justice - Formative Evaluation: Final Report, which states that "litigation files are more likely to have risk assessments than other areas of legal practice" (p. iii). Additionally, the 2008 evaluation noted that different risk assessment tools were being used to ascertain the level of legal risk, which may help to explain the high proportion of advisory files in the "not estimable" (NE) category over time and overall, as well as for litigation files (although the number of litigation files in the NE category decreased over the five-year window). For these reasons, a thorough analysis of legal risk and complexity trends was not feasible, and any findings in the current report concerning level of legal risk and level of legal complexity are to be interpreted with caution.

3.8.4. Interviews, Case Studies and Survey of Legal Counsel

The interviews with key informants and case study participants, as well as the survey of legal counsel, are subject to the possibility of self-reported response bias and strategic response bias. Self-reported response bias occurs when individuals report on their own activities and may therefore want to portray themselves in the best possible light. Strategic response bias occurs when participants answer questions with the desire to affect outcomes.

It is also important to note that the AAP was undergoing a period of transition at the time of data collection, including a reorganization of the HQ sections and efforts to increase efficiency. Thus, respondents’ perceptions may have reflected activities taking place just beyond the evaluation period (fiscal years 2008-09 to 2012-13). For this reason, findings from the survey alone are to be interpreted with caution.

3.8.5. File Review

The file review was limited in that only a small proportion of possible files was selected for review. Obtaining a representative sample was not feasible. Instead, the evaluation relied on the Evaluation Working Group to select files that reasonably represented the Portfolio’s work.

3.9. Mitigation Strategy

The mitigation strategy for the methodological limitations was to use multiple lines of evidence. The evaluation gathered information from the Portfolio and from those using the Portfolio’s services, from management and practitioners, and from the file review, iCase data and the literature review. The mitigation strategy also included using both qualitative and quantitative data collection methods to answer the evaluation questions. By triangulating the findings from these different sources, the evaluation was able to strengthen its conclusions.
