Report on a Review of Direct Report Practitioners 2012–13

Introduction

Overview

Objective

Scope and methodology

Results

Opportunities for improvement

Conclusion

Appendix A—System of Quality Control Elements

Appendix B—System of Quality Control Elements and Process Controls Reviewed

Introduction

1. The Office of the Auditor General conducts independent audits that provide objective assurance, information, and advice to Parliament, territorial legislatures, and Canadians. The Office has several product lines, including performance audits, annual audits, and special examinations. Performance audits and special examinations are referred to as direct report engagements.

2. The Office follows the assurance standards of the Canadian Institute of Chartered Accountants (CICA) and Office policies to guide the conduct of its work. These standards and policies are specified in an audit manual and various other audit tools, which guide auditors through a set of required steps, including a system of quality control.

3. For each audit product line, there is a product leader at the assistant auditor general level, whose primary function is to provide leadership and oversight, and to contribute to the quality of individual audits in that product line.

4. The Practice Review and Internal Audit (PRIA) team conducts practice reviews of direct report practitioners to assess their compliance with professional standards and Office policies. We conduct this work in accordance with the monitoring section of the CICA Handbook—Quality Control for Firms that Perform Audits and Reviews of Financial Statements, and Other Assurance Engagements (CSQC–1). We also work in accordance with the Office’s 2012–13 Practice Review and Internal Audit Plan, which was recommended by the Office’s Audit Committee and approved by the Auditor General. The plan is based on systematic, cyclical monitoring of the work of all practitioners—that is, the principals of each audit.

5. This report summarizes the key observations emerging from the practice reviews of the direct report practitioners selected in the 2012–13 fiscal year.

Overview

Objective

6. The objective of practice reviews is to provide the Auditor General with assurance that audits are conducted in compliance with professional standards and Office policies, and that the resulting audit reports are supported and appropriate.

Scope and methodology

7. We planned to conduct practice reviews of eight direct report engagements that took place in the 2012–13 reporting period: seven performance audits and one special examination. This report summarizes the observations from seven of these engagements; one review has been deferred to the 2013–14 fiscal year, as extraordinary circumstances rendered the practitioner unable to participate.

8. Our reviews included an examination of electronic (TeamMate) and paper audit files. We examined audit files related to the planning, examination, and reporting of the audits. We interviewed audit team members, quality reviewers, and other internal specialists, as appropriate.

9. We reviewed the seven direct report engagements against the System of Quality Control (SoQC) elements (Appendix A), focusing on selected elements and process controls that we considered key or of high risk (Appendix B).

10. For each of the engagements under review, we applied one of the following ratings to each selected SoQC element and process control: compliant; compliant but needs improvement; or non-compliant.

11. After completing each practice review, we assessed whether the audit report was supported and appropriate.

Results

12. We found that all the audit reports were supported and appropriate.

13. We also found that five of the seven practitioners demonstrated overall compliance, in all material respects, with professional auditing standards and the Office’s audit policies. One of the practitioners had an overall rating of compliant but needs improvement; one practitioner had a rating of non-compliant. The rating of non-compliant resulted from a lack of separation of the roles of preparer and reviewer of many audit documents, and from a lack of documentation of formal approvals by the practitioner at key points in the audit.

Opportunities for improvement

14. When the audit files we reviewed showed areas where practices could be improved, we discussed the opportunities for improvement with the responsible practitioner and their assistant auditor general.

15. This cycle of direct report practitioner reviews yielded no common observations that represent opportunities for improvement in audit practices generally, in the Office’s methodology, or in the communication of that methodology to practitioners. Accordingly, we have no recommendations for either the product leaders or the Professional Practices Group arising from this cycle of reviews.

16. In the files we reviewed this year, we observed two practices that audit teams should take note of:

Conclusion

17. We conclude that, of the seven direct report practitioners we reviewed, five complied, in all material respects, with professional auditing standards and the Office’s audit policies; one was rated compliant but needs improvement; and one was rated non-compliant.

18. We are issuing no recommendations, as there is no indication of practice-wide or methodological deficiencies that must be addressed.

Appendix A—System of Quality Control Elements

Diagram showing objectives, levels, and elements of the System of Quality Control

The System of Quality Control elements are described in Appendix B.

Appendix B—System of Quality Control Elements and Process Controls Reviewed

Our review covers the following System of Quality Control (SoQC) elements.

Leadership. We reviewed whether individuals working on the audit received an appropriate level of leadership and direction, and whether all individuals, including specialists, were adequately supervised to ensure that the audits were carried out properly. We also considered overall audit quality.

Ethics and Independence. We reviewed whether the independence of all individuals performing audit work, including specialists, had been properly assessed and documented.

Acceptance and Continuance. We reviewed whether the adequacy of the audit team, in terms of its availability, proficiency, competence, and resources, was appropriately assessed and documented.

Human Resources. We reviewed whether and how tasks were assigned to the audit team members.

Engagement Performance. This element includes supervision and review, consultation, engagement quality control review, differences of opinion, and engagement documentation. We reviewed whether the audit was planned, executed, and reported in accordance with Canadian generally accepted auditing standards, applicable legislation, and Office policies and procedures. We also considered whether the Office met its reporting responsibilities by having in place appropriate audit methodology, recommended procedures, and practice aids that support efficient audit approaches and the production of sufficient audit evidence at the appropriate time.

We reviewed whether consultation had been sought from authoritative sources and specialists with appropriate competence, judgment, and authority to ensure that due care had been taken, particularly when dealing with complex, unusual, or unfamiliar issues. We also reviewed whether the consultations were adequately documented, and whether the audit team took appropriate and timely action in response to the advice received from the specialists and other parties consulted.

We reviewed whether the quality reviewer carried out an evaluation, in a timely and objective manner, of the significant judgments made by the audit team and the conclusions reached in forming the audit report.

We reviewed whether the work of the quality reviewer was adequately documented and whether the audit team had taken appropriate and timely action in response to the advice received from the quality reviewer.

We also reviewed the finalization of audit files. For special examinations, we determined whether audit files had been closed within 60 calendar days after the date of the assurance engagement report; for performance audits, within 60 calendar days of the date of tabling.

Other observations. When we saw opportunities to improve the design and communication of the SoQC, to disseminate good practices, or to identify opportunities for efficiency in the conduct of audits, we informed the Professional Practices Group (PPG) and the product leader.