Our internal audit reporting workflow

I have been meaning to write an extended article on how we develop our internal audit reports for a long time. Given that I’m out with quite a heavy flu and my voice is completely gone, this is as good a time as any to write this post. I will describe some of the key challenges we faced in developing our reports, the structure we created to address these challenges and the tools we currently use. In other words, I will be describing our current process flow.

Now, you need to take into account that this process flow is not entirely optimized yet. I’m still striving for the ideal situation where we can achieve the key objectives of our audit reporting with the least possible re-hashing of information. I’m hoping to incorporate this later in a Markdown-LaTeX based workflow. We’re not there yet, and for now, this is how we function.

I also need to point out that we have looked at some of the currently available audit management tools. While they may be highly relevant for traditional internal audit functions, we need to be able to trust our systems in environments where internet access is not guaranteed. In addition, I’m not too wild about the limitations each of these tools imposes, in one way or another, on the internal audit process flow.

Challenges

First, let’s be very clear. Writing an internal audit report is probably one of the most difficult challenges an internal auditor has to face. Even if your audit execution went flawlessly, and you have everything documented in your workpapers, it still takes a lot of hard work.

Showing your work

Reporting is essential to our trade because for most of the “beneficiaries” of internal audit activities, the internal audit report is the only deliverable of our work they will ever see. In the past, I’ve done quite a few peer reviews, and one thing I noted was a tendency to over-report: to provide too much information on the work performed rather than on the conclusions reached. I think this is the subconscious of the internal audit report writer at work, aiming to show the work that was done. Of course, that is not the purpose of the report.

Getting the message across

A good report should be about getting the key findings of the audit across to a rather diverse field of audiences. If you want to get the message across, you need to be able to address multiple audiences with the same report. That is a tall order.
Let’s examine some of the audiences we need to report to:

  • The first and foremost audience of the internal auditor is the audit committee. In many discussions on how internal audit adds value, this key audience appears to be forgotten. For true internal auditors, the members of the audit committee are and remain the main audience. They each have specific strengths in their individual fields of expertise. However, they are not necessarily versed in the finer nuances of everyday operations in your organisation. That’s logical: they are not part of the day-to-day activities. However, this has a significant impact on the internal audit report. It means that the internal auditor, in order to make his reporting relevant, needs to pay particular attention to readability. Facts and findings need to be stated in such a way that they are understandable to each member of the audit committee.
  • The second audience is the group of auditees, in itself often quite a diverse group. Contrary to the audit committee members, they are very well versed in the language and the practices of your environment. They may consider a simplification of their reality an entirely inappropriate way of presenting their challenges. As an internal auditor, you also need to cater to their needs, even though their reporting requirements may be completely opposed to those of the audit committee members.
  • The challenge of good audit reporting does not end there. A good audit report contains recommendations that may be useful beyond the limited scope of the area under audit. It is therefore very important that the report describes these recommendations in a way that makes them usable outside of that initial, limited area.

Other issues that sidetrack auditors writing audit reports

Auditors are quite often sidetracked by aspects of report writing that should not be relevant at all, such as layout, formatting and so on. This may well distract them from paying appropriate attention to the content, to the story.

A good workflow does not distract the auditor from the content, but allows him to focus on it. We’ve developed a workflow which helps us in achieving this focus.

Essential aspects

Standardized Structure

To make a report relevant, the reader should be able to find his way around it. The best way to achieve that is to make it as recognizable as possible. We’ve been working hard on a standardized structure for our audit reports. Note that I write standardized, not standard.

An audit report needs to be recognizable, but not to the point that the form overshadows the content. To me, as Chief Audit Executive, an audit report works if it suits the purpose for which it was written. This means it needs to respond to the requirements of all the audiences we briefly discussed above. What works for us is a template with a recognizable structure and specific minimal requirements for what should at least be present, without limiting more in-depth descriptions of the findings and their context.

Short and concise

A second requirement is that the report needs to be as short and concise as possible, but not shorter. Of course, that’s easier said than done, but we always keep in the back of our minds that it takes more effort to write a shorter letter. We actually ask ourselves two questions for every phrase in the audit report:

  • The first question is “Says who?” This question checks whether we are clear on the source as well as the trustworthiness of the information we are sharing. It helps us ensure that the information that finds its way into the audit report is correct. If we cannot source a finding, or a clarification of a finding, correctly, we need to reexamine the finding and perhaps take it out of the report.
  • The second question is “So what?” By asking this question, we make sure the information really matters to our readers. If a point does not contribute to the evidence, the finding itself or a related recommendation, we remove it.

Tools I use

All that is of course quite ambitious. If we had to keep this in mind in an entirely manual process, I’m not sure we would be able to do it, given our small size. Luckily, we have a number of tools that allow us to do this in what we consider to be the most efficient way possible.

Basecamp

The first tool we use on a daily basis is 37signals’ Basecamp. While its use is not directly related to our audit reporting, it is one of the cornerstones of our workflow. Basecamp is our auditing hub, where we develop, share and execute work programs, upload documentation such as files and scans, write internal audit’s dispositions on the audit tests executed, and exchange any type of information. Its very simplicity allows us to focus on the work at hand. That simplicity also makes it easy to export, for example, a work program to a plain text file and carry it with us wherever we go. We’ve developed a standard work program template which is very easy to copy and adapt for the purposes of a specific audit, without forgetting all the traditional to-dos that are required in every audit.
The way we work means that at the end of the audit field work, prior to reporting, all to-dos have been documented and checked off from their respective to-do lists. The audit conclusions, which are the audit findings, remain in a to-do list for easy transport to the next step in the process. We also maintain a separate to-do list per project for all the recommendations. Hence, anything that needs to find its way into the audit report will be present in a to-do list in Basecamp.
Throughout the audit, documented in Basecamp, we identify the threads of our report. But a bunch of threads does not a report make. So we need to do something with these threads.

Mindnode Pro

Even with a good work program in place, we often find that findings are sets of disparate statements, loose threads, about issues raised during the audit. These disparate findings need to be connected to make a comprehensible whole for the audience. A consistent story needs to be woven. But in order to weave, you need a pattern.
Mind mapping software, such as Mindnode Pro, is an excellent tool for putting all those distinct pieces together into one whole. We weave our initial pattern with this excellent mind mapping tool. Once the initial pattern is established, we use OPML to transfer it to our next tool.
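
For the technically curious: OPML is nothing more than a plain, hierarchical XML outline, which is exactly why it travels so well between tools. The small Python sketch below shows what a Mindnode Pro export might look like and how its outline can be walked; the finding used here is purely illustrative, and the script itself is not part of our actual workflow.

    import xml.etree.ElementTree as ET

    # A hypothetical Mindnode Pro OPML export: one finding with its evidence
    # and recommendation as child nodes. Real exports will differ.
    sample_opml = """<opml version="2.0">
      <head><title>Audit X - report pattern</title></head>
      <body>
        <outline text="Finding 1: segregation of duties">
          <outline text="Evidence: workpaper reference"/>
          <outline text="Recommendation: split approval and payment roles"/>
        </outline>
      </body>
    </opml>"""

    def print_outline(node, depth=0):
        # Walk the outline recursively and print each node, indented by depth.
        for child in node.findall("outline"):
            print("  " * depth + child.get("text", ""))
            print_outline(child, depth + 1)

    root = ET.fromstring(sample_opml)
    print_outline(root.find("body"))

In practice we never script this step; we simply export from Mindnode Pro and import the OPML file straight into Scrivener.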

Scrivener

Which brings us to the core of our report writing process: Scrivener. If our findings are the threads, and Mindnode Pro allowed us to develop a pattern, Scrivener can be considered the weaving machine that brings those threads together, according to that pattern, into a comprehensible story.

Scrivener allows you to focus in depth on a specific paragraph or set of paragraphs to make them read right, while keeping the overall view at the very same time. I most often use the “index card view”, which allows you to bring those disparate threads together into a story. The process goes like this:

  • We export the pattern from Mindnode Pro via OPML.
  • The OPML export is opened in our existing Scrivener reporting template. This template allows us to maintain the standardized structure without sacrificing the flexibility to weave a story. The pattern we developed becomes part of a larger, well-known and well-understood pattern.
  • All the information required to write out each of the findings is available to us through the OPML import. This information is the baseline from which we write the individual paragraphs of the report. The tool allows us to focus on content and content only. We do not need to worry about layout or format.
  • We use the index card view to look at the flow of the story. If the findings are not structured well, the pattern will make no sense, and the structure of the story needs to be reviewed. The index card view makes this easy. You can easily rearrange entire parts of the story without having to rewrite them.
  • Once I’m happy with the overall flow, we export the whole document to a Microsoft Word file for formatting purposes.

Given that we need to have this draft report reviewed by multiple auditees before we can make it a final report, we still need to go through Microsoft Word. I hope one day to be able to replace this entirely with a combination of Markdown and LaTeX to generate a PDF which complies with all aspects of our visual identity. As I am only starting to discover LaTeX, we’re clearly not there yet. It will likely take me some evenings on a mission in Africa to make a breakthrough there.
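
To give an idea of what I am aiming for: a converter such as pandoc, driven by a small script, could in principle turn the Markdown draft into a branded PDF via LaTeX. The sketch below is purely hypothetical; the template and file names are made up, and we do not use any of this today.

    import subprocess

    # Hypothetical future step: turn the Markdown draft exported from Scrivener
    # into a PDF that follows our visual identity. Assumes pandoc and a LaTeX
    # engine are installed, and that "house-style.tex" is a custom pandoc
    # template carrying the house style (all names made up for illustration).
    subprocess.run(
        [
            "pandoc", "draft-report.md",
            "--template=house-style.tex",  # hypothetical corporate LaTeX template
            "--pdf-engine=xelatex",        # LaTeX engine that renders the PDF
            "-o", "audit-report.pdf",
        ],
        check=True,
    )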

Conclusion

Combining Basecamp, Mindnode Pro and Scrivener as tools for documenting identified threads, developing a pattern and weaving the story according to that pattern has led to a workflow which allows us to focus on the content rather than on the sometimes highly inefficient externalities of the audit reporting process. While I wish I could do away with Microsoft Word entirely (not because it is a bad tool, rather because it is an inefficient tool), we are not there yet.