Originally published in the October 2018 issue of Career Education Review, and reproduced here with permission. Article originally posted at careereducationreview.net
Over the summer, I have been reflecting on the impact of the September 2016 Guide for Audits of Proprietary Schools and for Compliance Attestation Engagements of Third-Party Servicers Administering Title IV Programs (Guide). During these reflections, I have considered “mindset” and how it can shape the audit experience. Personally, as a runner, I know some pain and discomfort are expected; knowing this lets me plan and train for those effects by establishing the appropriate mindset for my goals. Similarly, the right mindset helps auditors and institutions find solutions that make the audit process more efficient and less onerous under the Guide. Difficult projects don’t get easier by putting them off or ignoring their known impacts. An efficient audit process requires the right mindset from both institutions and auditors: they must recognize the difficulties the Guide poses in some areas, be realistic about the risks it presents, and understand the measurable outcome metrics in order to establish improvement goals. Successful organizations actively plan and set goals for improvement, and that is the approach necessary for your institution’s compliance audits under the Guide.
All for-profit post-secondary institutions have been subject to their first annual compliance audit under the Guide. As expected, the Guide added significantly more audit work, increased the hours necessary to complete the audit, and delayed the issuance of reports. During July, our firm spent time quantifying the hours and the impact on our firm and on our clients. We believe institutions and auditors need to measure quantifiable metrics in order to set specific goals for improving audit efficiency and issuing audit reports earlier. Before we look at these metrics, let’s review what we learned in the new audit areas versus our expectations with regard to the amount of testing performed and the audit results.
The Guide increased or added testing in four broad audit areas:
- 90/10 calculation
- student eligibility (student sample size, student confirmations)
- gainful employment
- institutional eligibility/administrative capability (Annual Security Report/crime statistics, placement rates, servicer contracts and written procedures, and incentive compensation).
Let’s review these four key areas as compared to our expectations. We will then review overall numerical statistics specific to our firm for compliance audits performed under the Guide. As a side note, the Office of Inspector General has posted Questions and Answers pertaining to the Guide. We believe institutions should review these to understand any clarifications to the Guide.
90/10 Calculation
The 90/10 calculation was the only significant portion of the Guide to impact financial statement audits. The Guide indicated that the 90/10 calculation must be compiled by the institution on a student-by-student basis, and that the auditor must report any error uncovered along with the auditor’s opinion of the correct 90/10 rate. The amount of testing we performed was higher than in prior years, as expected. Our total time to audit the 90/10 calculation increased from 381 hours for the July 1, 2016, to June 30, 2017, period to 664 hours for the July 1, 2017, to June 30, 2018, period. Our testing was more detailed than in prior years because many institutions implemented new systems and processes to compile the 90/10 calculation in order to comply with the new requirements. We advised our clients to compile the 90/10 calculation in advance of year-end, so almost 100 percent of our clients had compiled the calculation on a student-by-student basis. However, due to the detailed nuances of the calculation, even clients that had set up the process and tested the calculation in advance of year-end still had some quirks which needed to be cited as findings. While these findings were, for the most part, immaterial variances, we were required to cite them since the Guide provides for zero error tolerance. The errors mainly related to the treatment of application fees and the related payments as institutional charges, alternative loans treated as meeting the presumptive rule, and state grants not treated as meeting the presumptive rule. These items should be easily corrected by ensuring the fund source mapping and the setup of the various charges are correct in the student information system.
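For illustration only, here is a minimal sketch of a student-by-student 90/10 compilation in Python. The record fields and sample figures are hypothetical, and the sketch deliberately ignores many of the regulation’s nuances (the presumptive rule, caps on revenue counted, cash-basis timing, and so on); it simply shows the kind of per-student detail the Guide expects institutions to prepare before the rate is computed.

```python
# Simplified, illustrative 90/10 sketch. Field names and figures are hypothetical;
# this ignores many regulatory nuances (presumptive-rule ordering, revenue caps,
# non-Title IV fund sources, cash-basis timing, etc.).
from dataclasses import dataclass


@dataclass
class StudentRevenue:
    student_id: str
    title_iv_revenue: float   # Title IV funds counted for the student
    total_revenue: float      # all revenue counted for the student


def ninety_ten_rate(students: list[StudentRevenue]) -> float:
    """Return the Title IV share of revenue across all students."""
    title_iv = sum(s.title_iv_revenue for s in students)
    total = sum(s.total_revenue for s in students)
    if total == 0:
        raise ValueError("No revenue to evaluate")
    return title_iv / total


students = [
    StudentRevenue("S001", 9_500.00, 11_000.00),
    StudentRevenue("S002", 4_200.00, 6_300.00),
]
print(f"90/10 rate: {ninety_ten_rate(students):.2%}")  # must not exceed 90 percent
```

Compiling the data at this level of detail is also what makes it practical to trace findings (for example, a misclassified application fee) back to the fund source mapping in the student information system.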
Student sample size and confirmation
The second major area of the Guide related to student sample size and student confirmations. We anticipated an increase in file testing hours because the sample size increased from 75 to 120 files for larger institutions. Of this increase, for institutions whose drop rate was below 33 percent, 35 of the additional files were withdrawn or dropped students, who inherently have a higher risk of error. Small schools were not exempt from additional testing: under the Guide, if the withdrawn or dropped population was 25 or fewer students, the auditor must test the entire population. Our expectation was that file testing time would be 50 percent more than in prior years; our results showed that student file testing time increased approximately 35 percent over the prior year. Testing increased substantially whether the work was done in our office or at the institution. The additional files also meant institutions spent more time responding to open items and questions and taking corrective action for findings. Findings increased significantly, especially those related to R2T4 calculations, post-withdrawal disbursements, exit interviews, late refunds, and verifications. In addition, because we tested more student files, findings cited more instances of student errors, and fewer findings cited only one or two students. Institutions should review their policies and procedures around “hot” ED issues such as verification, R2T4 calculations, and timeliness of refunds to determine whether an internal second-review process can be implemented to prevent or limit these critical findings.
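As a rough aid, the sketch below encodes only the sampling thresholds summarized in this article (120 files for larger institutions, 35 withdrawn or dropped files when the drop rate is below 33 percent, and full testing of a withdrawn or dropped population of 25 or fewer). The Guide’s actual sampling tables contain more cases, so treat this as a simplified illustration, not the Guide’s methodology.

```python
# Simplified sketch of the sampling thresholds summarized above. It covers only
# the cases mentioned in this article; the Guide's own sampling tables govern.
def student_sample_plan(total_population: int, withdrawn_population: int,
                        base_sample: int = 120) -> dict:
    """Rough file-testing plan; base_sample is the overall sample size
    (120 for larger institutions per the discussion above)."""
    drop_rate = withdrawn_population / total_population if total_population else 0.0

    if withdrawn_population <= 25:
        # Small withdrawn/dropped population: the auditor tests all of those files.
        withdrawn_files = withdrawn_population
    elif drop_rate < 0.33:
        # Drop rate below 33 percent: 35 of the sampled files are withdrawn/dropped.
        withdrawn_files = 35
    else:
        # Higher drop rates fall under the Guide's own tables; not modeled here.
        withdrawn_files = None

    return {"total_files": base_sample,
            "withdrawn_or_dropped_files": withdrawn_files,
            "drop_rate": round(drop_rate, 3)}


# Example: a larger institution with 900 students, 180 of whom withdrew or dropped.
print(student_sample_plan(total_population=900, withdrawn_population=180))
```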
While the student file testing results were as expected, the student confirmation process went better than expected. The time we incurred was less than anticipated, averaging 2-3 hours per audit. A team at our office developed an efficient process that enabled us to send the confirmations electronically using Word, Excel, and other Office tools. Once we received the appropriate student demographic information, the process was relatively automated and seamless. In addition, we used a dedicated email account to send and receive the confirmations. While the process to send confirmations was smooth, the confirmation response rate was abysmal, as we had expected. Because we anticipated very low response rates, we performed the alternative procedures (attendance records) during our student file testing, which minimized subsequent follow-up with institutions. Overall, the entire confirmation process took less than 5 hours per audit versus the 10 hours we had expected. Institutions should review their student information systems to ensure the information needed for the student confirmations can be easily generated, or determine whether system changes are needed in advance of the next audit cycle.
Gainful Employment
The third major area under the Guide related to gainful employment. This area had not previously been audited and, as a result, the necessary audit hours were somewhat of an unknown. We anticipated approximately 10 hours to test the gainful employment reporting and the gainful employment templates. Our actual results were closer to 15 hours per audit. As we discuss later in this article, gainful employment resulted in a number of audit findings and was a “rough” audit area. Obtaining the gainful employment reporting files from NSLDS was a relatively smooth process. However, many institutions had a finding related to gainful employment reporting. These findings mainly related to programs not reported (mostly discontinued programs), incorrect CIP codes, incorrect tuition and fees, unusually low tuition and fees, and incorrect classification of institutional debt versus private educational loan debt.
Similarly, obtaining the gainful employment disclosure template (GEDT) supporting information wasn’t overly burdensome, but the audit results reflected numerous findings. The GEDT frequently contained incorrect off-campus housing costs, an incorrect interest rate, incorrect cohorts for the on-time completion rate and median loan debt, an on-time completion rate based on 150 percent instead of 100 percent of the program length, and an incorrect cohort for the percentage of borrowers.
The final area related to gainful employment was the required warnings. This area took less time than expected, as we audited very few programs that failed the debt-to-earnings rates in year one. Most programs that failed had already been taught out, were in the process of being taught out, or had an RGEES appeal performed. As such, we didn’t have many instances in which we had to audit the gainful employment warnings process.
In light of the proposed changes to gainful employment regulations, we are evaluating what testing will be required if the regulations become effective prior to Nov. 1, 2018. In the interim, we will continue to perform the testing as required in the Guide. Institutions should continue to follow the gainful employment proposed regulations and discuss the potential impact with their auditors.
Institutional eligibility and administrative capability
The final area of the Guide which we anticipated would increase audit work was institutional eligibility and administrative capability. A number of new audit areas were added or expanded in these sections. The first was the required testing of the Annual Security Report (ASR) disclosures under the Clery Act. While this wasn’t specifically in the prior audit guide, we had begun testing the ASR over a decade ago because we deemed it a risk for institutions: ED was auditing the ASR and the Clery Act disclosures during program reviews, and it was communicated at conferences that ED deemed this a critical compliance item. Therefore, we believed it was an important area to audit even though it wasn’t mentioned in the prior audit guide, and adding this item to the Guide didn’t affect our audit work. The only new audit step was testing the institution’s attempt to obtain crime statistics from local law enforcement offices. Overall, the testing of the ASR was consistent with past audits, and so were the audit results.
The Guide required auditors to test placement rates when they are used in advertising. This item caused us some concern, because placement rates have been a “hot” issue for regulators, consumer groups, and state attorneys general. In addition, we weren’t 100 percent certain how many of our clients actually used placement rates in advertising. From our experience testing placement rates as part of an accreditor’s employment verification process, we were concerned about the amount of time we would spend in this area. However, very few of our clients advertised placement rates. Many of our clients see advertising placement rates as a large misrepresentation risk and, as such, do not do it. Thus, the impact on our audit hours was minimal. If an institution does advertise placement rates, we recommend that detailed supporting information be maintained to provide to the auditors.
The Guide also required auditors to expand their internal control documentation. In conjunction with this item, auditors had to obtain all servicer contracts to ensure each contract properly stated the functions performed and was in accordance with 34 CFR 668.25(c). We have developed and maintained internal control memos/checklists on our audits for over 20 years, so this was not an area which caused a significant change in our audit time. We were able to revise and expand our internal control memos/checklists, and overall the time was consistent with past audits. We did incur more time obtaining and reading the servicer contracts, and the time spent was consistent with our expectations. From reviewing this information, we did have some management comment letter suggestions for institutions.
The final key area under institutional eligibility/administrative capability related to incentive compensation testing. The Guide significantly expanded the required testing related to an institution’s employees and also required the auditor to review third-party contracts for entities that perform recruiting services or award Title IV aid to ensure these entities are not paying incentive compensation. Increased audit hours were necessary to test the institution’s admissions and financial aid employees, which included reviewing additional documents (e.g., annual reviews) and performing interviews to ensure nothing of value (e.g., tickets, trips) was provided. We also had to review third-party contracts, discuss with management whether third-party entities were providing recruiting services or awarding Title IV aid, and document the results of this testing in a memo. The hours incurred were similar to past audits, as our incentive compensation procedures have been fairly extensive in past audits as well. Institutions should review all third-party contracts in advance of their next audit to ensure the contracts are complete per the federal regulations. In addition, if the contracts are provided to the auditor early in the audit process, the auditor can review this information in advance of fieldwork.
Now that we have reviewed these key issues, let’s look at the audit results. Improvements can only be made to the audit process if specific measures are tracked and analyzed and new goals are established.
As a reminder, auditors may group or categorize findings differently. We may have grouped various student instances together in one finding (e.g., Pell overaward/Pell underaward, NSLDS transfer monitoring/NSLDS change of status) while another auditor may have reported these instances as separate findings. Therefore, the statistics below are relevant only as a comparison of our work year-over-year. Key metrics from our June 30, 2017, and Dec. 31, 2017, audits are as follows (a minimal sketch for tracking such metrics appears after the list):
- Increase in average audit hours from a little over 100 hours to almost 160 hours.
- Increase in median audit hours from approximately 90 hours to 140 hours.
- Total findings increased from 148 (67 compliance audits) to 399 (94 compliance audits).
- Increase in average audit findings from 2.25 to 4.25 findings. Increase in median audit findings from 2.00 to 4.00 findings. (This includes some audits with zero findings, which skews the average and median results.)
- A gainful employment report finding occurred in 25 percent of reports and a gainful employment disclosure template finding occurred in almost 70 percent of reports.
- The average number of days reports were issued in advance of the statutory deadline (six months after fiscal year-end) decreased from just over 60 days to just under 30 days.
- The average number of days reports were issued after the last day of fieldwork increased from just under 32 days to just over 33.5 days. (This metric didn’t change as much because we had a number of audits for which fieldwork was completed in May or June and thus the reports had to be quickly issued.)
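As a minimal illustration of how such metrics can be tracked year-over-year, the sketch below computes averages and medians from per-audit records. The records and field names are hypothetical examples, not our firm’s data, and an internal tracking spreadsheet or tool would serve the same purpose.

```python
# Illustrative year-over-year metrics tracking; sample records are hypothetical.
from statistics import mean, median

audits_fy2017 = [
    # hours worked, number of findings, days issued before the statutory deadline
    {"hours": 95, "findings": 2, "days_before_deadline": 70},
    {"hours": 110, "findings": 3, "days_before_deadline": 55},
    {"hours": 160, "findings": 5, "days_before_deadline": 20},
]


def summarize(audits: list[dict]) -> dict:
    """Compute the averages and medians used to set improvement goals."""
    return {
        "avg_hours": mean(a["hours"] for a in audits),
        "median_hours": median(a["hours"] for a in audits),
        "avg_findings": mean(a["findings"] for a in audits),
        "median_findings": median(a["findings"] for a in audits),
        "avg_days_before_deadline": mean(a["days_before_deadline"] for a in audits),
    }


print(summarize(audits_fy2017))
```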
A couple of years ago, we started tracking the last two metrics because our clients normally want audits issued well in advance of the statutory deadline and completed as soon as possible after their year-end. As a firm, our goal is to improve these metrics beginning with the June 30, 2018, audits, and I am sure our clients will be happy if we can issue the compliance audits earlier. Institutions can be proactive in making the audit cleaner and more efficient. Before your next compliance audit, consider these three action items:
- Debrief with your auditor. Discuss what worked and what didn’t, where delays occurred, and where the process can be made more efficient. Set a timeline with the auditor, realistic for both parties, that covers sending the information request checklist, receipt of documents from the institution, and issuance of the compliance report by the auditor. Consider performing some testing in advance of year-end, especially if the institution has a Dec. 31 year-end.
- Review the information gathering process. Gather information such as the ASR, servicer contracts, completion/graduation rates, bank statements, monthly Federal Direct Loan reconciliations, etc., since a large portion of the institutional eligibility and reporting information is available prior to year-end. Providing as much of the information on the auditor’s request checklist at one time as possible makes the process smoother.
- Review the information on key reports. Test the 90/10 calculation, review the ASR, and review policies and procedures, either internally or using a third party, in advance of the audit to minimize the risk of a finding.
Ignoring the impact of the Guide won’t make for an easier audit in future years. Successful organizations solve problems by understanding the impacts, analyzing the results, and making proactive changes to improve the process. The right mindset yields the desired outcomes.