The comment period is closing on the self-assessment framework that agencies will use to measure their compliance with the mandatory Technology Business Management accounting standard.
A government-industry team is looking to wrap up its work on an IT cost accounting assessment before the end of the fiscal year, as federal agencies plug along on deliverables for the Year One Action Plan under the new Federal Data Strategy.
Government officials and industry representatives working through the ACT-IAC organization have been developing a self-assessment tool for federal agencies to measure their progress adopting the Technology Business Management, or TBM, framework, which was developed by private-sector chief information officers to match IT spending to specific business outcomes.
Federal agencies are under an Office of Management and Budget mandate to adopt TBM accounting standards by 2022, and they are required to purchase TBM tools through the General Services Administration and get approval from the TBM Task Order Review Board, or TORB.
But to ensure this is more than an exercise in compliance, OMB included an action item in the Federal Data Strategy entitled “Improve Financial Management Data Standards,” with four milestones targeted for completion in 2020. Among those is a charge to “develop an IT spending transparency maturity assessment model,” an effort headed by the CIO Council’s Federal Technology Investment Management Community of Practice and supported by ACT-IAC.
The draft assessment includes 69 questions divided into six categories: engagement, taxonomy, data, automation, reporting and metrics, and value. A post on the ACT-IAC website explains the reasoning behind each:
A successful implementation involves engagement from many people across an organization, and none of this is possible without the commitment of management, leadership and service owners. TBM Taxonomy adoption provides a mechanism to approach the other dimensions. Having meaningful data is core to creating actionable information through reports and metrics. The magnitude of data changes frequently and would be difficult to manage in a manual manner. Automation provides a means to keep it up to date and is less error prone. Ultimately, in the final dimension, TBM increases value, enabling business outcomes and allowing the organization to operate more effectively.
But both the post and the assessment itself show the transition takes more than just turning on new accounting software.
“Implementing TBM doesn’t happen overnight, rather it matures over time as data quality improves and metrics that drive decisions are developed and utilized,” according to the ACT-IAC post. To account for this, the model asks respondents to rate the agency’s current progress toward each goal and to provide a separate rating for its planned future state.
The final assessment document is due by September 30, and Tuesday is the last chance to comment on the draft before it’s finalized.
The request for feedback is broken into 10 questions:
- How will the tool be put into practice?
- Who is the audience for taking the assessment, sharing the results and taking action?
- Will the tool provide sufficient information to develop an action/project plan for improving maturity—and value add—between review cycles?
- Will this maturity model fit the needs of an agency evaluating their progress on TBM implementation and maturity?
- There is a statement next to each dimension to set up the context/perspective for the assessment responses. Does the context statement resonate sufficiently?
- Currently, the ‘Future State’ time period is open ended: an organization can respond based on its own planning horizon—6, 12, 18 months. Should it remain open ended, or should we suggest annually or some other frequency?
- Do the statements within each maturity dimension fit within the context of that dimension? If not, please suggest where you think they would better fit.
- Are there any outstanding topics which impact TBM maturity and are not addressed by the assessment?
- Is the tool easy and intuitive to use?
- Do the definitions of the evaluation criteria provide sufficient detail?