Data and analytics have long been important to federal agencies. But after the COVID-19 pandemic brought about permanent structural change in the federal government’s IT infrastructure, they are becoming central to mission success. Yet as many advantages as becoming a data-driven organization brings, it also raises many questions: how should data be used, who should have access to it, and do agency workers even have the skills needed to get the most out of it?
For Mike Potter, CTO of Qlik, these questions about how federal agencies can become truly data-driven organizations are top of mind in 2021. The mass pivot to the cloud and to ‘as-a-service’ models that the federal government undertook at the beginning of the pandemic to continue to deliver on the mission has, from Potter’s perspective, made being nimble, agile, and responsive the new norm for agencies. “The wide deployment of SaaS and cloud-based systems has drastically accelerated their plans for wider deployment of SaaS and cloud-based data and analytics models,” he shared.
Whether it’s making permanent a data-driven supply chain model first used to manage the supply of PPE during the pandemic or supporting the roll-out of self-service analytics, many agencies, led by the Department of Defense, are investing heavily in cloud-based data and analytics. “Relying on legacy on-premise data environments eliminates tremendous scalability and cost-savings benefits, along with the nimbleness,” he explained. And, as an added benefit, when agencies embrace these cloud-based data and analytics solutions, they free up resources – both budget and people – to solve other mission-critical challenges.
While Potter is enthusiastic about this pivot in the federal government, he recognizes that for many organizations, this acceleration also comes with “inherent cultural challenges.” These challenges coalesce around a pair of opposing problems. “On the one hand agency workers aren’t data ready and then, on the other hand, we have to break down the idea of technology specialists being the data gatekeepers,” he explained.
The key to success is for the C-suite to drive that change by being willing to “tear down the perceptual and conceptual barriers that pigeonhole workers into binary categories of those who are educated in data and those who aren’t,” explained Potter. The democratization of data certainly presents challenges when it comes to data security, sovereignty, and governance, but there are many solutions and approaches that can bridge this gap. “There are great advances in augmented intelligence embedded into analytics platforms to enable nascent users to explore data more confidently and successfully,” he shared. “There’s also been tremendous growth in data literacy initiatives and free training that can improve data skills democratization.” By using these tools and working with trusted partners on training, it is entirely possible to create a culture that’s fueled by data, where all workers become instrumental in identifying process improvements and finding new ways to use data to deliver on the mission.
Change – unless apparently driven by the demands of a global pandemic – is rarely easy for any organization, let alone one as complex as the federal government. However, the last year has proven that change is possible. The structural changes – such as moving to the cloud and embracing an as-a-Service approach – catapulted federal agencies far into the future and well into a new pattern of definition and growth. The enormity of these technological changes means that the cultural changes – breaking down those silos, sharing the keys to the data kingdom – aren’t polite wishes, but rather inevitable. “As much as we may initially think of COVID and the post-COVID world as a time of contraction and retrenchment, it is also a time for redefinition and growth,” concluded Potter. “It’s the moment to embrace new data and analytics models to drive the mission forward.”