Digital transformation and a better ability to harness the power of public sector data have been on the to-do list for government agencies for a while now, with initiatives like the Data Center Optimization Initiative (DCOI) and the Federal Risk and Authorization Management Program (FedRAMP) steadily pushing these efforts forward. But as the COVID-19 pandemic continues to disrupt "business as usual" through social distancing, many agencies are fast-tracking those modernization efforts to sufficiently support a remote workforce and its corresponding data management needs.
During the AFCEA Monthly Breakfast Webinar in April, Stuart McGuigan, CIO at the Department of State, shared some perspective on the rapid transition to teleworking in the public sector: “It was a little bit of a surprise to all of us just how quickly we can move. We can move quickly without compromise on controls, without compromise on security posture, without compromise on all those things we need to do to make sure things are up and working for our people.”
An increasingly remote workforce makes it imperative for agencies to optimize their data management strategies and leverage cloud-powered solutions to make data as accessible and agile as possible. This approach was recently discussed in a webinar on MeriTalk titled, “Improving the Value of Data with a Modern Data Platform,” in which industry experts Tim Garrod, Analytics Data Architect at Qlik, and Chris Tran, Senior Sales Engineer at Snowflake, showcased the benefits of a modern cloud platform in tackling public sector data management challenges.
During the webinar, Garrod and Tran both touched on the importance of not only data access, but fresh data access. In order to glean actionable insights from data in service of their respective missions, agencies need to ensure access to a single source of accurate data. Cloud-powered platforms are vital to that level of access and accuracy.
However, according to Tran, applying these modernization strategies effectively is just as important as, if not more important than, the implementation itself. “Many government agencies today are looking to modernize their legacy data strategies,” he shared in an interview with Government Technology Insider. “I would caution those efforts not simply be made for the sake of modernization, but to derive real value from change. Those changes must provide a simple, flexible, and scalable ability in an ever-changing data landscape. As systems are able to ingest limitless amounts of data, they must also be able to provide accessibility in a way that makes sense for the masses, not just the developer.”
“Modernization isn’t just a new term for re-platforming,” Garrod added during our discussion. “We have to start thinking about a new way to build and manage our data supply chains from data acquisition to transformation and delivery. When we do that it doesn’t take long to realize that foundational components to modernization are real-time data, code automation and platform scalability.”
Tran and Garrod also covered areas like digital accountability and transparency (both inter- and intra-agency), instant elasticity with the ability to scale seamlessly as agency needs change, and the importance of a ready-to-use data exchange. All of these elements speak to how cloud-powered data management reduces cost, complexity, friction, risk, and time-to-value.
To learn more about the cloud’s profound impact on enabling the public sector’s remote workforce, download the webinar with Garrod and Tran for on-demand viewing here.