Since the beginning of the pandemic, the digital transformation needs of federal agencies have only increased. Agencies are not only responding to citizen and worker demand for remote services akin to those they use in their daily lives; they are also adapting to changing mission requirements. Driving digital transformation at government agencies is no simple task. We spoke with Nick Psaki, Principal Technologist at Pure Storage, whose experience as a systems engineer and twenty years of service in the Army have given him key insights into how agencies can reduce the burden of technology transformation by uncomplicating data.
Government Technology Insider (GTI): What are the biggest challenges agencies face when implementing a modern data infrastructure?
Nick Psaki (NP): The five most important criteria government agencies must assess when evaluating a new technology are security, simplicity, scale, sustainment, and speed. If a new technology doesn’t successfully address all five, it’s not something a government agency can readily adopt and employ in an enterprise environment.
When a government agency makes a purchase, it is making a long-term investment designed to be sustained for decades. For those legacy investments to change, there must be a significantly compelling event.
Environments have evolved tremendously. Instead of all data and applications being hosted in government data centers, there are now well-established mechanisms for adopting third-party and cloud infrastructure from major cloud providers.
The truth remains that the government has been building data centers and technology infrastructure for as long as it has existed. The amount of data involved is massive. In some cases, there’s a statutory mandate to maintain data for decades or even centuries. The complexity of moving that data from one agency to another, such as to the National Archives and Records Administration, can be overwhelming.
All of these factors create a tremendous amount of diversity in the environment. Trying to uncomplicate that is a preeminent effort among government agencies that has driven the adoption of cloud service infrastructures.
Technology today makes it possible to leverage data in ways that weren’t feasible even ten years ago. We pioneered a methodology for deploying physical infrastructure on-premises under a cloud economic model. This reduces the burden of technology transition and transformation, and it streamlines the acquisition mechanics of consuming technology in a way that supports traditional procurement while enabling both on-premises and off-premises deployment.
The beauty of technology modernization is that the products themselves become simpler and less complicated to operate. By democratizing this process, you can drive government infrastructure and application infrastructure, deliver services, and leverage data for agency operations without requiring expert knowledge of the technology.
Enterprise infrastructure has made tremendous advances over the last ten years, implementing simpler technology that enables agencies to operate more efficiently than ever before. As technology improves, it should become less challenging to implement: you should be driving your technology, not letting your technology dictate your behavior and implementation.
GTI: What are some unexpected challenges you’ve seen agencies face in their modernization efforts?
NP: For many agencies, the transition to cloud-based data services has been an absolutely wrenching cultural shift. They’ve grown accustomed to owning and controlling the infrastructure, data, and processes. As a result, it can be jarring for them to move to an as-a-service model where they no longer own the infrastructure and get no say in where the data is placed. I’ve seen resistance to cloud adoption because it doesn’t adhere to how government technology has procedurally and institutionally operated for decades.
The pace of technological change has far outstripped most institutions’ ability to adapt to it. When I started at Pure Storage nearly eight years ago, the whole idea of all-flash or solid-state storage was seen as exotic, defined by niche capability requirements for high-performance computing and high-velocity data services.
As the amount of data increased, so did the demand to leverage that data. The adoption of flash storage fundamentally transformed the environment. But many people in various agencies still regard it as suited only to a very small set of use cases, even though it provides universal benefits across all of them.
Technology transformation moves at a pace that is difficult for large organizations to match. The scale at which new technology must be implemented within agencies often outstrips what is practically achievable. Moving large amounts of data takes time, resources, and sometimes even physical relocation. It’s a complex process with defined criteria that takes time to complete. The process can be faster today if you leverage cloud infrastructure, because it’s universally addressable. However, connecting the various agencies together still takes a lot of work and a lot of time.
Learn how to uncomplicate your data forever here.