Speakers at the ATARC Cloud Computing Summit in Washington, D.C., last month made the case that federal initiatives aimed at consolidating and optimizing data centers are cloud computing initiatives as well.
Dominic Sale, the General Services Administration’s (GSA) deputy associate administrator for information integrity and access, said that putting data centers at the top of the agenda for a cloud summit was “a very insightful move.”
The Federal Data Center Consolidation Initiative (FDCCI) was first established by the Office of Management and Budget (OMB) in early 2010. The topic resurfaced in the Federal Information Technology Acquisition Reform Act (FITARA) of 2014; data center optimization “is buried in the middle of FITARA,” Sale said.
Now a reboot of the FDCCI is underway. OMB issued a draft memo in March inviting public comment on a proposed framework for meeting FITARA requirements, the criteria for successful agency data center strategies, and the metrics OMB will use to evaluate the strategies’ success. The Government Accountability Office (GAO) will be required to verify what agencies claim as their data center inventories, Sale said.
It’s a four-point strategy, he said: freeze each agency’s current footprint and any future data center expansion; close existing data centers; move agencies to the cloud; and promote the use of shared services.
GSA has been assigned some important responsibilities in the draft memo, Sale said, which the agency assumes will carry over into the final version when it’s issued. Sale’s office will have to define what “significant expansion” of an existing data center means, qualify the requirements for shared services, monitor those shared services and the standards for operating them, establish online support tools for evaluating them, and establish online acquisition tools.
“Strategies like the ones described by Sale will greatly help agencies meet their data center consolidation and optimization goals,” noted Joe Kim, SVP and Global CTO at SolarWinds. “What this enables is cost reduction through efficiency gains, and infrastructure can be taken off the books altogether – it’s a holistically more sustainable way to operate,” he said.
At the State Department, Melonie Parker-Hill, division chief of the Enterprise Operations Center, said the department has established a modular data center to facilitate sharing services in-house. “It allows us to meter [use] at the equipment level and [we have] metering tools to capture real-time information,” she said. The department also runs a private in-house cloud that “we compared to the private sector and believe we’re pretty competitive” on price and service offerings.
One agency that has embraced the cloud-first, shared-services mantra is the Agriculture Department. USDA’s Chuck Gowan, chief architect for enterprise data centers, works at the department’s National Information Technology Center (NITC) in Kansas City, a Tier 4 facility that serves not only the 29 agencies and offices of USDA but also a number of other federal departments and agencies.
“[We’re] a provider of government-only cloud services,” Gowan said. “Our customers primarily subscribe to the cloud, offered as a service and a platform-as-a-service, [and] we’re FedRAMP certified.”
NITC offers its PaaS for a wide range of environments, including UNIX, AIX, and even mainframes, he said. “We understand that consolidation is difficult enough, without having to rewrite applications.”
Gowan said agencies share an almost universal misunderstanding of FedRAMP. They think it’s “nirvana,” but it’s not. “It only applies to infrastructure, not the operating layers, [so we] help agencies understand why they have to go beyond FedRAMP to have secure networks,” he said.
For Kim, another important reminder for data center managers is that agencies shouldn’t move to the cloud just to “re-host.” He encouraged IT leaders to “take advantage of all the benefits of the cloud, such as bandwidth elasticity, improved responsiveness, and flexibility, and identify and employ IT management and monitoring tools that make these benefits accessible.”
The most noteworthy speaker at the ATARC event may have been John Messina, a computer scientist at the National Institute of Standards and Technology (NIST). He spoke on developing modular sections for the Service Level Agreements (SLAs) built into cloud computing contracts.
NIST examined a number of SLAs from different agencies to try to identify common elements that agencies could draw upon as templates for best practices, Messina said.
“We found wide variation among SLAs – like apples to oranges to footballs,” he said. “You can have two identical cloud services, but with vastly different SLAs. [Currently] a customer has to look at each SLA as a one-off; if they go look at another cloud provider, they have to draft brand new SLAs.” In many cases, identical words and phrases have different meanings, but the customer agency and the cloud provider don’t even realize they aren’t talking the same language.
This is a big problem, because misconceptions about what SLAs do and don’t include often lead to contract disputes, he said. “You don’t want vocabulary and definition of terms causing headaches. You need modular standards [and] building blocks,” Messina suggested.
As a result, NIST is working on establishing that common vocabulary, a modular set of reusable components, and a catalog that will include ready-to-use metrics, along with guidance documents and examples. The new standard is up for international approval now, he said.
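Messina’s building-block idea is easiest to picture as a shared catalog of precisely defined metrics that any agency’s SLA can reference. The short Python sketch below is purely illustrative; the class and field names are invented here and are not NIST’s actual schema:

    from dataclasses import dataclass

    @dataclass
    class SLAMetric:
        """One reusable metric drawn from a shared catalog of SLA building blocks.

        Hypothetical structure for illustration only, not NIST's schema.
        """
        name: str                # controlled-vocabulary term both parties agree on
        definition: str          # unambiguous statement of how the metric is measured
        unit: str                # e.g. "percent"
        measurement_window: str  # e.g. "calendar month"
        target: float            # the committed service level

    # Because both agreements draw the same term from the same catalog,
    # two providers' offers can be compared metric for metric.
    uptime = SLAMetric(
        name="monthly_uptime",
        definition="Percentage of minutes in the window during which the service responds",
        unit="percent",
        measurement_window="calendar month",
        target=99.9,
    )

The point of the modularity is exactly this kind of apples-to-apples comparison: if every SLA assembles the same catalog entries, a customer no longer has to treat each agreement as a one-off.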
NIST will hold half-day workshops on the ISO/IEC 19086 series, “where government agencies can bring their SLAs in and learn how to map to the new modular standards,” Messina said.