Today’s government data center is, in all likelihood, an overwhelming environment. With miles of racked servers generating enough heat to fry an egg, multi-tenant environments with differing requirements, more capacity needed every day, and far fewer staff available than are needed to run the data center efficiently, data centers are bursting with opportunities for innovation. But where to start?
In the video below, Duke Butler, Director of Worldwide Field Marketing at Brocade, discusses how reducing complexity in data centers can help government agencies manage the coming data storm and actually become more efficient, even as the number of devices generating data and the volume of data to be stored and retrieved continue to grow. Counterintuitively, the solution does not lie in adding more capacity to the data center, but in understanding where the bottlenecks lie and tackling those problems to reduce complexity and improve operational efficiency.
This might sound like a marketing sound bite, were it not for the fact that virtualization technologies really are the key to managing complexity and alleviating bottlenecks while capping expenditures. It is essential to virtualize the entire network, not just the devices that connect to it. Government agencies should look for solutions that flatten the network, use soft switches, and employ sophisticated Ethernet fabrics. Combined, these virtualization strategies not only reduce the physical capacity needed, but also cut data center cooling costs, staffing costs, latency, and even time to deployment.
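To make the "flatten the network" point concrete, here is a back-of-the-envelope sketch (our own illustration, not something from the video and not specific to any vendor's products): in a classic three-tier access/aggregation/core design, server-to-server traffic can traverse five switches in the worst case, while a flattened two-tier leaf-spine fabric caps that at three. Fewer hops is one reason flatter networks tend to deliver lower latency with fewer devices to power, cool, and manage.

```python
# Rough, illustrative sketch only: counting worst-case switch hops between
# two servers in a traditional three-tier design versus a flattened
# two-tier leaf-spine fabric.

def worst_case_hops(tiers: int) -> int:
    """Worst-case switches traversed when traffic must climb to the top
    tier and back down again: 2 * tiers - 1."""
    return 2 * tiers - 1

three_tier = worst_case_hops(3)  # access -> aggregation -> core -> aggregation -> access
leaf_spine = worst_case_hops(2)  # leaf -> spine -> leaf

print(f"Three-tier worst case: {three_tier} switch hops")        # 5
print(f"Flattened fabric worst case: {leaf_spine} switch hops")  # 3
```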
It sounds too good to be true, but more and more organizations are embracing virtualization to bring their data centers into the 21st century. To learn more about data center virtualization, watch the video above.