Is Your Data Center Being Challenged to Store, Manage and Protect Petabytes of Data?

In today’s ever-changing world, businesses face an ever-growing volume of data to store, manage and protect. This technological fact of life raises many questions for IT teams:

  • How do I ensure my data is both available and secure?
  • How do I efficiently manage workloads and meet the needs of end users?
  • How do I control all of this within budget restrictions?

And when you are a large federal agency, the difficulties intensify.

Such was the case for a high-profile federal customer with a mission-critical need to improve their storage infrastructure while supporting back-up, recovery and continuity of operations crossing two environments: high-performance and capacity-based. The agency had previously handled data replication by way of a complex, off-box system that required multiple levels of redundancy and checkpoints.

Not unlike other federal IT teams tasked with the need to store, manage and protect petabytes of data, the agency faced many challenges; however, their goals were clear:

  • Protect data at risk due to backups taking too long;
  • Streamline file-sharing operations;
  • Mitigate risk of increasing costs;
  • Adhere to compliance mandates.

Security has long been a top concern for federal agencies tasked with protecting highly sensitive information. But as more agencies move to the cloud, they are sharing lessons learned along the way. Lt. Gen. Alan Lynn, director of the Defense Information Systems Agency, shared at DISA’s Forecast to Industry event Monday that software-defined networking makes it “easier to defend [a network] if you can build multiple, equal networks that are identical in a row. And then if you have an attack on one of the networks, you can fold that network and move your users over to the next network.” On the data center side, virtualization allows the Pentagon to “spin up a capability whenever we need it,” Lynn explained — again leading to cost savings and added speed.

IronBrick is no stranger to federal IT needs and challenges. After a comprehensive evaluation of the customer’s existing IT environment, IronBrick engineers designed and implemented a leading-edge data center storage solution built on NetApp technology.

The solution included NetApp clustered Data ONTAP (cDOT) and NetApp FAS storage controllers, delivering the IT efficiency, business agility, uptime and simplified data management required for mission-critical activities. The agency’s performance-critical tasks could now run without interruption while it dynamically assigned, promoted and retired storage resources and improved service levels, all in a secure environment.

Additionally, two NetApp capabilities were integral to the project’s success. SnapMirror allowed agency data to be replicated on a schedule without extra hardware components, replacing the complex off-box replication system. Likewise, on-box deduplication increased the storage efficiency and available capacity of the primary and secondary storage devices, enabling growth while maintaining performance.
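To illustrate the kind of configuration involved, the sketch below shows how a scheduled SnapMirror relationship and on-box deduplication are typically set up from the clustered Data ONTAP CLI. The SVM and volume names are hypothetical, and exact options vary by ONTAP release; treat this as an illustrative outline, not the agency’s actual configuration.

```shell
# On the destination cluster: create a data-protection mirror of the
# source volume, updated automatically on the built-in "daily" schedule
# (SVM and volume names here are placeholders)
snapmirror create -source-path svm_src:vol_users \
    -destination-path svm_dst:vol_users_mirror \
    -type DP -schedule daily

# Perform the initial baseline transfer
snapmirror initialize -destination-path svm_dst:vol_users_mirror

# Verify the relationship state and last transfer
snapmirror show -destination-path svm_dst:vol_users_mirror

# Enable on-box deduplication on the source volume to reclaim capacity
volume efficiency on -vserver svm_src -volume vol_users
```

Because replication runs on-box against storage-level snapshots, no separate replication appliance or checkpoint system is needed; once initialized, the relationship keeps the mirror current on the defined schedule.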

The new technology accomplished the agency’s goals by providing:

  • Increased speed of access to end-user files (for user shares, home directories, etc.);
  • Performance enhancements and improved turnaround time between requests and provisioning;
  • Simplified backups;
  • Reduced operational overhead – transitioned 400+ individually managed Windows File Server VMs to a single NetApp system.

“Our solution was a simple and elegant approach that cut out multiple processes and technologies,” explains Steve Miller, IronBrick’s Director of Consulting. “This enabled the customer to recoup cost, space, power and cooling while providing an extremely effective service to its end users.”
