As Agencies Move to the Cloud, it’s Time to Prioritize Sharing Over Siloes

Legacy systems are the vampires of federal IT systems, sucking up financial and manpower resources and draining federal agencies’ ability to modernize. Since 2010, two initiatives – “cloud first” and data center consolidation – have served as the primary tools for addressing the drain caused by these obsolete systems.

A third initiative, not designated as such but a consequence of the first two, is finding ways to kill legacy applications, two officials suggested during a panel at the MeriTalk Cloud Connect conference.

Walter Bigelow, the Bureau of Alcohol, Tobacco, Firearms and Explosives (ATF) division chief, IT Services Management, and Dominic Sale, the General Services Administration (GSA) deputy associate administrator for information integrity and access, said that eliminating little-used apps makes a real difference.

“If it costs $50,000 a month and three people are using it,” look at axing it, Bigelow said.

“I’m excited about going through these application scrubbing conversations,” Sale agreed.

“We are identifying a little more clearly the operating costs of [legacy] systems,” Bigelow said. “We’re reducing the manpower and freeing up money” previously used to support them in order to reinvest in modernization efforts.

Bigelow said ATF uses several clouds. “Why pay a 15% premium for GovCloud for a public-facing app?” he said. “Public cloud has good enough security.”

Sale said he doesn’t see cloud security as a top-of-mind concern any more. “There’s a lot of analysis paralysis” over all the different flavors of cloud – Software-as-a-Service, Platform-as-a-Service, Infrastructure-as-a-Service, and so on, he said. “Hit the easiest stuff and just go up the chain.”

During a second panel discussion, on the multi-faceted cloud approach, Greg Capella, deputy director of the Department of Commerce’s National Technical Information Service (NTIS), said moving to the cloud is “really about getting all the constituents to buy in … [T]hink about procurement rules. If procurement puts out a fixed-price contract for 12 servers but the money authorized can now buy 14, what do I do?”

Syed Azeem, senior IT project manager for the Department of Labor and advocacy and outreach lead for the Federal Cloud Center of Excellence, agreed.

“One of the challenges is not so much technical but the acquisition rules, contracting, budgeting,” Azeem said. “So we try to understand the challenges agencies are facing on their journey to cloud” and provide information that helps.

Azeem observed that agencies have up to four decades’ worth of data centers built up, based on organizational structures.

“It shouldn’t be a one-to-one relationship between ‘I have a data center and I have a specific mission,’” he said. “There’s a strong bias for private cloud because that’s what people are most comfortable with.”

But in an information-driven time, data, mission, and services are likely to cut across agencies, added Sam Capone of CSRA. In order to continue delivering on an increasingly complex mission with smaller budgets than ever before, all siloes – be they data, infrastructure, applications, or funding – need to be broken. “Agencies need to use their move to the cloud, not to build more silos, but to build infrastructure that enhances the next generation of mission-focused solutions, like shared services,” he concluded.
