

Managing Cloud-Based Systems

Share and Manage Data Effectively

Figure 1

U.S. government agencies have long operated in silos when it comes to data. Different agencies tend to keep their data in different databases or archives, with little to no sharing among them.

That disconnect can lead to inefficiency and breakdowns in communication, which in extreme cases could have catastrophic consequences. One way to combat the problem is to move data to a private, secure cloud environment.

Phil Jackson, chief strategy officer, Front Porch Digital

By now it is common knowledge that the cloud offers the ability to centralize data and streamline processes and workflows. Moving to the cloud has already helped thousands of organizations in the private sector be more agile and efficient, and share and collaborate more easily, even when their data requires a high degree of protection and security. The same can be true for government agencies.

For example, a private cloud is a way for agencies to add disk space without having to buy more disks—an attractive selling point considering that many agencies are adding petabytes of data to their archives every year. With most cloud implementations, organizations pay for only what they use and can easily add more storage space without having to procure new hardware.

The right cloud implementation allows for seamless integration into shared storage, generic collaboration across many diverse applications, and usage options with existing internal and hosted software applications. Furthermore, private cloud implementations of the caliber needed for government would go beyond this simple file system to include a content storage management (CSM) system, which is critical to the success of private, secure cloud environments.

Why is CSM so critical? Because adopting a cloud implementation solves only part of the problem—that is, storing massive amounts of data. It doesn’t solve the issue of access, management and archiving of that data.

In other words, having a cloud only means there’s a larger, less-costly basket into which the eggs are consolidated. Without a CSM-type system in place, agencies could still face challenges related to seamless, federated access to their content.

Adopting a CSM system will ensure that agencies can maximize their investment in the cloud.


A CSM system (Figure 1) is the software abstraction layer that automatically retrieves content from any storage infrastructure — whether it is a disk, a data tape library (with the aid of a robot), an optical archive, or any other form of storage — from any connected agency, delivering it anywhere it might be needed.
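The abstraction the article describes can be sketched as a common retrieval interface over heterogeneous storage back ends: callers ask for content by identifier and never need to know which tier holds it. The class and method names below are hypothetical, invented for illustration — the article does not describe any specific API:

```python
from abc import ABC, abstractmethod

class StorageBackend(ABC):
    """One physical storage tier (disk, tape library, optical archive)."""

    @abstractmethod
    def contains(self, content_id: str) -> bool: ...

    @abstractmethod
    def fetch(self, content_id: str) -> bytes: ...

class DiskBackend(StorageBackend):
    """A trivial in-memory stand-in for a disk tier."""

    def __init__(self, store: dict):
        self.store = store  # content_id -> bytes

    def contains(self, content_id: str) -> bool:
        return content_id in self.store

    def fetch(self, content_id: str) -> bytes:
        return self.store[content_id]

class CSM:
    """Abstraction layer: resolves a content ID against every
    registered tier and returns the content wherever it lives."""

    def __init__(self, backends: list):
        self.backends = backends

    def retrieve(self, content_id: str) -> bytes:
        for backend in self.backends:
            if backend.contains(content_id):
                return backend.fetch(content_id)
        raise KeyError(f"{content_id} not found in any storage tier")

# Usage: the caller does not know (or care) which tier holds the file.
csm = CSM([DiskBackend({"doc-42": b"satellite imagery"})])
assert csm.retrieve("doc-42") == b"satellite imagery"
```

A real tape or optical tier would satisfy the same interface, which is what lets the layer deliver content "anywhere it might be needed" without exposing the storage details.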

Originally designed for managing big-data, high-value assets in the media and entertainment industry, CSM systems help government organizations cope with what would otherwise be an overwhelming volume of data, address the specific complexities of that data, and facilitate smooth integration with existing operations.

A feature-rich CSM system not only enables efficient, sophisticated workflows, but also is agile enough to cope with rapid change and sudden fluctuations in demand for capacity. Cloud-based CSM takes advantage of the cloud’s unlimited storage space and computing power.

Cloud CSM can provide all the features of a physical CSM system without the infrastructure investment and overhead costs, which is almost always a consideration for publicly funded government agencies. Several federal agencies have begun moving their data into the cloud so that the different organizations can communicate with one another and actively share information.

In the intelligence community, where there are 16 separate agencies, this initiative is called ICITE (Intelligence Community IT Enterprise). The Department of Defense has a similar initiative called Joint Information Environment, which attempts to bring together data for all branches of the U.S. military, the Coast Guard, the National Guard, and more.

Though these initiatives are well underway, the ongoing challenge when sharing data from the various agencies is that each agency has a different data structure and way of tagging the data. What’s important to one agency is not necessarily important to another. Also, the taxonomy and nomenclature can vary. One agency might call something X, while another calls it Y, and the two agencies have no way of knowing when they’re comparing the same things—or when they’re not.
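The taxonomy mismatch described above amounts to a crosswalk problem: before two agencies can compare records, their field names must map to a shared canonical vocabulary. A minimal sketch, with field names invented purely for illustration:

```python
# Hypothetical crosswalks: each agency's tag -> a shared canonical tag.
CROSSWALK = {
    "agency_a": {"subj": "subject", "geo": "location"},
    "agency_b": {"topic": "subject", "region": "location"},
}

def normalize(agency: str, record: dict) -> dict:
    """Rewrite an agency-specific record into canonical tags,
    passing through any tags the crosswalk does not cover."""
    mapping = CROSSWALK[agency]
    return {mapping.get(key, key): value for key, value in record.items()}

# Two agencies tag the same thing differently ("X" vs. "Y")...
a = normalize("agency_a", {"subj": "ports", "geo": "Gulf Coast"})
b = normalize("agency_b", {"topic": "ports", "region": "Gulf Coast"})

# ...but after normalization they are comparing like with like.
assert a == b
```

Without such a mapping, identical records simply fail to match, which is why the article calls this one of the big sticking points with government clouds.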

And therein lies one of the big sticking points with government clouds.

To make the system work to its full potential, agencies need a reliable way to manage and compare their sprawling labyrinth of metadata. New technology is available that does just that.


Designed to derive maximum value from metadata, the new technology automatically extracts and indexes textual and machine-generated metadata, wraps multiple data sets into AXF objects, and creates file-relevant data tags that enable rapid, intelligent retrieval of files archived on tape. It integrates seamlessly into an agency’s cloud-based workflow, operating transparently behind the scenes to harness and manage metadata in four distinct phases (see Figure 2):

Figure 2

Capture: Centralizes storage of all files in one secure location.

Process: Extracts all possible information from saved files and makes any type of source data available to every authorized person involved in the workflow.

Manage: Captures and organizes unstructured data. The user interface and data processing can be customized with widgets and small, low-cost plug-ins to control core data.

Retrieve: Searches and restores any original corresponding artifact from stored AXF objects, regardless of location or original tool used in creation.
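The four phases above can be sketched as a simple pipeline. Everything here is illustrative: the `AXFObject` container, the phase functions, and the tag names are invented stand-ins, not the product's actual API (AXF itself is a standardized archive container format):

```python
from dataclasses import dataclass, field

@dataclass
class AXFObject:
    """Stand-in for an AXF wrapper: a payload plus its extracted metadata."""
    content_id: str
    payload: bytes
    tags: dict = field(default_factory=dict)

ARCHIVE = {}  # Capture phase: one central, secure store

def capture(content_id: str, payload: bytes) -> AXFObject:
    """Capture: wrap the file and place it in the central archive."""
    obj = AXFObject(content_id, payload)
    ARCHIVE[content_id] = obj
    return obj

def process(obj: AXFObject) -> None:
    """Process: extract basic textual/machine-generated metadata."""
    obj.tags["size"] = len(obj.payload)
    obj.tags["preview"] = obj.payload[:16].decode("utf-8", "replace")

def manage(obj: AXFObject, **extra_tags) -> None:
    """Manage: organize content with additional structured tags."""
    obj.tags.update(extra_tags)

def retrieve(**query) -> list:
    """Retrieve: find archived objects whose tags match every query term."""
    return [o for o in ARCHIVE.values()
            if all(o.tags.get(k) == v for k, v in query.items())]

# Walk one file through all four phases.
obj = capture("rpt-7", b"coastal survey 2014")
process(obj)
manage(obj, classification="unclassified")
assert retrieve(classification="unclassified") == [obj]
```

The point of the sketch is the division of labor: capture centralizes, process extracts, manage organizes, and retrieve searches on the accumulated tags rather than on storage location.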

With this new technology, each agency can work with whatever tools and processes it prefers, while ensuring that authorized users government-wide can leverage that critical data. Central storage and accessibility mean users get direct access to the metadata that comes with the files, reducing both data movement and transcription errors.

Agencies that adopt this new approach to metadata management can greatly improve their ability to share information with other agencies, and at the same time:

• Radically reduce the time it takes to find information.

• Future-proof the metadata.

• Eliminate multiple copies of big files.

• Ensure standards and structure will evolve to align with technological innovation and changes in workflows.

When it comes to the inefficiency and complexities involved with sharing information among government agencies, the cloud can be a big part of the answer. It provides a centralized repository where agencies can collect information from various sources, tag metadata, and access data from each other.

Once that cloud is in place, a cohesive metadata management system makes it easy to capture, process, manage and retrieve critical metadata—to ensure nothing gets lost in translation.

Phil Jackson is the chief strategy officer for Front Porch Digital in Lafayette, Colo. He can be reached at