By Barbara DePompa, 1105 Government Information Group Custom Media
While storage management can sometimes be difficult to define, the ultimate goal for many federal IT managers is to reduce the cost and amount of manual labor involved in managing storage, backups and archives.
Here are a few tips from industry experts to help optimize storage:

1) Employ technology to automate processes and lower costs.

Federal IT organizations can see substantial returns on investments in tools that assess under-utilized storage subsystems.
The primary benefit is identifying available but unused capacity, which can dramatically reduce the need for storage subsystem upgrades as requirements grow. Such tools can also root out configuration problems within SANs, freeing still more capacity. Large organizations with storage arrays tucked into every corner of the data center will likely see the most benefit from these tools. Data deduplication can shrink the stored footprint by eliminating redundant copies, while policy-based lifecycle tools help administrators organize stored data, move it to lower storage tiers as it ages, prevent premature deletion, and securely erase data once its retention period expires.

2) Use audit trails and access controls.
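The age-based tiering and retention workflow described above can be sketched in a few lines of Python. The tier names, age thresholds and seven-year retention period here are hypothetical examples chosen for illustration, not recommendations:

```python
from datetime import datetime, timedelta

# Hypothetical age thresholds for moving data to lower storage tiers
TIER_RULES = [
    (timedelta(days=30), "primary"),    # accessed within the last 30 days
    (timedelta(days=365), "nearline"),  # under a year old
]
ARCHIVE_TIER = "archive"                # everything older, until retention expires
RETENTION = timedelta(days=7 * 365)     # example seven-year retention period

def place_record(last_accessed, now):
    """Return the tier a record belongs on, or 'expired' once it is
    eligible for secure deletion under the retention policy."""
    age = now - last_accessed
    if age > RETENTION:
        return "expired"
    for limit, tier in TIER_RULES:
        if age <= limit:
            return tier
    return ARCHIVE_TIER
```

A real policy engine would also track legal holds and data owners, but the core decision is this kind of age-and-retention lookup.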
Compliance requires regulated access: ensuring that only authorized personnel can access files and that changes to data are closely tracked. When evaluating a compliance tool, look for audit and access features that prevent unauthorized changes or deletions. Even the activities of authorized users should be closely monitored and recorded, creating an activity trail that auditors and lawyers can follow to verify compliance. It also helps to run periodic mock audits and use compliance tools to address simulated discovery requests; such exercises prepare employees for real audits and can reveal weaknesses in tools or procedures that need updating.

3) Data classification requires diligent attention.
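One way an activity trail can be made tamper-evident, so that auditors can trust what they follow, is to chain each entry to the previous one by hash. This is a minimal sketch of that idea, not any particular product's mechanism; the class and field names are hypothetical:

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditLog:
    """Minimal append-only audit trail: each entry records the hash of the
    chain so far, so altering any earlier entry is detectable."""

    def __init__(self):
        self.entries = []
        self._last_hash = "0" * 64  # genesis value for an empty chain

    def record(self, user, action, resource):
        entry = {
            "ts": datetime.now(timezone.utc).isoformat(),
            "user": user,
            "action": action,       # e.g. "read", "modify", "delete"
            "resource": resource,
            "prev": self._last_hash,
        }
        self._last_hash = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.entries.append(entry)

    def verify(self):
        """Recompute the hash chain; returns False if any entry was altered."""
        prev = "0" * 64
        for e in self.entries:
            if e["prev"] != prev:
                return False
            prev = hashlib.sha256(
                json.dumps(e, sort_keys=True).encode()).hexdigest()
        return prev == self._last_hash
```

Production audit systems typically add signed timestamps and write-once storage on top of this basic chaining.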
Storage administrators in large federal organizations have learned that without stronger management of the entire backup process, it is impossible to tame data growth, regardless of the data reduction or protection tools employed. Administrators must apply a classification system, separating structured from unstructured data based on the applications being backed up. Identifying data owners throughout an agency is also necessary to assess storage performance requirements and determine how long specific data must be retained.

4) Investigate storage-as-a-service.
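In practice, classification like this often starts as a simple catalog mapping each backed-up application to a data kind, an owner and a retention period. The applications, owners and retention values below are invented placeholders; a real catalog would come from an agency's records schedule:

```python
# Hypothetical catalog of backed-up applications (illustrative values only)
APP_CATALOG = {
    "payroll_db":  {"kind": "structured",   "owner": "HR",      "retain_years": 7},
    "email":       {"kind": "unstructured", "owner": "Records", "retain_years": 3},
    "file_shares": {"kind": "unstructured", "owner": "IT",      "retain_years": 1},
}

def backup_plan(apps):
    """Group applications by data kind so each group can receive its own
    backup policy; unclassified applications are rejected outright."""
    plan = {"structured": [], "unstructured": []}
    for app in apps:
        meta = APP_CATALOG.get(app)
        if meta is None:
            raise KeyError(f"unclassified application: {app}")
        plan[meta["kind"]].append((app, meta["owner"], meta["retain_years"]))
    return plan
```

Failing fast on unclassified applications mirrors the point above: classification has to happen before backup policy can be set.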
This alternative is currently offered by suppliers such as the Defense Information Systems Agency (DISA) as a way to more accurately account for, and charge for, the storage and computing resources used by different parts of an agency. Some agencies also plan to adopt price lists and service-level agreements similar to the commercial cloud services provided by industry suppliers. Government Computer News recently reported that in phase I of DISA's Rapid Access Computing Environment (RACE), in which agency customers purchase access to a full computing environment on a month-to-month basis at a base price of $500 per server, additional storage capacity has been among the most requested features, along with a standardized backup service.

5) For low-priority storage, consider the cloud.
While cloud storage is widely touted as the next big thing, most industry observers caution against making large investments quickly, citing security, availability and data-location concerns that require further vetting. Still, cloud storage makes sense now for federal IT organizations in a few areas, such as archiving non-critical data and serving end users who spend most of their time out of the office. Organizations that have run out of data center space should also consider cloud storage for lower-priority data.