
Guide to Storage Virtualization

Seven things IT personnel should know about storage

Avoid performance problems with these new tricks

By Beth Schultz, Network World, 06/06/07

Thanks to virtualization and a host of other technologies, storage has left its silo. Its performance affects the whole computing shebang. Fortunately, new technologies that cross the boundaries of storage, management and compliance are smoothing over performance issues and easing the pain (and expense).

But you've got to be in the know to make use of them. Here are seven storage truths that every IT person should understand.

Optimizing storage isn't about buying new stuff, says Mark Diamond, CEO at storage-consulting firm Contoural. It's about determining whether the data you've created is stored in the right place. This discussion goes beyond the basic concept of using inexpensive disk to store data and delves into how the disk is configured, especially when it comes to replication and mirroring.

"We typically see that 60% of the data is overprotected and overspent, while 10% of the data is underprotected -- and therefore not in compliance with SLAs [service-level agreements]," Diamond says. "Often, we can dramatically change the cost structure of how customers store data and their SLAs, using the same disk but just configuring it differently for each class of data."
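The idea of configuring the same disk differently for each class of data can be sketched in a few lines. This is an illustrative example only, not Contoural's methodology; the class names and policy fields are hypothetical.

```python
# Hypothetical per-class protection policies: each data class gets only
# the replication and mirroring its SLA actually requires, instead of
# blanket overprotection.
POLICIES = {
    "business-critical": {"replicas": 2, "mirrored": True,  "tier": "fast"},
    "operational":       {"replicas": 1, "mirrored": False, "tier": "standard"},
    "archival":          {"replicas": 1, "mirrored": False, "tier": "capacity"},
}

def protection_for(data_class: str) -> dict:
    """Return the storage configuration for a data class, defaulting to
    the most conservative policy when the class is unknown."""
    return POLICIES.get(data_class, POLICIES["business-critical"])
```

The fallback to the most protective policy reflects the compliance concern above: it is cheaper to overprotect unclassified data temporarily than to discover, in an audit, that it was underprotected.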

Read the full story here >>

Users who get great performance out of their storage-area networks (SANs) have discovered application-centered monitoring for storage performance.

For instance, the Affinion Group is testing a combination of Onaro's Application Insight and SANscreen Foundation monitoring tools. "We could be alerted in real time of any performance spikes and hopefully be informed of any issues that could cause an outage, before someone calls from the business line," says storage specialist Raul Robledo. "We wouldn't need to get inquiries or notification from individuals. We would be getting those right from a product that's monitoring our environment."
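The real-time spike alerting Robledo describes boils down to watching a metric against a threshold. A minimal sketch of that idea, assuming a rolling-average latency check; the metric names and thresholds are hypothetical, not Onaro's API:

```python
from collections import deque

class LatencyMonitor:
    """Flag an I/O latency spike before users on the business side
    notice an outage, using a rolling average over recent samples."""

    def __init__(self, threshold_ms: float, window: int = 5):
        self.threshold_ms = threshold_ms
        self.samples = deque(maxlen=window)

    def record(self, latency_ms: float) -> bool:
        """Record one latency sample; return True (alert) when the
        rolling average exceeds the threshold."""
        self.samples.append(latency_ms)
        return sum(self.samples) / len(self.samples) > self.threshold_ms

mon = LatencyMonitor(threshold_ms=20.0, window=3)
alerts = [mon.record(ms) for ms in (5, 8, 10, 40, 60)]  # spike at the end
```

Averaging over a window rather than alerting on single samples is what keeps one noisy reading from paging the storage team at 3 a.m.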

A host of other products have entered the category of storage optimization, too.

Read the full story here >>

Storage isn't the biggest energy hog in the data center, but new technologies can still help cut back on its power consumption by as much as 20%, users say. Even using storage space more efficiently can cut down on wasted capacity, experts say. This means spending less on storage in the long run.

At San Diego Supercomputer Center, Don Thorp, manager of operations, looked to Copan Systems, one of a handful of relatively new, smaller green storage vendors. He reports that storage consumption is down by 10% to 20% since switching to Copan Systems last July.

Many more such vendors are entering the market.

Read the full story here >>

Over the last several years, numerous vendors have taken backups from boring to remarkable by rolling out fancy backup-management tools.

Spun off from the broader storage-resource management market, these tools, of course, monitor and report on backups of products from multiple vendors. But they also give IT administrators an at-a-glance picture from a single console, in real time and historically. They can ease the auditing process, help create chargeback programs and verify internal service-level agreements for backups.

Heterogeneous backup-management tools are available from various niche vendors and the mainstream storage biggies.

Read the full story here >>

Just ask the University of Florida College of Veterinary Medicine (UFCVM). Over the last six months, the college has been putting its 7TB storage-area network through its paces, using it for nearline backup and primary storage.

UFCVM relies on Storage Virtualization Manager (SVM), a virtualization appliance from StoreAge Networking Technologies, now owned by LSI. The SAN setup reduced backup times by half, and the project came in under budget, says Sommer Sharp, systems programmer for the college in Gainesville, Fla.

Provisioning is a painless matter of moving volumes to any server that needs them, so live data can be managed as easily as backups.

Read the full story here >>

Recent surveys show that, on average, U.S. companies face 305 lawsuits at any one time. With each lawsuit comes the obligation for discovery -- production of evidence for presentation to the other side in a legal dispute. With 95% of all business communications created and stored electronically, that puts a heavy burden on IT to perform e-discovery, finding electronically stored information.
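The core e-discovery task described here, finding responsive electronically stored information, amounts to filtering documents by keyword and discovery window. A hedged sketch under that assumption; the field names and corpus are illustrative, not from any specific e-discovery product:

```python
from datetime import date

def responsive(docs, keyword, start, end):
    """Return documents whose text mentions the keyword and whose
    creation date falls inside the discovery window."""
    return [d for d in docs
            if keyword.lower() in d["text"].lower()
            and start <= d["created"] <= end]

corpus = [
    {"text": "Q3 merger terms attached", "created": date(2006, 9, 14)},
    {"text": "Lunch on Friday?",         "created": date(2006, 9, 15)},
]
hits = responsive(corpus, "merger", date(2006, 1, 1), date(2006, 12, 31))
```

Real e-discovery tooling layers custodian tracking, legal holds and deduplication on top of this, but the burden on IT starts with exactly this kind of search across everything the business has stored.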

In the U.S. court system, the onus of e-discovery took on new weight on Dec. 1, 2006, when amendments to the Federal Rules of Civil Procedure (FRCP) took effect. "With the amendments to the FRCP, the courts are saying, 'We know the technology exists to do this stuff. We want to see you take some reasonable steps to put processes and technologies together to do e-discovery. And if you don't, we're really going to hold you accountable for it,'" says Barry Murphy, principal analyst at Forrester Research.

He cites the recent case of Morgan Stanley vs. Ronald Perelman, in which Morgan Stanley was hit with a $1.57 billion jury verdict that hinged primarily on the company's lax e-discovery procedures.

Read the full story here >>

The Open Grid Forum, a standards organization focused on grid computing, is working on a variety of standards for the compute, network and storage infrastructure, all the way from describing jobs to being able to move and manage data, says Mark Linesch, who heads the organization.

Work is progressing around defining a grid file system and naming schemes, and developing a storage resource manager for grids. The group is collaborating with other standards bodies like the Distributed Management Task Force and the Storage Networking Industry Association.

The ultimate goal is to enable proprietary storage vendors to make their gear interoperable.

Read the full story here >>
