As corporate data accumulates at an exponential rate, the work of millions of employees, produced on computers keystroke by keystroke, has created a virtual monster of a problem. The growing heap of computer files, which includes companies’ most sensitive data, leads to a daunting question that can no longer be ignored: Where in the world can enterprises store data when ordinary hard drives and conventional backup devices start bursting at the seams?
Fortunately, a solution is out there. IT departments are implementing what are known as SANs, or Storage Area Networks. Here is why you should care about the mysterious SAN and the secrets it silently holds: With networked storage, a user anywhere in an enterprise — armed with the proper level of access, of course — can search for any piece of data stored within the enterprise.
Although SANs are responsible for storing and safeguarding companies’ deep, dark secrets, they also can hold data as mundane as users’ desktop icon preferences. But there is another fascinating aspect of SAN technology — it interacts with each workstation within a company, in a sense “vacuuming out” files to a central storage area.
This functionality can lead to unwelcome or embarrassing situations if not carefully configured. Nonetheless, a SAN can serve as the file system super-backbone for the enterprise, and understanding its technology will serve you well, whether you are in charge of purchasing a SAN or are an end-user whose files are being stored on a SAN.
Pulling It All Together
The hallmark of top-notch SANs is that many technologies are strung together to create a formidable network with the sole purpose of serving as a secure backup and storage area for the company’s main network files. SANs are made up of high-performance servers; backup tapes; CD-ROMs; huge, lightning-fast hard drives; and specialized software to automate and monitor the data backup and storage process. The requirement to pull together all these different technologies accounts for the cost of SANs (which can be enormous) and differentiates various SAN vendors.
According to Giga Information Group analyst Rob Enderle, the big players in the SAN space right now are EMC, Hitachi and IBM. However, because a SAN is so complex and involves so many different components, Aberdeen Group analyst David Hill told the E-Commerce Times that even those three giants may purchase some of the components required to build their systems from such companies as switch manufacturer Brocade Communications Systems and McData, a maker of data traffic directors.
Communication, SAN Style
A good Storage Area Network must be able to connect to as many different technical devices as possible. This includes being able to communicate smoothly with desktops, servers, laptops, CD-ROMs and other devices, so that no matter where data is created originally, it can be backed up by the SAN. Of course, the SAN not only must be able to receive data from a wide variety of clients and devices, but also must be able to send data back to the same variety of machines.
Even within a SAN, a heterogeneous environment with multiple products from multiple vendors may exist — posing a tough challenge for network administrators. Earlier this year, to promote interoperability, IBM and Hewlett-Packard announced they would share application programming interfaces and work to develop open standards for data storage technology. The agreement was designed to let HP’s OpenView software manage IBM’s TotalStorage server and, conversely, to enable IBM’s software to manage HP’s systems.
Safeguarding Corporate Data
One of the biggest challenges for a SAN is to make data quickly available to the “right” people, while keeping it safely locked away from others. According to Meta Group program director Phil Goodwin, SAN security is often overlooked because of the limited nature of early deployments, and because of a false belief that stored data is completely protected behind firewalls.
Goodwin told the E-Commerce Times that programs from Brocade and McData can supplement existing authentication processes and are particularly helpful in preventing clever outsiders from wandering where they should not. “The idea is to stop someone who is trying to spoof a server to get at stored data,” he said.
He added that a number of technologies, generally known as fabric security products, also have been designed to address potential vulnerabilities in storage networks. While they boost security, they also require companies to make tradeoffs: Data is more securely protected, but the overall system becomes less interoperable with those of other enterprises. According to Goodwin, technology firms are working on ways to maintain interoperability while improving security.
Apart from determining proper access to data, a big part of a SAN’s functionality involves what is affectionately known as system redundancy. If there is a problem with the network and part of it is down, the SAN should have alternate backup copies of files, so that no matter what, there is always a copy of the data available for retrieval.
Toward that goal, SANs use a professional-grade hard drive technology known as RAID (redundant array of inexpensive disks): a group of extremely fast hard drives arranged so that the same data is written redundantly across multiple disks. These drives are hot-swappable, meaning a failed disk can be removed and replaced at a moment’s notice while the server runs without skipping a beat — no need to shut down the entire system.
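The mirroring idea behind RAID can be sketched in a few lines. This is only a toy in-memory illustration of redundant writes and hot-swapping — real RAID operates at the block-device level in hardware or the operating system:

```python
# Toy sketch of RAID 1-style mirroring: every write lands on all disks,
# any surviving disk can serve reads, and a failed disk can be rebuilt
# from a mirror while the array stays online.
class MirroredArray:
    def __init__(self, num_disks=2):
        self.disks = [dict() for _ in range(num_disks)]  # block number -> data

    def write(self, block, data):
        # Each disk receives a full copy of the data.
        for disk in self.disks:
            if disk is not None:
                disk[block] = data

    def read(self, block):
        # Any surviving disk can satisfy the read.
        for disk in self.disks:
            if disk is not None and block in disk:
                return disk[block]
        raise IOError("data lost on all disks")

    def hot_swap(self, failed_index):
        # Swap in a fresh disk and rebuild it from a surviving mirror,
        # without taking the array offline.
        survivor = next(d for i, d in enumerate(self.disks)
                        if i != failed_index and d is not None)
        self.disks[failed_index] = dict(survivor)

array = MirroredArray()
array.write(0, b"quarterly-report")
array.disks[1] = None            # simulate a drive failure
print(array.read(0))             # the surviving mirror still has the data
array.hot_swap(1)                # replace and rebuild, no downtime
```

The class and method names here are invented for illustration; the point is simply that redundancy and hot-swapping together keep data available through a single-disk failure.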
Costs Add Up
With their myriad hardware and software requirements, SANs can be so complex that companies may consider purchasing consulting services from SAN vendors. Even seasoned IT professionals can benefit from advice and assistance provided by specialists whose entire focus is on making sure that the components of a specific SAN system interact seamlessly. But all of this can send costs skyrocketing. To help companies pay for a SAN system and the necessary consulting services, vendors like IBM even offer financing options.
For enterprises that cannot afford SANs, other network-attached storage options exist at lower costs. For example, 97 cents per megabyte is the going rate for Quantum’s Snap Server 2200, a dual-drive 160 GB storage disk. Dell offers the 480 GB 715N model at 93 cents per megabyte. Hewlett-Packard’s NAS S1000 starts at 85 cents per megabyte, and IBM — a late entrant into the low-cost NAS arena — sells the 480 GB TotalStorage NAS 100 at 92 cents per megabyte.
The Bottom Line
The bottom line is that a SAN is an increasingly critical aspect of enterprise operations. “IDC estimates that one-third of disk storage last year was deployed in a networked environment and projects that two-thirds of disks will be networked by 2005,” Michael Zisman, general manager of storage software for the IBM Storage Systems Group, told the E-Commerce Times.
Simply stated, if your company produces data that is worth keeping, then a system must be put in place to accomplish data storage tasks. But SANs can be very costly and complex. As with any other infrastructure purchase, a company should determine objectively what its needs are, and how much of a good thing is simply too much. One good rule of thumb: Determine the potential cost to the business of lost data, and then protect against that risk.
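That rule of thumb reduces to simple expected-loss arithmetic. The figures below are purely hypothetical, chosen only to illustrate how the comparison works:

```python
# Hypothetical expected-loss comparison (all numbers are assumptions,
# not from the article): protect against a risk only up to what the
# risk is actually worth.
annual_loss_probability = 0.05        # assumed chance of a serious data-loss event per year
cost_of_lost_data = 2_000_000         # assumed business impact in dollars

expected_annual_loss = annual_loss_probability * cost_of_lost_data  # $100,000

san_annual_cost = 250_000             # assumed SAN cost, amortized per year

# In this hypothetical case the SAN costs more per year than the expected
# loss it prevents, so a cheaper storage option may be the better fit.
print(expected_annual_loss, san_annual_cost)
```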
Math tends to get challenging when storage capacities are growing from megabytes to terabytes and prices are plunging. The prices in this article, by my calculation, are off by two orders of magnitude. These NAS servers are priced under a penny a megabyte, not 90 cents or more as the story suggests. It was not all that long ago when a dollar a megabyte was a pretty good price.
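The arithmetic behind that correction is easy to reproduce, using the capacities quoted in the article and 1 GB = 1,024 MB:

```python
# Sanity check on the per-megabyte prices quoted for the Quantum Snap
# Server 2200 (160 GB), as flagged in the correction.
capacity_mb = 160 * 1024                   # 160 GB in megabytes

quoted_rate = 0.97                         # dollars per MB, as printed
implied_price = capacity_mb * quoted_rate  # roughly $159,000 for one box

corrected_rate = quoted_rate / 100         # two orders of magnitude lower,
plausible_price = capacity_mb * corrected_rate  # i.e. under a penny per MB

print(round(implied_price), round(plausible_price))
```

At 97 cents per megabyte, a 160 GB box would cost about $159,000; at just under a penny per megabyte it comes in around $1,600, which is consistent with the correction.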