The Future Enterprise, Part 4: Working Smarter
The purpose of this series is to talk about the future of enterprise computing and how advances in technology are moving us along a certain path to that future. But if we stop for a moment and look at the recent history of enterprise computing, it's clear that we've come a long way in a few short generations.
Looking forward again, it seems obvious that advancements in the ability to increase automation of the management of computing systems will drive the next generation of efficiency gains.
Today, even though corporations have sensors in all of the devices in the data center, IT administrators still have to sort through statistics tracking thousands of network, storage and computing devices and software programs to decide which resources to adjust or to diagnose problems. The data center of the future will be self-optimizing, relying on analytics to understand how its systems are performing and, when they're performing poorly, to figure out how to improve them.
IT directors of the future will need only to set goals and rules for operations based on their business priorities. With these goals and rules in place, machines and software programs will be monitored in an integrated manner, and the monitoring feedback will stream in real time into software tools that make adjustments automatically, based on the resources available, the requirements of the business applications, and the goals set by their human administrators.
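As a toy illustration of this goals-and-rules pattern -- all names and thresholds here are hypothetical, not any particular product's API -- a policy engine might compare live metrics against administrator-set target bands and decide adjustments on its own:

```python
# Hypothetical sketch of a goals-and-rules policy engine: the human
# administrator sets target bands; the software decides adjustments
# automatically from live metrics.

def decide_adjustment(metrics, rules):
    """Return scaling actions based on administrator-defined rules.

    metrics: observed values,  e.g. {"cpu_util": 0.92}
    rules:   target bands,     e.g. {"cpu_util": (0.3, 0.8)}
    """
    actions = []
    for name, (low, high) in rules.items():
        value = metrics.get(name)
        if value is None:
            continue  # no reading for this metric yet
        if value > high:
            actions.append((name, "scale_up"))
        elif value < low:
            actions.append((name, "scale_down"))
    return actions

# Example: CPU utilization is above its target band, so the engine scales up.
print(decide_adjustment({"cpu_util": 0.92}, {"cpu_util": (0.3, 0.8)}))
# [('cpu_util', 'scale_up')]
```

In a real data center the "actions" would feed an orchestration layer rather than a print statement, but the division of labor is the point: people set the goals, software does the sorting and adjusting.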
Intelligent Data Centers
This shift to intelligent data centers is an absolute necessity. The rapid adoption of virtualization technologies for servers, storage and networking means that each piece of equipment can have anywhere from dozens to even tens of thousands of discrete tasks running on it -- and shifting from one machine to another as needed. That's entirely too much activity to handle manually.
Also, today, IT directors buy much more computing, storage and networking capacity than they really need so that they're covered when there are peaks in demand. Using smarter management systems, they'll be able to use what they have more efficiently -- and put off additional purchases of equipment until they really need it.
Another key element is the management of data. These days, much run-the-business data is captured and stored in separate technology silos. Financial data is housed in accounting systems. Customer service information is in CRM systems, and orders are captured in transaction systems. This practice causes a duplication of effort and can create a lot of confusion, making it difficult to understand such basic things as who the customers are and what they want. Making things even more complex, all sorts of new sources of data have emerged that need to be reckoned with. Everything from sensors, cameras and the vast Internet itself creates and captures new data that needs to be sorted, analyzed, stored and dealt with.
To address these challenges, organizations must think of data in a whole new way. Just as all of the server computers and storage devices in an enterprise's data centers can be handled as a single large system, so can the data itself. This approach is called "virtualized data services" -- a technology platform for managing, integrating and analyzing information from both new and traditional data sources.
When a corporate IT department transforms data into data services, the business' employees have a much easier time getting their hands on the information they need when they need it. Workers can simply ask their software applications for specific data, or certain types of information, and it is delivered directly to their desktops -- simply and quickly. No one receiving and handling that data needs to know where it resides, and they can feel confident that it's accurate and up to date.
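A minimal sketch of what such a facade could look like -- the class and the registered sources here are invented for illustration, not a real platform's interface -- shows the key property: callers ask for data by logical name, never by physical location.

```python
# Hypothetical sketch of a "virtualized data services" facade: workers ask
# for data by name; the facade hides which silo actually holds it.

class DataServices:
    def __init__(self):
        self._sources = {}  # logical name -> fetch function

    def register(self, name, fetch):
        """Plug a backend (CRM, accounting system, sensor feed) in behind a name."""
        self._sources[name] = fetch

    def get(self, name, **query):
        # The caller never sees whether this hits a CRM system,
        # an accounting database, or a transaction system.
        return self._sources[name](**query)

# Two formerly separate silos, exposed behind one interface.
services = DataServices()
services.register("customers", lambda **q: [{"id": 1, "name": "Acme"}])
services.register("orders", lambda **q: [{"order": 17, "customer_id": 1}])

print(services.get("customers"))  # [{'id': 1, 'name': 'Acme'}]
```

Real platforms add caching, access control and query translation on top, but the design choice is the same: one integration point instead of one connection per silo.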
This ease of use, and confidence in the data's integrity, makes it possible for people to reap much more value from the data that companies gather and store at great cost. Using analytical tools for forecasting, predictive modeling and business optimization, businesses are putting their data to work.
By analyzing trends, predicting future activities based on past results and observing correlations between different activities, companies are using data to meet business goals and grow. Also, it becomes possible to capture and analyze massive streams of data gathered in real time.
By using powerful analytics to detect patterns in data in motion, an amazing array of advancements can be made across industries. Banks can better detect and prevent fraud in credit card transactions, hospitals can make sense of live streams of vital signs for patients in emergency rooms and critical care units, and city governments can reroute traffic to avoid messy pile-ups, to name just a few. All of these scenarios can be, and are being, accomplished through automated IT systems and improvements in the way we gather and digest the vast amount of data collected every day.
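To make the fraud example concrete, here is a deliberately simplified sketch of pattern detection on data in motion -- the window size and threshold factor are invented for illustration, and real systems use far richer models -- that flags a transaction when it far exceeds the rolling average of recent activity:

```python
# Hypothetical sketch of analytics on "data in motion": flag a transaction
# as suspicious when it far exceeds the rolling average of recent amounts.
from collections import deque

def fraud_flags(amounts, window=5, factor=3.0):
    """Yield (amount, flagged) pairs over a stream of transaction amounts."""
    recent = deque(maxlen=window)  # sliding window of recent transactions
    for amount in amounts:
        # Baseline is the average of recent amounts (or the amount itself
        # when the stream is just starting and no history exists yet).
        baseline = sum(recent) / len(recent) if recent else amount
        yield amount, amount > factor * baseline
        recent.append(amount)

# A stream of small purchases with one outlier in the middle.
stream = [20, 25, 22, 24, 300, 23]
print([amt for amt, flagged in fraud_flags(stream) if flagged])  # [300]
```

Because the check runs as each transaction arrives, the decision can be made before the charge completes -- the essential difference between analyzing data in motion and analyzing data at rest.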