Beyond Keeping the Lights On

Building the Foundation for Broad Use of Analytics

Doing the Most with the IT Budget

CIOs and senior IT leaders such as directors of global applications and enterprise architects constantly face the challenge of reducing technology costs while continuing to deliver at least as much value to the business. The IT organization must always be seeking opportunities to automate the infrastructure and processes used throughout the enterprise to improve efficiency. At the same time, IT must continually enhance the application environment to modernize business processes and put more useful tools and information in the hands of those who produce business results. In other words, to ensure that CIOs and IT leaders are serving the business’s needs and satisfying its demands – and, not incidentally, maintaining their reputations – they must always be searching for ways to be not just more efficient but more effective.


According to our recent benchmark research on IT Performance Management, most IT organizations (57%) find financial constraints are the most important factor influencing how well IT manages operations and resources. This situation requires IT to use its budget wisely, of course, but business conditions also force IT to constantly seek ways to minimize expenditures while delivering at least the same business value, and ideally more, from the IT infrastructure.

What’s required, in other words, is the application of business management practices to IT. We advise adopting an IT budget triage process in which the CIO tasks financial analysts with separating the budget used to keep IT operating and supporting business operations – the Keep The Lights On (KTLO) budget – from the discretionary budget used to improve business value. The KTLO portion of an IT budget can range from 50 to 80 percent of the annual IT allocation, so appropriate scrutiny may reveal where changes and investments in replacement tools and systems can reduce the annual cost of maintaining the infrastructure.

Minimizing the budget by making new investments may seem counter-intuitive to some, but examples of the long-term return on investment (ROI) realized through changes to the KTLO budget are becoming commonplace – and increasingly are a strategic focus of CIOs and IT management. Reducing the KTLO budget is critical because it releases already allocated budget dollars that then can be applied to increasing the business value IT delivers to the organization.

Where to start? Many IT organizations have begun by addressing the process of centralizing IT activities using shared services. They consolidate and standardize assets ranging from servers, storage devices, application servers and data warehouses to enterprise applications such as customer relationship management (CRM) and enterprise resource planning (ERP). This is a simple, straightforward approach that is widely applicable: Assess existing systems and tools to establish their maintenance, resource and other dependency costs, then determine potential savings from consolidating to one vendor. Such an assessment and subsequent negotiations with the vendor have saved organizations significant budget dollars.


Another promising area of focus for cost reduction and optimization is the organization’s data and information management operations. Data is at the core of every IT organization’s information management responsibility to the business; research shows that organizations use significant portions of their budgets, resources and time managing data-related activities. Moreover, the quality of the data used by IT and available to the business is of paramount importance to supporting business objectives; bad or even questionable reports can cost time, money and competitive advantage.

Information management thus is of strategic importance. The wise CIO, exploring how to use the KTLO budget more methodically, knows that establishing and maintaining the infrastructure underlying the organization’s applications and systems – and managing how data is integrated, synchronized and stored – can add up to a significant burden for midsize to very large IT organizations. Taking steps to improve data quality and data integration is key to gaining the cost reductions needed to shrink the percentage of budget spent on KTLO and to reallocate some of it to enhancing applications or the underlying environment to better support the needs of the business.

Optimization Opportunities

An approach to IT optimization that combines imposing money-saving efficiencies in KTLO tasks and systems with the addition of new capabilities that make business processes and people more effective can give IT a more strategic role and burnish its reputation among the lines of business and in the executive suite. Here are suggestions for three key areas in which IT can make such changes.

Standardize Data Archiving

Archiving is a key data-related activity that is essential to the operations and performance of the business’s applications, and it is required to satisfy regulatory rules about the storage of business data. However, application-specific data archiving and other application-supporting activities require not only resources but also time to verify that the operations have completed successfully and the data is accessible. Moreover, if the proper steps are not taken to ensure efficient compression and data storage, costs can be significant.

In the traditional model each application uses its own specialized methods to archive its data to disk and then to a backup medium for disaster recovery. These archiving processes often are inconsistent across applications, requiring specific resources to manage the interfaces and processes and enable access to and integration of the data. For IT organizations that support perhaps a dozen production applications, each for three to five years, costs can add up quickly: If just half of those production applications require customized interfaces, each consuming much of a full-time employee’s effort at a burdened labor cost of $200,000 per year, the total could add up to as much as $1 million annually. Moreover, the array of proprietary methods being used may not be as efficient and timely, and thus as cost-effective, as archiving managed through a centralized data integration technology platform.


Automate Data and Application Interfaces

Ensuring that data is readily available in a consistent, high-quality form to support the business is what information management is about. Unfortunately, in many instances of deploying an application – whether packaged or custom – to support a business process, IT organizations pay little attention to ensuring that the associated data will be available for use beyond the application. Today that availability is needed: Businesses increasingly are concluding that they must monitor their business applications’ processes and performance and compare them to information that is being reported by other systems. In short, there is a need to re-engineer the organization to be data-ready.

In such an environment, the business must apply data quality processes uniformly and consistently to ensure that all available data has been cleansed, augmented and matched efficiently and so is ready to integrate when necessary. The cycle times for data integration can vary, driven by need and business practices; many organizations integrate their core customer and product data across applications in real time or within a day, while other organizations that use this data for decision support may do the integration weekly. But whatever the integration cycle time, these data integration and quality tasks should be automated so they are done consistently and efficiently using a common set of resources, a step that also reduces the costs of these aspects of IT operations.

Consider, for example, an organization that relies on five core business applications: CRM, supply chain management (SCM), manufacturing, ERP with accounting, and human resources management. Two of the applications exist in multiple versions and instances, and each uses version-specific business rules that are reflected in the data. Because this organization uses point-to-point application data integration and data quality processing, IT must support as many as 25 interconnects, plus the integration of these five applications with the organization’s business intelligence (BI) and data warehouse systems. All told, this can mean as many as 30 individual processes for data integration and data quality. That’s a significant amount of customization, as each instance requires specialized IT resources.
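The combinatorics behind that count can be sketched. Exactly which pairs exchange data, and whether links run one way or both, determines the precise number, so the endpoint names and resulting totals below are illustrative rather than a reconstruction of the example:

```python
from itertools import combinations

# Five core applications, two of which run in an extra version or instance,
# yield seven integration endpoints (names are illustrative).
endpoints = ["CRM", "SCM", "MFG", "ERP", "HR", "CRM_v2", "ERP_v2"]

# In a point-to-point architecture, every pair is a potential interconnect:
pairwise_links = list(combinations(endpoints, 2))
bi_warehouse_feeds = 5  # one feed per core application into BI/data warehouse

total_processes = len(pairwise_links) + bi_warehouse_feeds
print(len(pairwise_links), total_processes)  # 21 pairwise links, 26 total
```

A hub-and-spoke integration platform, by contrast, needs only one connection per endpoint – seven here – which is the structural reason consolidating onto a common platform cuts the process count so sharply.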

Clearly, utilizing a common process and technology can ensure that the business’s data quality and integration needs are addressed efficiently and can reduce IT operating costs. Moreover, it likely will speed up the data integration and thus improve decision support compared to what is possible with spreadsheets.


Retire Outdated Applications

Every business has not only a present and a future but a past. With each step in an organization’s evolution, merger with another organization or change in its business technology landscape, retaining the history of its business transactions becomes more important, and more complex.

An application may be retired, but leaving it behind may not be easy, since its unique business logic provides the context for all the data it has generated. Typically this issue is addressed by maintaining the legacy application – and thus the viability of and access to its stored data – but that’s a costly commitment that delivers little value to the business. A better approach is to put in place systems that process and archive the data associated with any application being taken offline and ensure that the data can be accessed and used as needed in the future. These processes are collectively known as application-focused information life-cycle management (ILM), which can make accessing the data to support regulatory compliance and e-discovery, and reintegrating it into other systems, a simple, cost-effective undertaking. Embracing ILM for applications can help reduce the IT operations budget by eliminating maintenance costs for retired systems and the servers supporting them.

Consider an application used by 2,000 employees that is slated for retirement. The cost of maintaining it, including operations, resources and the supporting software and hardware, is $2,000 per user per year – an annual cost of $4 million, or roughly $333,000 per month. If the organization needs to retire one such system each quarter, the direct cost of continuing to support each one just to keep its data accessible can be high – and that’s before figuring in overhead and the cost of managing resources.
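That example reduces to simple arithmetic, worth capturing as a small helper so it can be rerun for each candidate application (the figures below are the ones from the example):

```python
def annual_legacy_cost(users: int, cost_per_user: float) -> float:
    """Direct annual cost of keeping a retired application's environment alive."""
    return users * cost_per_user

annual = annual_legacy_cost(2_000, 2_000)  # $4,000,000 per year
monthly = annual / 12                      # about $333,000 per month
print(f"${annual:,.0f}/year, ${monthly:,.0f}/month")
```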

This analysis is something any organization should do; all that’s required is an accurate cost per application per employee to determine the money and resources that can be saved each month by acquiring technology that enables full retirement of the legacy system while ensuring continued access to the data, maximizing its value for the business.


Delivering New Value

Using these and other strategies an IT organization can reduce its KTLO budget, often enough to free up funds to purchase tools that in turn can further refine and streamline processes in information management. For example, master data management (MDM) can help IT consolidate data in different formats from disparate systems across the enterprise and produce reliable data easily accessible to all authorized users. MDM can provide consistent data about customers, products, partners, suppliers and employees for users across the lines of business. At the same time MDM gives IT a single tool to replace the separate data quality utilities that come with the many applications it must support – thus helping to keep the lights on in a cost-effective, automated manner.

Another such tool is data virtualization, which provides an abstraction layer that hides and handles the complexity of making many different data sources look like one. Virtualization enables IT to provide reliable information to users or, in some cases, lets them get it for themselves. It also allows IT to make changes to the application environment without disrupting the rest of the integration architecture, and it reduces the amount of rework when business requirements change slightly.

Similarly, data migration can speed retirement of outdated applications while assuring that the data in them remains available in formats that today’s applications can easily consume. This enables business users to keep working with data they need without adjustment and helps IT modernize the application environment without disrupting business operations.

As organizations generate larger and larger amounts of data of many kinds from more sources than ever, applications that must use these large data volumes suffer performance slowdowns. This situation creates friction with business users and can leave IT unable to honor its service-level agreements. New tools can process and analyze this so-called big data at speeds previously unattainable. Hadoop, for example, is a large-scale processing technology that, being open source, is affordable for IT departments to adopt; it can become part of the application infrastructure without intruding into line-of-business operations.

The constant pressure to reduce costs (and often head count) forces CIOs and IT leaders to look for alternatives to the traditional deployment model of installing applications on-premises, paying license fees and maintaining systems with their own employees. Cloud computing operates on a rental model and does not consume significant amounts of on-site resources to manage software. While at first this adds a layer of complexity by storing data in the cloud while established systems remain on-premises, mature integration services are available to enable IT to take advantage of cloud-derived economies in managing its budget while offering new capabilities and modes of access that users want.

How to Get Started


Enabling your IT organization to reduce the KTLO budget while providing maximum support to business operations requires two things: an IT operations assessment program and a financial analysis team that can calculate and document the potential savings from process and systems improvements. To start, inventory your business applications individually to establish their maintenance costs, the resources used to support each of them and which departments and members of your workforce use them. Be sure that your inventory includes all details of IT activity costs, particularly how long it takes to extract and copy data for each category of business use, to do data quality cleansing, matching, monitoring and auditing, and to store and archive the data.

Having the cost and utilization figures makes it possible to calculate potential savings for each cost component on a quarterly and annual basis; project these figures out a minimum of three to five years. You then can attribute these costs (and savings) to the business units that use the resources. This is your KTLO baseline – the information needed both to determine where savings in your KTLO budget can be realized and to examine alternatives.
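A minimal sketch of such a baseline, with invented cost components standing in for the inventory data your own assessment would supply:

```python
# Hypothetical per-application KTLO cost components (all figures illustrative).
apps = {
    "CRM": {"maintenance": 400_000, "support_staff": 250_000, "storage": 80_000},
    "ERP": {"maintenance": 900_000, "support_staff": 500_000, "storage": 150_000},
}

def annual_ktlo(cost_components: dict) -> int:
    """Sum one application's KTLO cost components for the year."""
    return sum(cost_components.values())

baseline = {name: annual_ktlo(costs) for name, costs in apps.items()}

def cumulative_savings(annual_saving: float, years: int = 5) -> list:
    """Project a recurring annual saving out over a multi-year horizon."""
    return [annual_saving * year for year in range(1, years + 1)]

print(baseline)
print(cumulative_savings(150_000))  # if one line item is eliminated
```

Attributing each application’s baseline back to the business units that use it then yields the per-unit view of costs and savings the text describes.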

For a look at how this could work, consider the real example of a $2 billion electronics manufacturer that has a dozen production applications from four major technology vendors supporting customer relationships, ERP and supply chain management. The applications are from Oracle/JD Edwards and SAP, and the data warehouse and business intelligence systems are from IBM. The company also has three legacy systems, including a mainframe running applications that support production planning and shipping management, both of them manually integrated with the ERP and supply chain applications and the data warehouse.

Looking to reduce his KTLO budget, the CIO concluded that data management and processing for the applications and the data warehouse were candidates for cost reduction. After performing an inventory and cost assessment, the CIO found considerable savings opportunities. One was to implement application-to-application data integration between the Oracle and SAP systems. Others included integrating data from the mainframe and the data warehouse, retiring the JD Edwards applications and standardizing data archiving across all systems. By seizing these opportunities, the CIO saved more than $15 million in the first year alone.

This is just one example of how addressing such opportunities can save your organization a significant amount of its KTLO budget – an amount that can range from hundreds of thousands of dollars to tens of millions – while maximizing IT’s ability to support its business users. Using technology to automate data-related infrastructure activities is a core component of an optimized KTLO budget; merely standardizing processes without putting in place a common technology would be counterproductive, since each process then must be maintained using costly specialized resources (which is risky in itself, since those individuals could leave). With a common data technology, an organization can take an integrated approach to managing its data infrastructure, concentrating resources and reducing the time required; lacking it, the organization inevitably will continue to duplicate efforts, performing common activities repeatedly with custom interfaces and application-specific tools. As has been the case when organizations have streamlined their data center operations, streamlining data processing can deliver valuable savings and so can improve the KTLO budget even more rapidly.

The KTLO cost reduction team you put together will require application and data management knowledge as well as the ability to perform financial cost analyses. In the case of legacy systems these analyses can be challenging because the systems may be used across business processes and thus have complex shared costs. In many instances the initial analyses can be done in a spreadsheet; however, we recommend using this opportunity to improve your IT financial processes by adopting more sophisticated modeling and analytic technologies such as those the business uses for financial planning and forecasting.

In today’s economic environment, it’s important to use every available method to reduce unnecessary IT costs so IT can provide the highest level of support for the business. “Keeping the lights on” is one proven area where almost all companies can find savings – sometimes significant ones. Realizing these savings does not have to be difficult. It’s a straightforward process of examining how your organization uses technology tools and applying basic financial analysis to develop a prioritized list of opportunities. Because you will need less to keep the lights on, there can be more potential to build the business, which is what IT is supposed to do.

Judiciously using savings to adopt other technology tools that enable you to extend and transform your information infrastructure can create a cycle in which you both keep the lights on more efficiently and serve the needs of business to greater effect. In the end, your IT budget will be aligned better with your company’s strategic objectives and your department will be viewed as less of a drag on and more of a contributor to your company’s success.

About Ventana Research

Ventana Research is the most authoritative and respected benchmark business technology research and advisory services firm. We provide insight and expert guidance on mainstream and disruptive technologies through a unique set of research-based offerings including benchmark research and technology evaluation assessments, education workshops and our research and advisory services, Ventana On-Demand. Our unparalleled understanding of the role of technology in optimizing business processes and performance and our best practices guidance are rooted in our rigorous research-based benchmarking of people, processes, information and technology across business and IT functions in every industry. This benchmark research plus our market coverage and in-depth knowledge of hundreds of technology providers means we can deliver education and expertise to our clients to increase the value they derive from technology investments while reducing time, cost and risk.

Ventana Research provides the most comprehensive analyst and research coverage in the industry; business and IT professionals worldwide are members of our community and benefit from Ventana Research’s insights, as do highly regarded media and association partners around the globe. Our views and analyses are distributed daily through blogs and social media channels including Twitter, Facebook, and LinkedIn.

To learn how Ventana Research advances the maturity of organizations’ use of information and technology through benchmark research, education and advisory services, visit