Ventana Research Analyst Perspectives provide unique fact-based insights and education on business, industry and technology vendor trends. Each Analyst Perspective presents the voice of the analyst, typically a practice leader and established subject matter expert, reporting on new developments, the findings of benchmark research, market shifts and best practice insights. Each Analyst Perspective is prepared in accordance with Ventana Research’s strict standards for accuracy and objectivity and reviewed to ensure it delivers reliable, actionable news and insights.

The Strategic Tax Department is a Priority for Longview

Longview’s recent Dialog user group meeting highlighted the company’s continued commitment to providing much-needed automation tools for improving tax department performance – tools that enable the tax function to play a more strategic role in the management of a company. The sessions also covered the capabilities contained in the company’s latest release, Longview 7.2 Update 2, and gave customers a detailed product evolution roadmap following its merger with arcplan.

Using a dedicated tax application with a dedicated tax data store to handle direct (income) taxes has three main benefits. First, it enables a company to manage its tax exposure more intelligently, potentially reducing its tax expense. This is important because taxes are usually the second-largest expense a company incurs. Second, the tax-related regulatory environment is becoming more challenging. Taxes paid to individual countries by multinational corporations have come under greater scrutiny by local tax authorities that suspect these companies of manipulating individual countries’ tax laws to eliminate or substantially reduce local tax obligations. In this environment, it will become increasingly important for companies to have global visibility into their country-by-country tax exposure and options. Longview’s tax software can help companies determine how best to allocate income by jurisdiction. Third, by centralizing all global tax-related data in a single data store as well as automating calculations and the management of all tax-related data, Longview’s tax software can enable greater efficiency in the tax provision process. By saving time and ensuring that all global tax-related data is consistent, it makes it possible for companies to manage their tax exposures more intelligently and practical for them to optimize tax payments by jurisdiction.
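
To make the jurisdictional point concrete, here is a minimal sketch in Python. The jurisdictions, statutory rates and income figures are hypothetical, not Longview’s actual model or any real tax code; the point is only that, once the data is centralized and consistent, the what-if arithmetic itself is straightforward.

```python
# Hypothetical statutory rates; a real analysis is constrained by
# transfer-pricing rules, substance requirements and local law.
RATES = {"US": 0.35, "UK": 0.20, "IE": 0.125}

def total_tax(allocation):
    """Total tax expense for a {jurisdiction: income} allocation."""
    return sum(income * RATES[j] for j, income in allocation.items())

# Two ways of allocating the same 100M of pre-tax income:
print(total_tax({"US": 60e6, "UK": 30e6, "IE": 10e6}))  # 28,250,000
print(total_tax({"US": 40e6, "UK": 30e6, "IE": 30e6}))  # 23,750,000
```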

Corporations that comprise more than a handful of legal entities and operate in multiple direct-tax jurisdictions can achieve significant time savings by adopting dedicated tax software rather than using desktop spreadsheets to manage the tax provision process. That’s because companies with these characteristics face challenges that quickly overwhelm spreadsheets. Direct taxes are extremely complicated because national tax codes are complex and ever-changing. Not only do specifics (such as depreciation schedules or inventory expensing rules) vary from one country to the next, but even basic tax concepts can differ. Then there is a “parallel universe” element, because “tax expense” is not the same as “taxes paid.” Tax and finance departments must be able to track and reconcile these differences and allocate tax expense (paid or deferred) accurately to individual business units. Accounting rules specify when tax expense must be recognized, but recognition can diverge from when those taxes are actually paid. Timing differences are the reason very profitable companies pay nothing to tax authorities in some years and write big checks when they are losing money. A company’s tax position can be fluid, so it’s important to be able to work across multiple tax periods. Adjustments to individual entity tax expenses and positions occur frequently, so accurate adjustments and true-ups across tax periods must be easy to calculate, record and retrieve. Behind this lie myriad specific journal entries to effect changes and the need to assemble a consistent set of account reconciliations to manage the details. The complexity of these processes is a large part of why we have computers.
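
A worked example helps show why “tax expense” and “taxes paid” diverge and later reverse. The sketch below uses invented figures: an asset depreciated straight-line for book purposes but on an accelerated schedule for tax, at an assumed 25 percent rate.

```python
# Hypothetical: a 100-unit asset, straight-line book depreciation over
# four years (25/25/25/25) vs. accelerated tax depreciation (50/30/15/5).
# Tax *expense* follows book income; taxes *paid* follow taxable income;
# the gap each year is the movement in the deferred tax balance.
RATE = 0.25
INCOME_BEFORE_DEPRECIATION = 100.0
book_dep = [25, 25, 25, 25]
tax_dep = [50, 30, 15, 5]

for year, (b, t) in enumerate(zip(book_dep, tax_dep), start=1):
    expense = (INCOME_BEFORE_DEPRECIATION - b) * RATE  # per books
    paid = (INCOME_BEFORE_DEPRECIATION - t) * RATE     # per tax return
    print(f"Year {year}: expense={expense:.2f} paid={paid:.2f} "
          f"deferred={expense - paid:+.2f}")
# The deferred amounts (+6.25, +1.25, -2.50, -5.00) net to zero over
# the asset's life: a pure timing difference.
```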

Yet until recently spreadsheets were the most practical approach because the scale and complexity of the data management required made it difficult for anyone to offer a workable packaged solution. So companies have grown accustomed to using desktop spreadsheets to assemble and analyze data, calculate taxes, generate reports and store the data, analyses, calculations and reports. Our research shows that tax departments rely heavily on desktop spreadsheets for analysis and calculations to support their tax provision and compliance processes. More than half use spreadsheets exclusively and just 10 percent use a dedicated third-party application.

One reason why dedicated tax software makes sense is that, despite talk of reducing the complexity of tax regimes, they are still complex. As corporations grow and expand internationally, their legal entity structure becomes more multifaceted and their source systems for collecting and managing tax data can become fragmented. Unless the tax function is completely centralized, companies that operate in more than a handful of tax jurisdictions can find it hard to coordinate their tax data, calculations and processes. Centralization is not a cure-all, either, as the lack of local presence poses its own tax management issues in coordinating with local operations and finance organizations.

Another reason is that national taxing authorities may be making the tax department’s job more difficult. In 2013, the Organization for Economic Cooperation and Development (OECD) published a report titled “Action Plan on Base Erosion and Profit Shifting,” which describes the challenges national governments face in enforcing taxation in an increasingly global environment with a growing share of digital commerce. The OECD also is providing a forum for member governments to take action (including collective action) to strengthen their tax collection capabilities. Optimizing tax expense across jurisdictions will grow in importance if the OECD’s initiative for global tax reporting gains traction. Companies operating in multiple countries will find it increasingly necessary to understand how best to allocate income between countries to minimize the taxes within the constraints imposed by increased external transparency. Corporations that operate globally will need to be able to gauge how best to manage their tax exposure in an environment where decisions about transfer pricing and corporate organization will require greater care and forethought than today.

Desktop spreadsheets are not well suited to any repetitive collaborative enterprise task or as a corporate data store. They are a poor choice for managing taxes because they are error-prone, lack transparency, are difficult to use for data aggregation, lack controls and have a limited ability to handle more than a few dimensions at a time. Data from corporate sources, such as ERP systems, may have to be adjusted and transformed to put this information into its proper tax context – for example, performing allocations or transforming the data so that it reflects the tax-relevant legal entity structure rather than the corporate management structure. Doing this manually in desktop spreadsheets is time-consuming and prone to errors. Moreover, in desktop spreadsheets it is difficult to parse even moderately complex nested formulas or spot errors and inconsistencies. Pivot tables have only a limited ability to manage key dimensions (such as time, location, business unit and legal entity) in performing analyses and reporting. As a data store, spreadsheets may be inaccessible to others in the organization if they are kept on an individual’s hard drive. Spreadsheets are rarely documented well, so it is difficult for anyone other than the creator to understand their structure and formulas or their underlying assumptions. The provenance of the data in the spreadsheets may be unclear, making it difficult to understand the source of discrepancies between individual spreadsheets as well as making audits difficult. Companies are able to deal with spreadsheets’ inherent shortcomings only by spending more time than they should assembling data, making calculations, checking for errors, creating reports and auditing the results.
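
By way of contrast, here is a minimal sketch of the dimensional re-slicing that defeats pivot tables. It uses the pandas library and invented figures; a tax-sensitized fact table can be cut by any combination of the dimensions named above in a single statement.

```python
import pandas as pd

# Toy fact table with invented data across four of the dimensions the
# text names: period, jurisdiction, business unit and legal entity.
facts = pd.DataFrame([
    {"period": "2015Q1", "jurisdiction": "US", "unit": "Retail",
     "entity": "AcmeCo US Inc.", "tax_expense": 4.2},
    {"period": "2015Q1", "jurisdiction": "UK", "unit": "Retail",
     "entity": "AcmeCo UK Ltd.", "tax_expense": 1.1},
    {"period": "2015Q2", "jurisdiction": "US", "unit": "Wholesale",
     "entity": "AcmeCo US Inc.", "tax_expense": 3.7},
])

# Each new cut is one line here; in a desktop spreadsheet it typically
# means another hand-built pivot table to maintain and reconcile.
print(facts.groupby(["jurisdiction", "period"])["tax_expense"].sum())
print(facts.groupby(["entity", "unit"])["tax_expense"].sum())
```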

Applications for managing taxes such as Longview Tax are making provisioning and reporting faster, more efficient and more reliable. One of the most important elements of such a system is the tax data warehouse that is at the core of Longview Tax. Statutory and tax accounting are not the same, so it’s important for companies to keep a tax-sensitized record of transactions and balances. This speeds calculations, analysis and reporting and improves the accuracy and dependability of the tax department’s work product. By substantially reducing or eliminating the use of desktop spreadsheets, it increases the transparency of the tax process. Segregating tax data from the rest of a company’s enterprise data is also essential because of the need to keep this information in an “as-was” state. Corporations buy and sell business units as well as reorganize on an ongoing basis. They then adjust their financial and management accounting systems to reflect the current needs of the business. Tax authorities, however, are concerned with the individual legal entities that make up a corporation as they existed in a given fiscal period. Maintaining all tax data together – including all of the minutiae of individual account adjustments and true-ups within and between periods – facilitates tax audit defense.
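
One way to picture the “as-was” requirement is an effective-dated record of legal entity ownership. The sketch below is a simplified illustration with invented entities, not Longview’s actual schema: storing validity dates lets the tax data store answer questions about the structure as it stood in a past fiscal period, regardless of later reorganizations.

```python
from datetime import date

# (entity, parent, valid_from, valid_to) with invented names and dates.
ownership = [
    ("AcmeCo UK Ltd.", "AcmeCo Holdings", date(2010, 1, 1), date(2013, 6, 30)),
    ("AcmeCo UK Ltd.", "NewCo Group", date(2013, 7, 1), date.max),
]

def parent_as_of(entity, as_of):
    """Return the parent of `entity` as the structure stood on `as_of`."""
    for ent, parent, start, end in ownership:
        if ent == entity and start <= as_of <= end:
            return parent
    return None

print(parent_as_of("AcmeCo UK Ltd.", date(2012, 12, 31)))  # AcmeCo Holdings
print(parent_as_of("AcmeCo UK Ltd.", date(2015, 12, 31)))  # NewCo Group
```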

Beyond handling data better, Longview Tax also makes people working in tax departments much more efficient and the results of their work more accurate and transparent. It ensures that the process of capturing the difference between book (statutory) accounting and tax accounting is efficient, accurate and consistent. It enables corporations to standardize processes and reporting to simplify training and streamline reviews and audits. Its dimensional capabilities enable a company to instantly and consistently create and publish reports that conform to the requirements of different taxing authorities – including, for example, different currencies and different depreciation methods for the exact same assets. By eliminating the need for people to perform repetitive tasks that require little judgment, it enables the department to increase its value and become more strategic to the corporation.

Longview Tax provides bidirectional integration with a company’s ERP system, data warehouses and tax compliance software to ensure the fidelity of data at every step in tax-related processes. It gives administrators the ability to define and monitor tax department tasks to ensure all steps are performed and alerts them to delays. It provides an Excel add-in to give users a familiar environment in which to work and an ability to do ad-hoc calculations while eliminating almost all of the defects associated with desktop spreadsheets. Data and formulas are stored in a central database, ensuring quality and consistency. All of this promotes accuracy, reduces the risk of errors in calculations and presentations and can substantially cut the amount of time people in tax departments spend checking and reconciling their spreadsheets, which also boosts efficiency. And because Longview offers software for statutory consolidation, disclosures and external financial reporting, the Tax application supports the creation of required financial statement footnotes and tax disclosures.

Just improving the efficiency of a tax department can justify investing in a tax application because of the time saved, greater accuracy and increased transparency. Companies may also find that by speeding the process of assembling tax data and performing the necessary calculations they have more time to consider their options on where and when they report income. I recommend that any company operating in more than a handful of tax jurisdictions consider using a dedicated application for tax analysis, provisioning and reporting, and that it consider Longview Tax for that role.

Regards,

Robert Kugel – SVP Research


Is NPS the Best Measure of Customer Experience?

Recently my colleague Tony Cosentino wrote an analyst perspective asserting that big data analytics will displace net promoter score (NPS) as a more effective way of measuring the entire customer experience. This prompted a response from Maxie Schmidt-Subramanian asserting that big data and NPS aren’t the only ways to measure customer experience success. The main point of Tony’s piece, as I interpret it, is that NPS is just a number, but big data analytics can reveal much more about customer behavior and intentions, and it can link these to business outcomes. On the other hand, Maxie argues that whether or not companies use NPS, when it comes to measuring the customer experience, they rely too much on surveys, and no one metric does the entire job. While to a large extent I agree with both arguments, from a business perspective I don’t think either addresses three very important questions. First, what actually is the customer experience? Second, how should it be measured? And third, what is the best use of big data in relation to customer experience?

I recently wrote about how to deliver EPIC customer experiences. This acronym includes four elements that go a long way toward defining a superior customer experience: It must be Easy (in availability of channels at times of the customer’s choice, and in use of technology), Personalized, In context (reflecting previous interactions) and above all Consistent (presenting the same timely information regardless of channel, whether assisted or self-service). That said, I believe that what is most important, for both customer and company, is the outcome of the interaction. Was the problem resolved to the customer’s satisfaction? Did the caller find and purchase the products or services being sought? Of course there are other considerations such as the cost of the interaction and the customer’s subsequent value to the organization.

Regarding the second question, various metrics are useful to assess different outcomes and the true customer experience. Our benchmark research into next-generation customer analytics illustrates this point, showing that companies use on average 11 metrics to assess customer-related activities: Among the most widely used, three are financial (adherence to budget, customer service costs and customer profitability), five are process-oriented (including call outcomes, performance vs. service level agreement and agent quality scores), and three are customer-specific (customer satisfaction, cost to serve and lifetime value). Perhaps contrary to popular opinion, NPS ranked only fifth among customer-specific metrics. Our research also finds that improving the customer experience is a top priority and a driver of improvement in 63 percent of organizations. Overall the results strongly suggest that most companies are undecided on how to measure the customer experience, but they seem to agree that one metric isn’t enough.

That brings us to big data, and to analytics applied to it. Companies, especially large ones serving consumers, have always had a lot of customer data, ranging from CRM, ERP, billing and other business applications to interaction-related data in call recordings, email, letters and other forms. Recently the volume and variety both have increased significantly because companies often have web, email, IVR recordings, text records, social media surveys, Web scripts, chat scripts, instant messages, social media posts, video recordings and output from mobile apps. But most companies can’t do much with all this data. Our benchmark research into next-generation customer analytics finds that the most common tools used to produce metrics, reports and analysis are spreadsheets and general-purpose business intelligence tools. While each of these has its uses, both require considerable manual effort, and neither can process unstructured data (such as voice, text and events) or expose insights from the content; they can’t, for example, determine customer satisfaction from what was said. Nor do they make it easy to gather data from multiple sources; for example, before purchasing a new product, a customer might have visited the company website several times, had chat sessions with contact center agents and phone calls with people in several business groups, filled out feedback surveys and posted a comment on social media complaining how difficult the process had been. To do all of this, systems must be able to link data from multiple sources and apply data, speech and text analytics; these are common capabilities of several big data analytics products. The big data foundation matters: our next-generation customer analytics research finds that it is the second most important technology category for customer analytics, cited by 60 percent of companies (behind collaboration), and that 44 percent of companies are using it today. My main point is that big data is ultimately not just about volumes and speed of change but about understanding the data companies have and putting the information to use to deliver the desired outcomes.
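
To illustrate the linkage step, here is a deliberately naive sketch: it stitches one customer’s interactions from several channels into a single record and applies a crude wordlist-based sentiment score. The records and wordlist are invented, and real speech and text analytics are far more sophisticated; the point is only that linking and scoring must happen together.

```python
# Invented multichannel interaction records for one customer.
interactions = [
    {"customer": "C123", "channel": "web", "text": "browsed pricing page"},
    {"customer": "C123", "channel": "chat", "text": "agent made this so difficult"},
    {"customer": "C123", "channel": "social", "text": "frustrated with the process"},
]

# A toy negative-word list standing in for real text analytics.
NEGATIVE = {"difficult", "complaint", "frustrated", "cancel"}

def journey(customer):
    """All of one customer's interactions plus a crude sentiment score."""
    steps = [i for i in interactions if i["customer"] == customer]
    score = -sum(w in NEGATIVE for i in steps for w in i["text"].split())
    return steps, score

steps, score = journey("C123")
print(len(steps), "touchpoints, sentiment score", score)  # 3 touchpoints, -2
```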

My most recent research studies show that the majority of companies run their communication channels independently of each other and that business groups chase their own goals with little collaboration between them; these disconnections are among the reasons most customer experiences are far from EPIC. To improve, we recommend that companies take the following steps. First and foremost in a multichannel world is understanding actual customer journeys, which I have written about. These journeys cross channels and business groups, extend throughout the customer life cycle and differ for individual products and services. Big data is needed to ingest and process the great volumes and many types of data involved, including all the data associated with a named customer, and analytics is necessary to produce analysis and metrics. These tools can help companies understand the outcomes of all those journeys and identify ways to improve them. In addition, companies can benefit from using predictive analytics to examine past journeys and use them, together with scenarios, to predict likely outcomes of current or future journeys; for example, if customers who go down a certain path often stop being customers, the company should find ways to influence them to take more productive paths.
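
The path-based prediction can be illustrated very simply: count historical outcomes per observed journey and flag the paths that most often end badly. The journey history below is invented, and a production system would apply proper predictive models to far more data, but the logic is the same.

```python
from collections import defaultdict

# Invented history: (sequence of touchpoints, observed outcome).
history = [
    (("web", "chat", "phone"), "churned"),
    (("web", "chat", "phone"), "churned"),
    (("web", "chat"), "retained"),
    (("web", "phone"), "retained"),
]

outcomes = defaultdict(lambda: {"churned": 0, "retained": 0})
for path, outcome in history:
    outcomes[path][outcome] += 1

for path, counts in outcomes.items():
    total = counts["churned"] + counts["retained"]
    print(" > ".join(path), f"churn rate {counts['churned'] / total:.0%}")
```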

Second, companies should rethink the metrics they use. Our customer analytics research finds that companies often claim to be trying to improve one aspect of service – for example, customer satisfaction – but measure another – say, average handling time. Once again metrics should align with desired outcomes: If cost control is important, measuring handling times makes sense as these correlate directly with costs, but if customer satisfaction matters most, choose related metrics such as satisfaction measured over time and customer value. Our next-generation customer analytics research finds that the largest benefit of analytics is improving the customer experience, according to more than half (55%) of organizations.

Thus it is clear that companies need a balanced set of metrics that are directly related to what they are trying to achieve and are shared across the organization. The last point is very important and ties to Maxie’s point that “humans need a concept to rally around.” For example, I know of a company in which everyone’s compensation depended to some extent on customer satisfaction scores. Leaving aside whether they were measuring this objectively, it stopped employees from doing things that might result in bad customer experiences and thus lower customer satisfaction scores; one obvious example is selling customers the wrong product. One metric I endorse is customer lifetime value. This is an outcome metric that addresses both sides of the cost and revenue equation, is a strong indicator of customer loyalty and reflects both customer experience and employee performance.
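
Customer lifetime value can be defined in many ways; one common simplified formulation discounts expected margin over the life of the relationship and nets out the cost of acquisition. The figures below are hypothetical, a minimal sketch rather than a prescribed formula.

```python
def lifetime_value(annual_margin, retention, discount, acquisition_cost,
                   horizon_years=10):
    """Present value of a customer under constant retention and margin."""
    value, survival = -acquisition_cost, 1.0
    for year in range(1, horizon_years + 1):
        survival *= retention  # probability the customer is still active
        value += annual_margin * survival / (1 + discount) ** year
    return value

# Hypothetical inputs: 200 annual margin, 80% retention, 10% discount
# rate, 150 to acquire the customer.
print(round(lifetime_value(200, 0.8, 0.1, 150), 2))
```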

To build on Tony’s and Maxie’s analyses let me finish with four observations:

  • The right analytics, whether called big data or not, can reveal more about the customer experience than any metric. It can also precisely calculate metrics such as lifetime value that require multiple data sources.
  • It is likely that companies will go on using surveys, albeit using more channels, as a means of gaining feedback from customers. However, companies can gain more value by using speech and text analytics to gain broader insights that reflect customers’ feelings and predict their likely actions.
  • Companies should adopt real-time or near-real-time customer journey maps, showing outcomes and including predictive capabilities, to help manage and improve the customer experience.
  • There is no golden customer experience metric; customer lifetime value is probably the closest. So it is necessary to use multiple metrics, which companies should share more widely and use to drive action, as there is no point in metrics for metrics’ sake.

Customer experience has become a key differentiator for many companies. However, getting it right is not easy. So I recommend that organizations take my observations into account as they strive to create more loyal and thus more valuable customers.

Regards,

Richard J. Snow

VP & Research Director


Skills Gap Challenges Potential of Predictive Analytics

The Performance Index analysis we performed as part of our next-generation predictive analytics benchmark research shows that only one in four organizations, those functioning at the highest Innovative level of performance, can use predictive analytics to compete effectively against others that use this technology less well. We analyze performance in detail in four dimensions (People, Process, Information and Technology), and for predictive analytics we find that organizations perform best in the Technology dimension, with 38 percent reaching the top Innovative level. This is often the case in our analyses, as organizations initially perform better in the details of selecting and managing new tools than in the other dimensions. Predictive analytics is not a new technology per se, but the difference is that it is becoming more common in business units, as I have written.

In contrast to organizations’ performance in the Technology dimension, only 10 percent reach the Innovative level in People and only 11 percent in Process. This disparity uncovered by the research analysis suggests there is value in focusing on the skills that are used to design and deploy predictive analytics. In particular, we found that one of the two most-often cited reasons why participants are not fully satisfied with the organization’s use of predictive analytics is that there are not enough skilled resources (cited by 62%). In addition, 29 percent said that the need for too much training or customized skills is a barrier to changing their predictive analytics.

The challenge for many organizations is to find the combination of domain knowledge, statistical and mathematical knowledge, and technical knowledge that they need to integrate predictive analytics into other technology systems and into operations in the lines of business, which I also have discussed. The need for technical knowledge is evident in the research findings on the jobs held by individual participants: Three out of four require technical sophistication. More than one-third (35%) are data scientists who have a deep understanding of predictive analytics and its use as well as of data-related technology; one-fourth are data analysts who understand the organization’s data and systems but have limited knowledge of predictive analytics; and 16 percent described themselves as predictive analytics experts who have a deep understanding of this topic but not of technology in general. The research also finds that those most often primarily responsible for designing and deploying predictive analytics are data scientists (in 31% of organizations) or members of the business intelligence and data warehouse team (27%). This focus on business intelligence and data warehousing represents a shift toward integrating predictive analytics with other technologies and indicates a need to scale predictive analytics across the organization.

In only about half (52%) of organizations are the people who design and deploy predictive analytics the same people who utilize the output of these processes. The reasons research participants most often cited for users of predictive analytics not producing their own analyses are that they don’t have enough skills training (79%) and don’t understand the mathematics involved (66%). The research also finds evidence that skills training pays off: Fully half of those who said they received adequate training in applying predictive analytics to business problems also said they are very satisfied with their predictive analytics; percentages dropped precipitously for those who said the training was somewhat adequate (8%) or inadequate (6%). It is clear that professionals trained in both business and technology are necessary for an organization to successfully understand, deploy and use predictive analytics.

To determine the technical skills and training necessary for predictive analytics, it is important to understand which languages and libraries are used. The research shows that the most common are SQL (used by 67% of organizations) and Microsoft Excel (64%), with which many people are familiar and which are relatively easy to use. The three next-most commonly used are much more sophisticated: the open source language R (by 58%), Java (42%) and Python (36%). Overall, many languages are in use: Three out of five organizations use four or more of them. This array reflects the diversity of approaches to predictive analytics. Organizations must assess what languages make sense for their uses, and vendors must support many languages for predictive analytics to meet the demands of all customers.
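
For a sense of what the more sophisticated end of that list looks like, here is a minimal predictive model in Python, one of the languages the research cites. The scikit-learn library and the toy churn data are my own choices for illustration; the research did not name specific libraries or use cases.

```python
from sklearn.linear_model import LogisticRegression

# Invented training data: [tenure_years, support_tickets] per customer,
# labeled 1 if the customer churned and 0 if not.
X = [[1, 5], [2, 3], [5, 0], [7, 1], [0.5, 8], [6, 2]]
y = [1, 1, 0, 0, 1, 0]

model = LogisticRegression().fit(X, y)

# Estimated churn probability for a new customer. Even this "simple"
# example requires judgment about features, model choice and validation,
# which is where the skills gap bites.
print(model.predict_proba([[1, 6]])[0][1])
```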

The research thus makes clear that organizations must pay attention to a variety of skills and how to combine them with technology to ensure success in using predictive analytics. Not all the skills necessary in an analytics-driven organization can be combined in one person, as I discussed in my analysis of analytic personas. We recommend that as organizations focus on the skills discussed above, they consider creating cross-functional teams from both business and technology groups.

Regards,

Tony Cosentino

VP and Research Director


 

Copyright © 2015 Ventana Research. All Rights Reserved.