Ventana Research Analyst Perspectives provide unique fact-based insights and education on business, industry and technology vendor trends. Each Analyst Perspective presents the voice of the analyst, typically a practice leader and established subject matter expert, reporting on new developments, the findings of benchmark research, market shifts and best practice insights. Each Analyst Perspective is prepared in accordance with Ventana Research’s strict standards for accuracy and objectivity and reviewed to ensure it delivers reliable, actionable news and insights.

Denodo Makes Data Virtualization Relevant to Big Data and Analytics

Data virtualization is not new, but it has changed over the years. The term describes a process of combining data on the fly from multiple sources rather than copying that data into a common repository such as a data warehouse or a data lake, which I have written about. There are many reasons for an organization concerned with managing its data to consider data virtualization, most stemming from the fact that the data does not have to be copied to a new location. It could, for instance, eliminate the cost of building and maintaining a copy of one of the organization’s big data sources. Recognizing these benefits, many database and data integration companies offer data virtualization products. Denodo, one of the few independent, best-of-breed vendors in this market today, brings these capabilities to big data sources and data lakes.

Google Trends presents a graphic representation of the decline in popularity of the term data federation and the rise in popularity of the term data virtualization over time. The change in terminology corresponds with a change in technology. The industry has evolved from a data federation approach to today’s cost-based optimization approach. In a federated approach, queries are sent to the appropriate data sources without much intelligence about the overall query or the cost of the individual parts of the federated query. Each underlying data source performs its portion of the workload as best it can and returns the results. The various parts are combined and additional post-processing is performed if necessary, for example to sort the combined result set.

Denodo takes a different approach. Its tools consider the costs of each part of the individual query and evaluate trade-offs. As the saying goes, there’s more than one way to skin a cat; in this case there’s more than one way to execute a SQL statement. For example, suppose you wish to create a list of all sales of a certain set of products. Your company has 1,000 products (maintained in one system) and hundreds of millions of customer transactions (maintained in another system). The federated approach would bring both data sets to the federated system, join them and then find the desired subset of products. An alternative would be to ship the table of 1,000 products to the system that holds the customer transactions, load it as a temporary table and join it to the customer transaction data to identify the desired subset before sending the product data back to its source. Today’s data virtualization evaluates the costs in time of the two alternatives and selects the one that would produce the result set the fastest.
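To make the trade-off concrete, here is a minimal sketch in Python of cost-based plan selection, assuming a simplified cost model based only on data movement; the numbers and function names are illustrative, not Denodo’s actual optimizer.

```python
# Illustrative cost-based plan selection for a two-source join.
# The cost model (rows moved x bytes per row) is a simplifying
# assumption, not Denodo's optimizer logic.

def federated_pull_cost(product_rows, txn_rows, row_bytes=100):
    """Plan A: pull both tables to the virtualization layer, then join."""
    return (product_rows + txn_rows) * row_bytes

def ship_small_table_cost(product_rows, matching_txn_rows, row_bytes=100):
    """Plan B: ship the small product table to the transaction system,
    join there and return only the matching rows."""
    return (product_rows + matching_txn_rows) * row_bytes

def choose_plan(product_rows, txn_rows, matching_txn_rows):
    plan_a = federated_pull_cost(product_rows, txn_rows)
    plan_b = ship_small_table_cost(product_rows, matching_txn_rows)
    if plan_a <= plan_b:
        return ("federated pull", plan_a)
    return ("ship small table", plan_b)

# 1,000 products, 300 million transactions, 5 million of which match
print(choose_plan(1_000, 300_000_000, 5_000_000))
# -> ('ship small table', 500100000)
```

With these illustrative numbers, shipping the product table moves roughly 60 times less data than pulling the full transaction table to the virtualization layer, so the optimizer would pick Plan B.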

Data virtualization can make it easier, and therefore faster, to set up access to data sources in an organization. Using Denodo, users connect to existing data sources, which become available as a virtual resource. In the case of data warehouses or data lakes, this virtual representation is often referred to as a logical data warehouse or a logical data lake. No matter how hard an organization works to consolidate data into a central repository, there are often pieces of data that must be combined from multiple sources. We find that such issues are common: In our big data integration benchmark research, about one-fourth (26%) of organizations said that data virtualization is a key activity for their big data analytics, yet only 14 percent said they have adequate data virtualization capabilities.

Not all the work is eliminated by data virtualization. You must still design the logical model for the data that you want to provide, such as which tables and which columns to include, but that’s all. Virtualization eliminates load processes and the need to update the data. In the case of big data, there are no extra clusters to set up and maintain. The logical data warehouse or data lake uses the security and governance system already in place. As a result, users can avoid some of the organizational battles about data access since the “owner” of the data continues to maintain the rights and restrictions on the data. Our research shows that organizations that have adequate data virtualization capabilities are more often satisfied with the way their organization manages big data than are organizations as a whole (88% vs. 58%) and are more confident in the data quality of their big data integration efforts (81% vs. 54%).
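To give a sense of what that logical modeling step involves, here is a minimal sketch of a hypothetical logical view definition; the structure and names are illustrative and do not represent Denodo’s actual modeling interface.

```python
# Hypothetical logical (virtual) view: the virtual columns are only
# mappings onto live sources; no data is copied or loaded.

LOGICAL_SALES_VIEW = {
    "name": "sales_by_product",
    "columns": {
        # virtual column -> (source system, table, column)
        "product_name": ("erp", "products", "name"),
        "product_line": ("erp", "products", "line"),
        "sale_amount": ("warehouse", "transactions", "amount"),
        "sale_date": ("warehouse", "transactions", "txn_date"),
    },
    # join condition resolved at query time against the live sources
    "join": ("erp.products.id", "warehouse.transactions.product_id"),
}
```

Queries against such a view are decomposed and pushed down to the underlying systems at run time, which is why no load or refresh process is needed.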

In its most recent release, version 6.0, Denodo enhanced its cost-based query optimizer for data virtualization. Many of the optimizer’s features would be found in any decent relational database management system, but the challenge becomes greater when the underlying resources are scattered among multiple systems. To address this issue, Denodo collects and maintains statistics about the various data sources; these statistics are evaluated at run time to determine the optimal way to execute queries. The product offers connectivity to a variety of data sources, both structured and unstructured, including Hadoop, NoSQL, documents and websites. It can be deployed on premises, in the cloud using Amazon Web Services or in a hybrid configuration.
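As a rough illustration of the statistics side, the sketch below shows a per-source statistics registry of the kind a cost-based optimizer might consult at run time; the field names and cost formula are my assumptions, not Denodo’s internals.

```python
import time
from dataclasses import dataclass

@dataclass
class SourceStats:
    source: str           # e.g. "hadoop_cluster" or "crm_db"
    table: str
    row_count: int        # cardinality estimate used in cost calculations
    avg_row_bytes: int    # drives data-movement estimates
    collected_at: float   # stale statistics can mislead the optimizer

STATS = {}

def refresh_stats(source, table, row_count, avg_row_bytes):
    """Record fresh statistics, e.g. gathered by sampling the source."""
    STATS[(source, table)] = SourceStats(source, table, row_count,
                                         avg_row_bytes, time.time())

def transfer_cost(source, table):
    """Estimated bytes moved if this table is pulled over the network."""
    s = STATS[(source, table)]
    return s.row_count * s.avg_row_bytes

refresh_stats("warehouse", "transactions", 300_000_000, 100)
print(transfer_cost("warehouse", "transactions"))  # 30000000000
```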

Performance can be a key factor in user acceptance of data virtualization; users will balk if access is too slow. Denodo has published some benchmarks showing that performance of its product can be nearly identical to accessing data loaded into an analytical database. I never place much emphasis on vendor benchmarks as they may or may not reflect an actual organization’s configuration and requirements. However, the fact that Denodo produces this type of benchmark indicates its focus on minimizing the performance overhead associated with data virtualization.

When I first looked at Denodo, prior to the 6.0 release, I expected to see more optimization techniques built into the product. There’s always room for improvement, but with the current release the company has made great strides and addressed many of these issues. In order to maximize the software’s value to customers, I’d like to see the company invest in developing more technology partnerships with providers of data sources and analytic tools. Users would also find it valuable if Denodo could help manage and present consolidated lineage information. Not only do users need access to data, they need to understand how data is transformed both inside and outside Denodo.

If your organization is considering data virtualization technology, I recommend you evaluate Denodo. The company won the 2015 Ventana Research Technology Innovation Award for Information Management, and its customer Autodesk won the 2015 Leadership Award in the Big Data Category. If your organization is deluged with big data but is not considering data virtualization, it probably should be. As our research shows, it can lead to greater satisfaction with and more confidence in the quality of your data.

Regards,

David Menninger

SVP & Research Director

Follow Me on Twitter @dmenningerVR and Connect with me on LinkedIn.


Upstream Works Delivers Omnichannel Experience for Customers

Since I last wrote about Upstream Works, it has expanded its focus on contact center agent efficiency and effectiveness to include omnichannel customer experience. Each of its core products has undergone a number of developments. Its main product now is Upstream Works for Finesse, which it classifies as a smart agent desktop. This desktop application enables users of contact center systems to access the information and systems they need to resolve interactions and prompts them with next best steps to complete each interaction efficiently and effectively. Upstream Works has a close working agreement with Cisco, so the product is available only to users of the Cisco Finesse product.

Upstream Works for Finesse achieves these objectives through a number of capabilities. It provides an intuitive user interface that is the same no matter the type of interaction or the channel of engagement. The user’s workspace can be tailored to the type of interaction and be set up to navigate from one system to another depending on the interaction flow. The product provides various ways to connect with any type of system, whether it’s a business application or a technology system managing a specific communication channel. It can extract data from one system and deliver it to any other system integrated into the desktop, making it possible to access all customer information and connect multiple communication systems.

Combining systems enables users to connect different customer identifiers with a single customer, which in turn supports the production of customer journey maps. Some of this mapping is automated by connecting data across systems, while some requires manual input. The ability to extract data from multiple systems enables interactions to be personalized and placed in the context of previous interactions; for example, suppose that a customer began by searching the home page, engaged in a chat session and made a phone call, and then the customer and agent shared some Web pages. Upstream Works for Finesse thus improves the agent experience, and by supplying the information needed to personalize interactions and deliver consistent responses, it improves the customer experience as well. Overall interaction handling becomes more efficient and effective, thus helping to meet operational targets and produce desired business outcomes.
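As a sketch of the identifier-linking idea, the code below uses a generic union-find (identity-stitching) approach to merge per-channel identifiers into one customer and order the touchpoints into a simple journey; this illustrates the general technique, not Upstream Works’ implementation.

```python
from collections import defaultdict

class IdentityGraph:
    """Generic identity stitching via union-find; illustrative only."""

    def __init__(self):
        self.parent = {}

    def _find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def link(self, id_a, id_b):
        """Record that two identifiers belong to the same customer."""
        self.parent[self._find(id_a)] = self._find(id_b)

    def journeys(self, events):
        """Group (identifier, timestamp, channel) events by customer."""
        grouped = defaultdict(list)
        for ident, ts, channel in sorted(events, key=lambda e: e[1]):
            grouped[self._find(ident)].append((ts, channel))
        return dict(grouped)

g = IdentityGraph()
g.link("web:cookie123", "chat:alice@example.com")    # email given in chat
g.link("chat:alice@example.com", "phone:+15550100")  # caller matched to email
print(g.journeys([("web:cookie123", 1, "web search"),
                  ("chat:alice@example.com", 2, "chat"),
                  ("phone:+15550100", 3, "phone call")]))
# -> {'phone:+15550100': [(1, 'web search'), (2, 'chat'), (3, 'phone call')]}
```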

These capabilities are underpinned by a single set of management and reporting tools. The management tools include capabilities for all aspects of a contact center, from managing agents, teams, tasks and skills to setting up routing rules for all types of interactions. Its integration capabilities and internal data mart support reporting and analysis of all aspects of interaction handling, and its predictive analytics supports prompting of agents with information or the next best action.

A smart desktop is one of the six technologies I identified that organizations need in order to provide customers with easy-to-use, personalized, in-context and consistent experiences. This is especially true for companies that support multiple channels of engagement. Our benchmark research into next-generation customer engagement shows that the three issues organizations most often struggle with in this regard are integration of systems (49%), channels managed as silos (47%) and inconsistency of responses (33%). To overcome each of these, a smart agent desktop is a pragmatic option. It doesn’t integrate systems per se but enables access to all systems and can pull and push data to as many systems as required. It also supports a common set of rules to route interactions and a common user interface to handle any type of interaction, and it has built-in rules and capabilities to present the same information regardless of the channel or the user.

I therefore recommend that all Cisco users who want to improve agent performance and customer experience evaluate how Upstream Works for Finesse can help those efforts.

Regards,

Richard J. Snow

VP & Research Director, Customer

Follow Me on Twitter and Connect with me on LinkedIn.


Unit4 Adds Financial Performance Management with Prevero Acquisition

Unit4, a Netherlands-based vendor of financial management software focused mainly on midsize companies, recently acquired Prevero, a German vendor of performance management and business intelligence software. The acquisition reflects a convergence of transactional and analytic business applications, which I have written about. ERP and financial management software vendors increasingly are adding analytic capabilities – especially in financial performance management (FPM) – to the core functions of transaction processing and accounting to broaden the scope of their offerings.

For users of finance software, the addition of analytic capabilities makes it easier to obtain useful information directly from their ERP system. Our Office of Finance benchmark research finds that companies are split on information availability: Half (50%) of participants said that it’s easy or very easy to get information from their ERP system, but nearly as many (48%) said it isn’t easy. One benefit of having analytics built into a transaction system such as ERP or financial management is that it automates and therefore often speeds up the transformation of data that’s collected into useful, digestible information. Our next-generation finance analytics research finds a tangible business benefit in doing this. Nearly all (86%) companies that said they have up-to-date data also said they are able to respond to changes in business conditions in a coordinated fashion, compared to 38 percent in companies that said that most (but not all) data is current and just 19 percent of those whose data is less than up-to-date. From the vendors’ perspective, the integration of the two categories helps them increase revenue from customers, differentiate their offerings in a highly commoditized market and enhance the “stickiness” of the software by increasing the number of process and user touch points in customer organizations.

Midsize companies have essentially the same capability requirements as larger enterprises, but they typically have less money and fewer IT resources to acquire and maintain business software. Vendors that focus on this market segment have sought to respond to this situation by enabling specific types of businesses to cut implementation times and simplify maintenance. In Unit4’s case these categories include business and professional services, higher education, nonprofit and government. Cloud-based applications sold as a service address the cost and IT resource challenges better than on-premises systems by cutting the initial investment and eliminating the need for internal staff to manage the software. Integrating FPM software adds substantial value because it greatly simplifies the process of getting useful information out of the ERP system (in the form of reports, up-to-the-minute dashboards and scorecards) as well as planning, budgeting and statutory consolidations that interact with the transaction systems. This is important for business and professional services companies, which need to minimize administrative staff, and for higher education, nonprofit and government organizations, which have limited operating budgets and historically have had a hard time attracting IT talent.

The Prevero acquisition is a strategic step for Unit4 since it is likely the most cost-effective approach to adding analytics (including purpose-built predictive analytics) and FPM to its financial management offerings. This purchase has the potential to increase the company’s annual recurring revenue from new and existing customers; in any case, such capabilities will increasingly be necessary for any vendor to remain competitive in the ERP and financial management categories. Prevero’s project management capabilities also are a good fit for Unit4’s professional services vertical and a useful feature for bridging annual budgeting and long-term planning, in which projects and major initiatives can span multiple fiscal years.

Practically speaking, however, Unit4 faces several challenges in absorbing Prevero, beginning with integration of the software. Initially it can be easier to achieve a workable integration of cloud-based products than on-premises ones, but shortcuts may not eliminate the need for more comprehensive changes over the longer term. This is especially important for creating offerings tailored to the specific needs of targeted industries. For example, Prevero’s user interface is adequate today but will require significant updating to remain competitive. Although the two product lines are a great fit, financial management and FPM have different audiences in the buying process (even if the CFO and the controller are important in making the ultimate decision for both). Additionally, Unit4 will need to devise and implement sales training and marketing programs for effective cross-selling. Furthermore, part of Unit4’s growth strategy is to expand its presence in the North American market, but Prevero’s customers are mainly in Europe, and there are subtle but important cultural differences between the two markets (for instance, in attitudes toward the budgeting process) that will have to be addressed in localization of the software.

Nevertheless, bringing integrated FPM and analytics capabilities to Unit4’s financial management software can benefit both current and potential customers. I recommend that they monitor the product roadmap closely to understand when specific capabilities will become available. Prevero’s customers are likely to benefit from the investments that Unit4 will make in the software generally and the user experience in particular. At the same time, there are always uncertainties when any software company is acquired. Interested parties should watch how Unit4 addresses them over the coming year and whether it clearly communicates its intentions and progress toward these objectives.

Regards,

Robert Kugel

Senior Vice President Research

Follow Me on Twitter @rdkugelVR and Connect with me on LinkedIn.


 

Copyright © 2016 Ventana Research All Rights Reserved