The Ventana Research Blog is the hottest place to get the inside scoop on business, IT, technology and industry issues. Ventana Research routinely posts new entries on hot topics and issues that you should know about. We encourage you to submit comments so you and other members can collaborate. Commenting on this blog is available only to members of the Ventana Research Community.

New Generation of Learning Management Systems

Learning is an integral component of human capital management, and a new generation of learning management systems advances learning in organizations around the world. These systems have evolved over the years from a classroom scheduling tool that facilitated instructor-led and classroom training into an array of enterprise applications that deliver and track various types of training. Recently, new technologies such as business analytics, cloud computing, social collaboration and mobile technology have become part of the learning management process. To assess the impacts of this ongoing shift, Ventana Research is conducting benchmark research on how organizations are implementing and using this new generation of systems.

Until now learning management systems have been designed to take defined content from a few instructional sources and instructors and distribute it through an organization according to a defined process flow. While that approach will always have a place in accomplishing some basic learning objectives, customers are now seeking to apply new tools to learning: tools that enable collaboration in the learning process, access through mobile devices and embedded advanced analytics that help evaluate the effectiveness of learning programs.

The previous generations of learning management systems were limited in flexibility. They could use only a few types of content and lacked the ability to share knowledge widely and easily. In addition, providing learning content only through a Web browser limits access to those with laptops and network connections, excluding employees and managers who travel frequently from convenient access to learning content and related tasks. Now advances in mobile technology for learning management are coming to market.

Other processes in human capital management also are affected by these next-generation technologies, as several instances of our benchmark research show. In our social collaboration research more than half (57%) of participants said social tools are important in recruiting. In research on human capital analytics, collaboration is the most sought-after technology enhancement, by more than half of organizations. This also is the case in our next-generation workforce management research. It is clear that business is making collaboration a priority for HCM in general, and learning management will adopt it as well. Social collaboration provides several ways for employees to engage with others, empowering contact with knowledge sources, tracking activity streams, broadcasting information and using video for live or recorded sharing. Even newer methods measure outcomes through gamification techniques that rate individuals’ activity and provide rewards that enhance their stature in the organization. These approaches can augment more formal learning and encourage people to embrace it more.

In addition, our human capital analytics benchmark shows that organizations are employing more complex learning metrics using advanced analytics tools within learning systems. The learning metrics that most companies are tracking today are performance after learning has occurred (which 64% do) and retention rates, which correlate learning to retention of workers (39%). These and other sophisticated metrics require the ability to combine data from other systems with learning management data. The research shows that companies that have human capital analytics systems to facilitate collection of data from multiple systems are more satisfied (86%) with their system than are those whose system lacks such capabilities.
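The kind of cross-system metric described above can be sketched in a few lines of code. The minimal example below joins learning completion records with employment status to compare retention rates; all record layouts, IDs and values are invented purely for illustration and do not come from any particular learning management or HR system.

```python
# Hypothetical records from two systems: an LMS and an HR system.
lms_completions = {"e1", "e2", "e3"}   # employee IDs that completed training

hr_status = {                           # employee ID -> still employed a year later?
    "e1": True, "e2": True, "e3": False,
    "e4": False, "e5": True,
}

def retention_rate(employee_ids):
    """Share of the given employees still employed, per the HR system."""
    retained = sum(1 for e in employee_ids if hr_status[e])
    return retained / len(employee_ids)

# Correlate learning to retention: trained vs. untrained populations.
trained = retention_rate(lms_completions)
untrained = retention_rate(set(hr_status) - lms_completions)
print(f"retention, trained: {trained:.2f}; untrained: {untrained:.2f}")
```

The point of the sketch is the join: neither system alone can produce the metric, which is why the research ties satisfaction to systems that collect data from multiple sources.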

The objectives of our new learning management benchmark research will include interest levels in the different next-generation technologies, rates of adoption and specific organizational areas where more investment is occurring or desired. It also will explore how different segments rate and use this technology, comparing employees to managers and executives, companies of various sizes and industry sectors. It will assess intentions for integrating next-generation learning management with other HCM applications, particularly talent and workforce management, and how that may change from past learning management efforts. Learning plays an integral role from onboarding new employees through career development and in compliance with policies and regulations. Evaluating these and other related questions can reveal where adopters may find the greatest return on investment in advanced learning management.

Overall this benchmark research will provide valuable insights for organizations using or planning to use advanced learning management systems. The early insights this research uncovers into best practices and benefits will help organizations save time and resources and make shrewd technology investments. With the global economy growing and companies competing for the best available talent, understanding how advanced learning management systems can help retain talent, share knowledge more effectively and increase productivity can be a competitive differentiator. Please look here for my analysis of advancements and key insights from this benchmark research.

Regards,

Stephan Millard

VP & Research Director


Finance Analytics Requires Data Quality

Our research consistently finds that data issues are a root cause of many problems encountered by modern corporations. One of the main causes of bad data is a lack of data stewardship – too often, nobody is responsible for taking care of data. Fixing inaccurate data is tedious, and creating IT environments that build quality into data is far from glamorous, so these sorts of projects are rarely demanded and funded. The magnitude of the problem grows with the company: Big companies have more data and bigger issues with it than midsize ones. But companies of all sizes ignore this at their peril: Data quality, which includes accuracy, timeliness, relevance and consistency, has a profound impact on the quality of work done, especially in analytics, where the value of even brilliantly conceived models is degraded when the data that drives the model is inaccurate, inconsistent or not timely. That’s a key finding of our finance analytics benchmark research.
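The quality dimensions named above lend themselves to simple automated checks. The sketch below applies a few invented rules for accuracy (completeness), timeliness and consistency to invented finance records; the record layout, thresholds and account numbers are hypothetical and only illustrate the idea of building quality controls into data handling.

```python
from datetime import date

# Hypothetical finance records: account, amount, currency, as-of date.
records = [
    {"account": "4000", "amount": 125.0, "currency": "USD", "as_of": date(2013, 6, 28)},
    {"account": "4000", "amount": 125.0, "currency": "EUR", "as_of": date(2013, 6, 28)},
    {"account": "",     "amount": 300.0, "currency": "USD", "as_of": date(2013, 6, 28)},
    {"account": "5100", "amount": 80.0,  "currency": "USD", "as_of": date(2012, 1, 15)},
]

def quality_issues(recs, today=date(2013, 6, 30), max_age_days=90):
    """Return (index, issue) pairs for simple accuracy, timeliness
    and consistency rules."""
    issues = []
    for i, r in enumerate(recs):
        if not r["account"]:                         # accuracy/completeness
            issues.append((i, "missing account"))
        if (today - r["as_of"]).days > max_age_days:  # timeliness
            issues.append((i, "stale"))
    # Consistency: one account should not be booked in two currencies.
    currencies = {r["currency"] for r in recs if r["account"] == "4000"}
    if len(currencies) > 1:
        issues.append((None, "inconsistent currency for account 4000"))
    return issues

for idx, issue in quality_issues(records):
    print(idx, issue)
```

Flagging such issues at the point of entry, rather than hunting for them during analysis, is what distinguishes building quality in from fixing inaccurate data after the fact.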

A main requirement for the data used in analytics is that it be accurate because accuracy affects how well finance analytic processes work. One piece of seemingly good news from the research is that a majority of companies have accurate data with which to work in their finance analytics processes. However, only 11 percent said theirs is very accurate, and there’s a big difference between accurate enough and very accurate. The degree of accuracy is important because it correlates with, among other things, the quality of finance analytics processes and the agility with which organizations can respond to and plan for change.

Although almost all (92%) of the companies that have very accurate data also have a process that works well or very well, that assessment drops to 43 percent of companies that said their data is just accurate. Even in small doses, bad data has an outsized impact on finance analytic processes. Inaccuracies, inconsistencies and data that is not comparable can seriously gum up the works as analysts search for the source of the issue and then try to resolve it. As issues grow, dissatisfaction with the process increases. Just 22 percent of those with somewhat accurate data and none of the companies with data that is not accurate said their company has a process that works well or very well.

To be truly useful for business, analytics provided to executives, managers and other decision-makers must be fresh. The faster a company can deliver assessments of and insights into what just happened, the sooner it can respond to those changes. Almost all (85%) companies with very accurate data said they are able to respond immediately or soon enough to changes in business or market conditions, but only 35 percent of those with accurate data and just 24 percent of those with somewhat accurate data are able to do so.

Moreover, having data that is timely enables companies to react in a coordinated fashion as well as quickly. Companies that are able to operate in a coordinated fashion are usually more successful in business than those that are only somewhat coordinated; like a somewhat coordinated juggler, the latter drop a lot of balls. Almost all (86%) companies whose data is all up-to-date said they are able to react to change in a coordinated or very well coordinated fashion, compared to just 38 percent of those whose data is mostly up-to-date and 19 percent that have a significant percentage of stale data. Three-fourths (77%) of companies that have very accurate data are able to respond to changes in a coordinated or very well coordinated fashion, but just one-third (35%) of those with accurate data and 14 percent with somewhat accurate data are able to accomplish this.

Speed is essential in delivering metrics and performance indicators if they are to be useful for strategic decision-making, competitive positioning and assessing performance. Companies that can respond sooner to opportunities and threats are more able to adjust to changing business conditions. The research finds that fewer than half (43%) of companies are able to deliver important metrics and performance indicators within a week of a period’s end – that is, soon enough to respond to an emerging opportunity or threat.

One way to speed up the delivery of analytics is to have analysts focus their time on the analytics. But the research shows that not many do: A majority of analysts spend the biggest chunk of their time dealing with data-related issues rather than on the analysis itself. Two-thirds (68%) of participants reported that they spend the most time dealing with the data used in their analytics – waiting for it, reviewing it for quality and consistency or preparing it for analysis. Only one-fourth (28%) said their efforts focus most on analysis and trying to determine root causes, which are the main reasons for doing the analysis in the first place. In other words, in a majority of companies, analysts don’t spend enough time doing what they are valued and paid for.

The results also show that there are negative knock-on effects of spending time on data-related tasks rather than on analysis. More than half (56%) of the companies that spend the biggest part of their time working on analytics can deliver metrics and indicators within a business week, compared to just one-third (36%) of those that spend the biggest part of the time grappling with data issues. Having high-quality, timely and accessible data therefore is essential to reaping the benefits of finance analytics.

Data issues diminish productivity in every part of a business as people struggle to correct errors or find workarounds. Issues with data are a man-made phenomenon, yet companies seem to treat bad data as a force of nature like a tornado or an earthquake that’s beyond their control to fix. Our benchmark research on information management suggests that inertia in tackling data issues is more organizational than technical. Companies simply do not devote sufficient resources (staff and budget) to address this ongoing issue. One reason may be because the people who must confront the data issues in their day-to-day work fail to understand the connection between these and getting the results from analytics that they should.

Excellent data quality is the result of building quality controls into data management processes. Our research finds a strong correlation between the degree of data quality efforts in finance analytics and the quality of the finance department’s analytic processes and output, and ultimately its timeliness and its value to the company. Corporations generally – and finance organizations in particular – must pay closer attention to the reliability of the data they use in their analytics. The investment in having better data will pay off in better analytics.

Regards,

Robert Kugel – SVP Research


Big Data Analytics Require Best Practices in Using Technology

Organizations should consider multiple aspects of deploying big data analytics. These include the type of analytics to be deployed, how the analytics will be deployed technologically and who must be involved both internally and externally to enable success. Our recent big data analytics benchmark research assesses each of these areas. How an organization views these deployment considerations may depend on the expected benefits of the big data analytics program and the particular business case to be made, which I discussed recently.

According to the research, the most important capability of big data analytics is predictive analytics (64%), but among companies that have deployed big data analytics, descriptive analytic approaches of query and reporting (74%) and data discovery (64%) are more readily available than predictive capabilities (57%). Such statistics may be a function of big data technologies such as Hadoop and its associated distributions having prioritized the ability to run descriptive statistics through standard SQL, which is the most common method for implementing analysis on Hadoop. Cloudera’s Impala, Hortonworks’ Stinger (an extension of Apache Hive), MapR’s Drill, IBM’s Big SQL, Pivotal’s HAWQ and Facebook’s open-source contribution of Presto SQL all focus on accessing data through an SQL paradigm. It is not surprising then that the technology research participants use most for big data analytics is business intelligence (75%) and that the most-used analytic methods — pivot tables (46%), classification (39%) and clustering (37%) — are descriptive and exploratory in nature. Similarly, participants said that visualization of big data allows analysts to perform faster analysis (49%), understand context better (48%), perform root-cause analysis (40%) and display multiple result sets (40%), but visualization does not provide more advanced analytic capabilities. While various vendors now offer approaches to run advanced analytics on big data, the research shows that in terms of big data, organizational capabilities still revolve around more basic analytic access.
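The descriptive methods the research cites, query, reporting and pivot tables, amount to grouped aggregation, which is exactly what SQL-on-Hadoop engines are built to serve. A minimal pure-Python sketch of a pivot-style summary follows; the dataset and field names are invented for illustration only.

```python
from collections import defaultdict

# Hypothetical transaction rows: (region, product, revenue).
rows = [
    ("East", "A", 100), ("East", "B", 50),
    ("West", "A", 70),  ("West", "B", 30),
    ("East", "A", 25),
]

# Pivot: sum revenue with regions as rows and products as columns --
# the descriptive, exploratory summary most organizations run today.
pivot = defaultdict(lambda: defaultdict(int))
for region, product, revenue in rows:
    pivot[region][product] += revenue

for region in sorted(pivot):
    print(region, dict(pivot[region]))
```

In SQL terms this is a `GROUP BY region, product` with a `SUM(revenue)`, which is why engines that expose standard SQL over big data make descriptive analytics the most readily available capability.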

For companies that are implementing advanced analytic capabilities on big data, there are further analytic process considerations, and many have not yet tackled those. Model building and model deployment should be manageable and timely, involve specialized personnel, and integrate into the broader enterprise architecture. While our research provides an in-depth look at adoption of the different types of in-database analytics, deployment of advanced analytic sandboxes, data mining, model management, integration with business processes and overall model deployment, those topics are beyond the scope of this analysis.

Beyond analytic considerations, a host of technological decisions must be made around big data analytics initiatives. One of these is the degree of customization necessary. As technology advances, customization is giving way to more packaged approaches to big data analytics. According to our research, the majority (54%) of companies that have already implemented big data analytics did custom builds using big data-specific languages and interfaces. Most of those that have not yet deployed are likely to purchase a dedicated or packaged application (44%), followed by a custom build (36%). We think that this pre- and post-deployment comparison reflects a maturing market.

The move from custom approaches to standardized ones has important implications for the skill sets needed for a big data analytics initiative. In comparing the skills that organizations said they currently have to the skills they need to be successful with big data analytics, it is clear that companies should spend more time building employees’ statistical, mathematical and visualization skills. On the flip side, organizations should make sure their tools can support skill sets that they already have, such as use of spreadsheets and SQL. This is consistent with other findings about training needs, which include applying analytics to business problems (54%), training on big data analytics tools (53%), analytic concepts and techniques (46%) and visualizing big data (41%). The data shows that as approaches become more standardized and the market focus shifts toward them from customized implementations, skill needs are shifting as well. This is not to say that demand is moving away from the data scientist completely. According to our research, organizations that involve cross-functional teams or data scientists in the deployment process are realizing the most significant impact. It is clear that multiple approaches for personnel, departments and current vendors play a role in deployments and that some approaches will be more effective than others.

Cloud computing is another key consideration with respect to deploying analytics systems as well as sandbox modelling and testing environments. For deployment of big data analytics, 27 percent of companies currently use a cloud-based method, while 58 percent said they do not and 16 percent do not know what is used. Not surprisingly, far fewer IT professionals (19%) than business users (40%) said they use cloud-based deployments for big data analytics. The flexibility and capability that cloud resources provide is particularly attractive for sandbox environments and for organizations that lack big data analytic expertise. However, for big data model building, a plurality of organizations (42%) still utilize a dedicated internal sandbox environment to build models while fewer (19%) use a non-dedicated internal sandbox (that is, a container in a data warehouse used to build models) and others use a cloud-based sandbox either as a completely separate physical environment (9%) or as a hybrid approach (9%). From the gap between IT and business responses we infer that business users are sometimes using cloud-based systems to do big data analytics without the knowledge of IT staff. Among organizations that are not using cloud-based systems for big data analytics, security (45%) is the primary reason that they do not.

Perhaps the most important consideration for big data analytics is choosing vendors to partner with to achieve organizational objectives. When we understand the move from custom technological approaches to more packaged ones and the types of analytics currently being implemented for big data, it is not surprising that a majority of research participants (52%) are looking to their business intelligence systems providers to supply their big data analytics solution. However, a significant number of companies (35%) said they will turn to a specialist analytics provider or their database provider (34%). When evaluating big data analytics, usability is the most important vendor consideration but not by as wide a margin as in categories such as business intelligence. A look at criteria rated important and very important by research participants reveals usability is the highest ranked (94%), but functionality (92%) and reliability (90%) follow closely. Among innovative new technologies, collaboration is important (78%) while mobile access (46%) is much less so. Coupled with the finding that communication and knowledge sharing together constitute an important benefit of big data analytics, it is clear that organizations are cognizant of the collaborative imperative when choosing a big data analytics product.

Deployment of big data analytics starts with forethought and a well-defined business case that includes the expected benefits I discussed in my previous analysis. Once the outcome-driven framework is established, organizations should consider the types of analytics needed, the enabling technologies and the people and processes necessary for implementation. To learn more about our big data analytics research, download a copy of the executive summary here.

Regards,

Tony Cosentino

VP & Research Director


 

Copyright © 2013 Ventana Research All Rights Reserved