In-Memory Analysis: Delivering Insights at the Speed of Thought
Although BI has become a mainstream market, innovation has not stopped. In fact, by all accounts, the technology driving BI is changing more rapidly now than at any other time in its 20-year history. This avalanche of innovation is divided between front- and back-end technologies.
On the front end, tools are incorporating more visual, interactive interfaces that make it easier for users to interact with data and create their own views of information. On the back end, advances in memory, CPU and disk technology have enabled BI vendors to exploit in-memory databases, intelligent caches, and specialized analytical databases and platforms that offer dramatically improved price-performance over previous generations of database technology. Never before have BI customers had so many options to store, access, analyze and consume information for decision making.
This innovation raises the question: What are the capabilities of next-generation BI tools? Certainly, the business mantra of “faster, better, cheaper” is becoming a reality. But we can also add visual, interactive, analytical, scalable, manageable, collaborative and mobile. Collectively, many of these capabilities get lumped together under the heading “self-service BI,” which has been the holy grail of BI for nearly two decades. The more end users can interact with the data to create their own views, the more satisfied and productive they’ll be with BI tools and the more corporate BI teams can focus on value-added activities, instead of creating an endless stream of custom reports and dashboards.
More specifically, next-generation BI tools seamlessly blend the capabilities of top-down, metrics-driven reporting with bottom-up, ad hoc analyses, making it easy for users to meet their own information needs once an IT person, power user or superuser has done the initial setup. Top-down tools expose semantic layers and widget libraries built by IT professionals and allow superusers to build ad hoc reports and dashboards (i.e., “mashboards”). Conversely, bottom-up tools, such as popular visual analysis tools, let power users and superusers explore data culled from a variety of back-end systems and build fast, highly interactive dashboards for their departmental colleagues.
Finally, you can’t discuss next-generation BI tools without examining their back-end data architectures. To deliver the highest level of query performance possible, many new BI tools store data locally in an in-memory database or intelligent mid-tier cache. Others query back-end databases directly, relying on ROLAP (relational online analytical processing) SQL generation capabilities or the power of analytical platforms to handle complex queries and deliver extremely fast performance. And some tools give users the flexibility of caching data locally or querying back-end databases, depending on business requirements and systems availability.
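To make the architectural trade-off concrete, here is a minimal sketch of the local-caching pattern described above, using Python's built-in sqlite3 module. The table name, schema and queries are hypothetical stand-ins; a real BI tool would pull from a remote warehouse rather than the in-memory SQLite database used here to simulate the back end.

```python
import sqlite3

def build_backend():
    # Stand-in for a remote data warehouse (simulated here with SQLite).
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    db.executemany("INSERT INTO sales VALUES (?, ?)",
                   [("East", 100.0), ("West", 250.0), ("East", 50.0)])
    db.commit()
    return db

def cache_locally(backend, query):
    # Pull the result set once from the back end into a local
    # in-memory cache, so later ad hoc queries avoid the round trip.
    rows = backend.execute(query).fetchall()
    cache = sqlite3.connect(":memory:")
    cache.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    cache.executemany("INSERT INTO sales VALUES (?, ?)", rows)
    cache.commit()
    return cache

backend = build_backend()
cache = cache_locally(backend, "SELECT region, amount FROM sales")

# Subsequent interactive queries hit the local in-memory cache,
# not the back-end system.
totals = dict(cache.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"))
print(totals)
```

The alternative architecture in the paragraph above would skip `cache_locally` entirely and run every query against the back end, trading interactive speed for freshness; tools that offer both modes let the user choose per data source.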