Source: MicroStrategy University, Deploying MicroStrategy High Performance BI, V9.3.1, MicroStrategy, Inc., September 2013.
Computational Distance
Any BI system consists of a series of processes and tools that take raw data at its lowest level, the transaction level in a database, and, using various technologies, transform that data into the finished answer the user needs. At every step along the way, some kind of processing is done in one of the following components: the database, the network, the BI application, or the browser.
The concept of “computational distance” refers to the total amount of work, in terms of the systems, transformations, and other processes the data must pass through, from its lowest level all the way to being rendered in a browser, as shown in the image above.
The longer the computational distance for a given report, the longer it takes to execute and render. The preceding image shows a hypothetical example of a report that runs in 40 seconds. Each processing step on that report, such as aggregation, formatting, and rendering, adds to the report’s computational distance and therefore to its overall execution time.
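To make the idea concrete, the sketch below (Python, purely illustrative) models a report’s execution time as the sum of per-step latencies. The step names and timings are hypothetical, chosen only so that they add up to the 40-second example; they are not measured MicroStrategy values.

```python
# Hypothetical breakdown of a 40-second report into per-step latencies (seconds).
# Step names and timings are illustrative only, not measured MicroStrategy values.
REPORT_STEPS = {
    "database SQL execution": 18.0,
    "network transfer":        4.0,
    "analytical processing":  10.0,  # aggregation, metric calculation
    "formatting":              5.0,
    "browser rendering":       3.0,
}

def computational_distance(steps):
    """Total execution time is the sum of every step the data must pass through."""
    return sum(steps.values())

print(f"Total: {computational_distance(REPORT_STEPS):.0f} seconds")  # Total: 40 seconds
```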
Reducing the Computational Distance of a Report
Computational distance offers a useful framework for performance optimization because it tells us that to improve the performance of a report or dashboard, you must reduce its overall computational distance. The following image shows different techniques, such as caching, cubes, and aggregation, that can be used to optimize performance for the hypothetical 40-second report.
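Continuing the illustrative model above, each optimization either removes steps or shortens them. In this hypothetical sketch, a report cache hit skips the database, network, and analytical steps entirely, while pre-aggregated data (for example an Intelligent Cube or an aggregate table) only shortens the database and analytical steps. The specific savings shown are assumptions for illustration, not benchmarks.

```python
# Illustrative only: optimizations shorten or eliminate steps, reducing the total.
def optimize(steps, removed=(), shortened=None):
    """Return a new step map with some steps removed and others reduced."""
    shortened = shortened or {}
    return {
        name: shortened.get(name, seconds)
        for name, seconds in steps.items()
        if name not in removed
    }

# A cache hit skips the database, network, and analytical steps entirely.
cached = optimize(REPORT_STEPS,
                  removed=("database SQL execution", "network transfer",
                           "analytical processing"))
print(f"With cache hit: {computational_distance(cached):.0f} seconds")        # 8 seconds

# Pre-aggregated data shortens the database and analytical steps instead.
aggregated = optimize(REPORT_STEPS,
                      shortened={"database SQL execution": 4.0,
                                 "analytical processing": 3.0})
print(f"With pre-aggregation: {computational_distance(aggregated):.0f} seconds")  # 19 seconds
```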
In the next blog post, we will look at two key computational distance reduction techniques offered in the MicroStrategy platform: caching and Intelligent Cubes.