When reviewing the design or performance of a MOLAP Analysis Services database, we need to determine where the space is going. Common problems are:
- Too many, or too large, aggregations
- Too many bitmap indexes
- Large attributes carrying string or binary data
- Fact tables that are growing too fast
The only real way to do this is to look at the sizes of the various files on disk, but that gets pretty time consuming: even a small Analysis Services project can have tens of thousands of files, with many different file types to analyse.
I have written an Excel macro which reads all the files in a folder and presents a PivotTable to help analyse the space.
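The core of the idea is simple: walk the data folder, total up file sizes, and group them by extension and by containing subfolder. The macro does this in VBA, but here is a minimal sketch of the same approach in Python (the `summarize_folder` function and its grouping scheme are my own illustration, not the macro's actual code):

```python
import os
from collections import defaultdict

def summarize_folder(root):
    """Walk a folder tree and total file sizes by extension
    and by containing subfolder (relative to the root)."""
    by_ext = defaultdict(int)
    by_dir = defaultdict(int)
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                size = os.path.getsize(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            # Keep the full multi-part extension (e.g. "agg.rigid.data"),
            # since Analysis Services file names often have several dots.
            parts = name.split(".")
            ext = ".".join(parts[1:]) if len(parts) > 1 else "(none)"
            by_ext[ext] += size
            by_dir[os.path.relpath(dirpath, root)] += size
    return by_ext, by_dir
```

Feeding the two dictionaries into a PivotTable (or a simple sorted print) immediately shows which objects and which file types dominate the space.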
We can see from the output below that it is actually the Customer and Product dimensions in Adventure Works which are hogging all the space.
If we expand the Product dimension we can see that the Large Photo attribute has a "bstore" file for storing a binary image. Should we store this in a cube? Probably not.
The table below shows the space breakdown by file type. We can see that the Adventure Works cube's disk space is mostly consumed by fact data rather than by bitmap indexes and aggregations, which can often run away on customer sites.
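To roll file types up into categories like those in the table, each file name needs to be mapped to a bucket. Here is a rough sketch of such a categoriser; the extension patterns are assumptions based on commonly seen MOLAP storage file names (aggregation files contain `.agg.`, bitmap index files end in `.map`, dimension attribute stores end in `store`), not an official or exhaustive list:

```python
def categorize(filename):
    """Map an Analysis Services data file name to a coarse
    space category. Patterns are illustrative assumptions."""
    name = filename.lower()
    # Check aggregations first: their names also end in ".data",
    # so this test must come before the fact-data test.
    if ".agg." in name:                         # e.g. Sales.agg.rigid.data
        return "Aggregations"
    if name.endswith((".map", ".map.hdr")):     # bitmap index files
        return "Bitmap indexes"
    if name.endswith((".data", ".data.hdr")):   # fact/partition data
        return "Fact data"
    if name.endswith(("store", "store.hdr")):   # kstore, bstore, sstore, ...
        return "Dimension stores"
    return "Other"
```

Combined with the folder-walking sketch above, summing sizes per category reproduces the kind of breakdown shown in the table.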
The Excel macro spreadsheet is available for download here.
Update: a few people told me the file link didn't appear in Chrome, so here is a direct link.