SQLServer Quiz 2011 - Troubleshooting Cube Performance

  • Working with huge volumes of data is very common in Data Warehousing. It is necessary to build cubes on the data to make it meaningful and consumable, and there are cases when retrieving data from a cube takes a lot of time.

    Let us assume that your cube has been returning data very quickly, but suddenly one day it starts returning data very slowly. What three things would you do to diagnose this? After diagnosing the problem, what would you do to resolve the performance issue?

    Posted on 06-05-2011 00:00 |
    Pinal Dave
    150 · 1% · 326

11  Answers  

  • Score
    5

    There might not be enough RAM available to retrieve the data: by the time the ETL completes, it may have used up most of the memory on the server.

    Replied on Jun 6 2011 3:42AM  . 
    wbiggy
    2366 · 0% · 5
  • Score
    5

    Utilize the SSAS Profiler and the Activity Viewer from CodePlex. I would also turn on perfmon with various SSAS-related counters, including those for memory utilization, deadlocks, subcubes, locks, and aggregations, and of course the non-SSAS counters as well. Based on the results of these three tools, we could potentially see whether a single poorly written MDX query is causing the issue (like some of those coming out of Excel when groupings are used), or whether the server is running out of memory (although you would likely get an error in that instance). Many SSAS instances also run other tools such as SSRS, so it could be a query in SSRS that is bogging down the server. Finally, the profiler and Activity Viewer could quickly show whether you simply have a large amount of activity for the current period, such as a month-end situation.
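    The "single poorly written MDX query" check above can be sketched as a simple aggregation over exported trace rows. This is a minimal illustration, assuming a Profiler trace has been saved to a table or file and loaded as a list of dicts; the field names `TextData` and `Duration` follow the usual trace columns, but the data below is hypothetical.

    ```python
    from collections import defaultdict

    def top_offenders(trace_rows, n=3):
        """Sum total duration per query text and return the n slowest.

        trace_rows: dicts with keys 'TextData' (the MDX statement)
        and 'Duration' (milliseconds), roughly what a Profiler trace
        exported to a table would contain.
        """
        totals = defaultdict(int)
        for row in trace_rows:
            totals[row["TextData"]] += row["Duration"]
        return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:n]

    # Hypothetical trace: one repeated query dominating total duration
    rows = [
        {"TextData": "SELECT ... FROM [Sales]", "Duration": 120},
        {"TextData": "DRILLTHROUGH ...",        "Duration": 45},
        {"TextData": "SELECT ... FROM [Sales]", "Duration": 4000},
    ]
    print(top_offenders(rows, n=1))  # the [Sales] query totals 4120 ms
    ```

    If one query accounts for most of the elapsed time, that is the query to tune; if the load is spread evenly, overall activity or memory pressure is the likelier cause.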

    Replied on Jun 6 2011 6:42AM  . 
    rsmurray
    1249 · 0% · 20
  • Score
    5

    Presuming the cubes are MOLAP, I would check environmental issues with the server/network, then look into aggregation and partition design. The data volume/complexity may have grown to the point where the aggregations and partition design are no longer optimal.

    Replied on Jun 8 2011 9:50AM  . 
    Iain
    569 · 0% · 65
  • Score
    5

    First check the load on the server and see which processes are using the most resources; if they are not SQL Server processes, kill them.

    Otherwise, use Profiler to diagnose the cause.

    Replied on Jun 9 2011 5:40AM  . 
    Alpesh Patel
    36 · 5% · 1663
  • Score
    10

    If the symptom is a sudden decrease of speed, I would use the following tools:

    • SSAS Profiler: profile the whole query and analyze which part of the execution is time-consuming. Two main issues could cause the performance problem: the Formula Engine could be using too much time because complex calculations are necessary, or the subcube queries could no longer be hitting the cache (or not as often).
    • Performance Monitor: there is a very good page here that explains which performance counters provide useful hints. The key counter here would be "Memory Usage KB", which should be compared against "Memory Limit High" of the SSAS:Memory section.

    In general, I would suspect that the amount of data used to calculate the query result has reached the memory limit of the server. It is well known that query performance, especially if string comparisons are used in the query, degrades badly when not all data can be kept in memory buffers: SSAS then switches to disk mode and reads the data again and again.

    If SSAS Profiler shows that the query is the only one executing, I would suggest adding more RAM to the machine. If that is not possible, partitioning the fact data could solve the problem, provided the query can be formulated so that it only accesses some of the partitions (so the memory consumption of the cache is lower). If SSAS Profiler shows that other queries are processed in parallel, you should try to serialize the queries.

    If partitioning is already done, you should examine the "Process" entries and see whether too many partitions are accessed by the query. In that case, rewriting the query or (if it is the main query on the cube) redefining the partitioning algorithm would be an option.
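    The counter comparison described above ("Memory Usage KB" against "Memory Limit High") can be sketched as a simple ratio check. The counter names come from the answer; the sampled values and the 0.9 alert threshold below are illustrative assumptions.

    ```python
    def memory_pressure(usage_kb, limit_high_kb):
        """Return how close SSAS memory use is to its high limit (0.0 to 1.0+).

        Mirrors comparing the 'Memory Usage KB' perfmon counter
        against 'Memory Limit High' from the SSAS memory counters.
        """
        return usage_kb / limit_high_kb

    # Hypothetical perfmon samples
    ratio = memory_pressure(usage_kb=7_800_000, limit_high_kb=8_000_000)
    if ratio > 0.9:  # assumed alert threshold, not an SSAS default
        print("near Memory Limit High: expect cache eviction and disk reads")
    ```

    A ratio persistently near 1.0 would support the diagnosis that the working set no longer fits in memory, pointing to more RAM or partitioning as the fix.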

    If all that does not help - put on your head light and dive into the system :-)

    Replied on Jun 9 2011 2:22PM  . 
    Guenter
    28 · 6% · 1887
  • Score
    5

    We can do the following things:
    1. Use Performance Advisor for Analysis Services: http://www.sqlsentry.net/performance-advisor/sql-server-analysis-services.asp
    2. Have a look at the designed cache structure for SSAS.
    3. Look into the partitioning and analyze it with respect to performance.

    Replied on Jun 15 2011 12:40PM  . 
    Vishal Pawar
    24 · 7% · 2229
  • Score
    5

    First: restart the machine; if that is not possible, restart the SQL Server services. Second: check for memory pressure and deadlocks. Third: use Profiler to identify the resource bottleneck.

    Replied on Jun 16 2011 11:12PM  . 
    Joggee
    2366 · 0% · 5
  • Score
    0

    Very interesting suggestions so far but this subject needs more explanation.

    Replied on Jun 21 2011 1:24AM  . 
    Pinal Dave
    150 · 1% · 326
  • Score
    5
    • Revisit the aggregates and tune them if required.
    • Partition-slice the cube.
    • Check the buffer size.
    Replied on Jun 24 2011 5:43AM  . 
    XploringSQL
    2366 · 0% · 5
  • Score
    0

    Thank you all for participating :)

    Replied on Jul 6 2011 12:04AM  . 
    Pinal Dave
    150 · 1% · 326

Please note that your answer will not be considered as part of the contest because the competition is over. You can still post an answer to share your knowledge with the community.

Copyright © Rivera Informatic Private Ltd.