Question

RRNetha asked:
Multiple copies of an Azure Analysis Services cube

Hi - I need some recommendations on Azure Analysis Services.

We have Power BI dashboards connected to an Azure Analysis Services model. Our cube is refreshed every day; however, processing sometimes fails due to invalid/bad data. Since unique/foreign key and similar constraints are not available in a Synapse pool, we can't enforce data integrity at the database level. Can the cube still be accessed from Power BI if processing fails? What is critical here is the availability of data to the Power BI dashboards, even if that data is not up to date. I believe we can't cache the cube data in the Power BI service, and the connection type will be Live.

Is it possible to keep two copies of the cube: one (the reporting cube) that Power BI connects to, and another that is processed with the latest data from the database pool and then synchronized to the reporting cube after successful processing?

Please suggest some recommendations.

Thanks

azure-analysis-services

Hello @RRNetha ,
We haven't heard from you since the last response and were just checking back to see whether you have a resolution yet. If you do, please share it with the community, as it can be helpful to others. Otherwise, reply with more details and we will try to help.
Thanks
Himanshu

NandanHegde-7720 answered:

Hey @RRNetha ,
There are two aspects:
1) Power BI can also connect to the cube via import mode, in which case the data would reside in the Power BI service.
But if the mode is Live connection and the cube refresh fails, the cube still holds the stale/legacy data.
So the Power BI report would still show data, just the legacy data.

Also, even though Synapse SQL (SQL DW) doesn't support all the constraint features of SQL Server, you can control and validate the data before ingesting it into SQL DW.
There are also tools like Great Expectations that help with data quality testing; a sketch is shown below.
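
For illustration, here is a minimal sketch of such a pre-ingestion check using Great Expectations' older pandas-based API; the file path and column names are hypothetical placeholders, not from this thread:

    import pandas as pd
    import great_expectations as ge

    # Load the staging extract before it is ingested into the Synapse (SQL DW) pool.
    # The file path and column names are placeholders for illustration only.
    staging = pd.read_parquet("staging/sales_fact.parquet")
    df = ge.from_pandas(staging)

    # Declare the integrity rules that the Synapse pool will not enforce for us.
    df.expect_column_values_to_not_be_null("order_id")
    df.expect_column_values_to_be_unique("order_id")
    df.expect_column_values_to_not_be_null("customer_key")

    results = df.validate()
    if not results.success:
        # Abort the load (and the downstream cube refresh) instead of pushing bad data on.
        raise RuntimeError("Data quality checks failed: " + str(results))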

2) Why do you want two copies of the cube? Irrespective of whether the cube processing succeeds or not, the reporting cube will show either the latest data or the legacy data.
So there is no benefit in having two cubes: both would serve the same purpose as a single cube, but with additional maintenance.
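
To make point 1 concrete: a full tabular refresh is committed transactionally, so if processing fails the model rolls back and keeps serving the previously committed data to Live connections. Below is a minimal sketch of triggering and monitoring a refresh through the Azure Analysis Services asynchronous refresh REST API; the region, server, and model names are placeholders, and obtaining the Azure AD token is assumed to happen elsewhere (e.g. via MSAL):

    import time
    import requests

    # Placeholders - not from the original thread.
    SERVER_REGION = "westus"       # rollout region of the AAS server
    SERVER_NAME = "myaasserver"
    MODEL_NAME = "SalesModel"
    ACCESS_TOKEN = "<AAD token for the https://*.asazure.windows.net resource>"

    base_url = (
        f"https://{SERVER_REGION}.asazure.windows.net/"
        f"servers/{SERVER_NAME}/models/{MODEL_NAME}/refreshes"
    )
    headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

    # Kick off a full, transactional refresh: if it fails, the model keeps its old data.
    resp = requests.post(
        base_url,
        headers=headers,
        json={"Type": "Full", "CommitMode": "transactional", "MaxParallelism": 2, "RetryCount": 1},
    )
    resp.raise_for_status()
    refresh_url = resp.headers["Location"]   # URL of the new refresh operation

    # Poll until the refresh finishes; Power BI Live connections keep working either way.
    while True:
        status = requests.get(refresh_url, headers=headers).json()["status"]
        if status not in ("notStarted", "inProgress"):
            break
        time.sleep(30)

    print("Refresh finished with status:", status)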


Thanks @NandanHegde-7720 .

I am convinced by your point #1 above. I was thinking that if cube processing fails, the data can't be accessed, as in a multidimensional Analysis Services model, but based on your response we can still use the cube even if processing fails. I think this will serve my purpose.

Based on your answer #1, we don't need two copies of the cube, as you mentioned in answer #2.

Thanks for the help! Appreciate it.

HimanshuSinha-MSFT answered:

Hello @RRNetha ,
Thanks for the ask and for using the Microsoft Q&A platform.
I am not clear on a few points, but let me share what I know and hopefully that helps. In Power BI, data is refreshed on a schedule, so if the cube refresh succeeds, the latest data becomes viewable only after the next Power BI refresh. In your case, if the cube refresh fails, I think you can stop the next Power BI refresh; that way you will still have the stale data.
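
For an import-mode dataset (a Live connection dataset has no scheduled data refresh of its own), one way to "stop the next refresh" is to trigger the Power BI refresh programmatically only when the upstream cube processing succeeded. A minimal sketch against the Power BI REST API; the workspace and dataset IDs are placeholders:

    import requests

    # Placeholders - not from the original thread.
    GROUP_ID = "<workspace-id>"
    DATASET_ID = "<dataset-id>"
    PBI_TOKEN = "<AAD token for the Power BI REST API>"

    def refresh_dataset_if(cube_refresh_succeeded: bool) -> None:
        """Queue the Power BI dataset refresh only after a successful cube refresh."""
        if not cube_refresh_succeeded:
            # Skip the refresh so the dashboards keep showing the last good (stale) data.
            print("Cube processing failed - skipping Power BI refresh.")
            return
        url = (
            "https://api.powerbi.com/v1.0/myorg/"
            f"groups/{GROUP_ID}/datasets/{DATASET_ID}/refreshes"
        )
        resp = requests.post(url, headers={"Authorization": f"Bearer {PBI_TOKEN}"})
        resp.raise_for_status()
        print("Power BI refresh queued.")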

Just FYI, we also have a Power BI forum, and you can post future questions there as well:
https://community.powerbi.com/


Please do let me know how it goes.
Thanks
Himanshu
