Cross-region Workspace migration using PowerShell Scripts

Let’s continue the topic of Workspace migration to a Capacity in a different Azure Region. A lot has already been covered around this topic: the main problem statement can be found in this article, and the last article – PowerShell 101 – Introduction to task automation – explains basic PowerShell concepts that will (hopefully) help you better understand the script. The full version of the script can be downloaded from my GitHub.

Quick recap

The goal of this exercise is to move Power BI Workspaces to a Capacity in a different Region. As we know from the first article on this topic, this might be problematic if we have Large Semantic Models stored in our Workspaces. This storage format doesn’t allow Semantic Models to be moved to a different region, leading to broken Power BI Reports. Semantic Models can be moved to a new Region if their storage format is changed back to Small Semantic Model, assuming of course that the model’s size allows this operation (the model must be smaller than 10 GB).
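The storage-format switch itself is a single cmdlet call from the MicrosoftPowerBIMgmt module; a minimal sketch (the dataset ID below is a placeholder):

```powershell
# Assumes an authenticated session: Connect-PowerBIServiceAccount
# Switch a Semantic Model (dataset) back to the small storage format ("Abf").
# This only succeeds when the model is smaller than 10 GB.
Set-PowerBIDataset -Id "00000000-0000-0000-0000-000000000000" -TargetStorageMode Abf
```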
 
With the help of PowerShell, I created a script that takes care of the Region-to-Region migration while handling Large Semantic Models.
 
The script isn’t fully automated, for two reasons. First, I didn’t cover Service Principals here, which are required for full automation. Of course there are other materials online, but I usually try to cover end-to-end scenarios. The second and more important reason is that there can always be some hiccups down the road:
  • a Workspace can’t be migrated because it contains Fabric items
  • a model can’t be converted because it’s too big
  • and so on…
In these cases, it’s good to be around to resolve all the failed items manually. After all, we don’t aim for full automation here, but to reduce the time needed for the migration where we can.
 

Prerequisites

You need the following to be able to run the script:
  • Tenant Admin role activated on your account
  • An environment to run the code: PowerShell ISE or Visual Studio Code
  • Completed configuration in the Configuration section of the code:
    • scriptMode: 0 to fetch all workspaces from the specified Capacity, 1 to provide a list of workspaces manually
    • upn: your organizational user email; it will be used to grant Workspace access in case it is missing
    • sourceCapacityId / sourceWorkspaces: depending on the Script Mode selected, provide either the Capacity ID or the IDs of the Workspaces you would like to migrate
    • targetCapacityId: the Capacity where the workspaces will be migrated
    • outputFilesPath: if provided, the script will save logs to the given location
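For reference, the Configuration section boils down to a handful of variables. This is a sketch with placeholder values; the exact names and layout may differ slightly in the published script:

```powershell
# --- Configuration (sketch; placeholder values) ---
$scriptMode       = 0                                       # 0 = fetch all workspaces from the source Capacity, 1 = manual list
$upn              = "admin@contoso.com"                     # used to grant Workspace access when it is missing
$sourceCapacityId = "11111111-1111-1111-1111-111111111111"  # used when $scriptMode -eq 0
$sourceWorkspaces = @(                                      # used when $scriptMode -eq 1
    "22222222-2222-2222-2222-222222222222",
    "33333333-3333-3333-3333-333333333333"
)
$targetCapacityId = "44444444-4444-4444-4444-444444444444"
$outputFilesPath  = "C:\Temp\MigrationLogs"                 # leave empty to skip log files
```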

How the script works

From a scripting point of view, we have a few problems to solve:
  1. A lot of the cmdlets used require -Scope Organization and Admin access granted to the user who runs the script.
  2. Input parameters may be incorrect:
    1. If the Source / Target Capacities are incorrect, the script will stop.
    2. If one or more Workspaces from the manual input are incorrect, the script will simply grab the next one on the list.
  3. The cmdlet used to convert Semantic Models doesn’t work at Tenant Admin level, which means the user must have direct access to the Workspace. The script handles this by running a simple cmdlet. Note: the cmdlet will work even if the user has only a Viewer role in the Workspace, which is not sufficient to convert Semantic Models. This is a big simplification of the code, but for someone who has Tenant Admin access in the organization, it’s fair to assume this is a rather unlikely scenario.
  4. One or more Semantic Models may fail to convert to Small Storage Format. In this case we don’t want to move the Workspace to the new Capacity and risk breaking the reports. The script will then convert all Semantic Models that were converted during the process back to Large Storage Format, to keep the Workspace in its original state. The logs will tell us which Semantic Models caused an issue, so we can fix them manually.
  5. A Workspace can’t be moved to a new Capacity too soon after the Semantic Model conversion. When the cmdlet is used to change the storage format from Large to Small, the change appears to be immediate, but it isn’t. When the conversion is done manually in the Power BI Service, we see a “spinning wheel” telling us that the conversion is not complete yet. If the Workspace is moved while the wheel is still spinning, we will damage the models, leaving them in an “unresolved” state. To make sure this doesn’t happen, the script runs another cmdlet that checks the Workspace migration status every 30 seconds and proceeds only when the migration is complete.
The script contains a lot of comments to make it easier to understand how it works. Still, I will try to explain the algorithm at a high level, starting with the main part of the program:
 
Figure 1. Region-to-Region migration of Power BI Workspaces: Main Program.
  1. The script starts with the Configuration part mentioned in the Prerequisites section.
  2. Then we are asked to authenticate to the Power BI Service by logging into a Microsoft Account. This process acquires an access token that is later used to authorize the connection.
  3. The script checks whether the provided input parameters for the Target Capacity (and Source Capacity, if selected) are correct. It calls a PowerShell cmdlet to look up the specific Capacity ID. If the Capacity is available, the script grabs its name to use later; if the input parameters are incorrect, the script ends.
  4. The script checks the scriptMode. Depending on the choice, it will either get all Workspaces from the specified Source Capacity or use the manually provided Workspace list.
  5. Here the script enters a loop that is executed for every single Workspace in the list. It is described in the next section: Processing Workspaces.
  6. If an Output Path was provided in the Configuration, the script saves the logs (converted and failed Semantic Models) as CSV files.
  7. The script ends here -> the connection to the Power BI Service is closed.
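The main flow above can be sketched with the public MicrosoftPowerBIMgmt cmdlets. This is a simplified skeleton under my own assumptions, not the author’s full script:

```powershell
# 1-2. Authenticate to the Power BI Service (interactive login acquires the token)
Connect-PowerBIServiceAccount | Out-Null

# 3. Validate the Target Capacity; -Scope Organization requires Tenant Admin
$target = Get-PowerBICapacity -Scope Organization |
          Where-Object { $_.Id -eq $targetCapacityId }
if (-not $target) { throw "Target Capacity '$targetCapacityId' not found." }

# 4. Resolve the Workspace list depending on the script mode
$workspaces = if ($scriptMode -eq 0) {
    Get-PowerBIWorkspace -Scope Organization -All |
        Where-Object { $_.CapacityId -eq $sourceCapacityId } |
        Select-Object -ExpandProperty Id
} else { $sourceWorkspaces }

# 5. Process every Workspace (see the next section)
foreach ($workspaceId in $workspaces) { <# Processing Workspaces #> }

# 6. Optionally persist the logs ($convertedModels is an illustrative collection)
if ($outputFilesPath) {
    $convertedModels | Export-Csv -Path (Join-Path $outputFilesPath "converted.csv") -NoTypeInformation
}

# 7. Close the connection
Disconnect-PowerBIServiceAccount
```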

Processing Workspaces

Let’s start with high-level diagram:
Figure 2. Region-to-Region migration of Power BI Workspaces: Processing Workspaces.
  1. The script runs until the list of Workspaces is exhausted.
  2. A Workspace ID is fetched from the list.
  3. The script calls a PowerShell cmdlet to check whether the provided Workspace ID is correct. If it’s incorrect, it proceeds with the next item on the list.
  4. The script searches for Large Semantic Models (LSMs) in the given Workspace.
  5. If there are no LSMs in the Workspace, the script moves the Workspace to the new Capacity.
  6. When LSMs are found, the script starts another procedure to process the Large Semantic Models, described in detail in the next section.
  7. The script checks the Conversion Error flag. If it’s set to 1, at least one Semantic Model couldn’t be converted from Large to Small Semantic Model Storage Format. In this case, we want to keep the Workspace in the same state as before, so the script jumps to the step where all Semantic Models are converted back to LSMs.
  8. If there were no errors, the script moves the Workspace to the new Capacity. When this is done, it converts all Semantic Models back to Large Storage Format.
  9. The script checks another flag, indicating whether it was necessary to grant Workspace-level access to the user who runs the script. If access was granted, the script will now revoke it.
  10. The script continues with step 1.
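A condensed sketch of the per-Workspace loop, again using the public MicrosoftPowerBIMgmt cmdlets (variable names are illustrative, and the LSM handling itself is elided here):

```powershell
foreach ($workspaceId in $workspaces) {
    # 3. Validate the Workspace ID; skip to the next item when it is unknown
    $workspace = Get-PowerBIWorkspace -Scope Organization -Id $workspaceId
    if (-not $workspace) { continue }

    # 4. Look for Large Semantic Models (storage mode "PremiumFiles")
    $largeModels = Get-PowerBIDataset -WorkspaceId $workspaceId |
                   Where-Object { $_.TargetStorageMode -eq "PremiumFiles" }

    # 6-7. Convert the LSMs; $conversionErrors is set by that procedure (next section)
    $conversionErrors = 0
    # ... process Large Semantic Models here ...

    # 5/8. Move the Workspace only when every conversion succeeded
    if ($conversionErrors -eq 0) {
        Set-PowerBIWorkspace -Scope Organization -Id $workspaceId -CapacityId $targetCapacityId
    }

    # 8. Restore the original Large Storage Format on the converted models
    foreach ($model in $largeModels) {
        Set-PowerBIDataset -Id $model.Id -TargetStorageMode PremiumFiles
    }

    # 9. Revoke the temporary access if the script had to grant it
    if ($accessGranted -eq 1) {
        Remove-PowerBIWorkspaceUser -Id $workspaceId -UserPrincipalName $upn
    }
}
```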

Processing Large Semantic Models

This is the last subprocess we are going to discuss:
 
Figure 3. Region-to-Region migration of Power BI Workspaces: Processing Large Semantic Models.
  1. If no LSMs are found in a Workspace, there is no need to check for Workspace-level access; that’s why this step is included inside the LSM-handling procedure. If Workspace access is missing, it is granted and the accessGranted flag is set to 1, which is later used to revoke the access from the user.
  2. The script now runs until the list of LSMs is exhausted.
  3. The script grabs a Semantic Model ID from the list.
  4. It attempts to convert the Semantic Model to Small Storage Format.
  5. When the conversion fails, the conversionErrors flag is set to 1. It will later be used to decide whether the Workspace can be migrated to the new Capacity.
  6. The script continues with the next Semantic Model ID.
  7. When there are no more models to process, the script runs a cmdlet to check the conversion status. It continues only when the conversion status is resolved for all Semantic Models, and then goes back to the Processing Workspaces procedure.
The script follows this process for all Semantic Models and Workspaces in scope.
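The LSM procedure could be sketched as follows. Note that Get-PowerBIWorkspaceMigrationStatus is my assumption for the status-check cmdlet the article mentions, and the property names of its output may differ from what is shown here:

```powershell
# 1. Grant Workspace access when it is missing, remembering to revoke it later.
#    Without -Scope Organization, the lookup only succeeds if we are a member.
$accessGranted = 0
if (-not (Get-PowerBIWorkspace -Id $workspaceId)) {
    Add-PowerBIWorkspaceUser -Scope Organization -Id $workspaceId `
        -UserEmailAddress $upn -AccessRight Admin
    $accessGranted = 1
}

# 2-6. Try to convert every LSM to the small storage format ("Abf")
$conversionErrors = 0
foreach ($model in $largeModels) {
    try {
        Set-PowerBIDataset -Id $model.Id -TargetStorageMode Abf -ErrorAction Stop
    } catch {
        $conversionErrors = 1   # the Workspace must stay on its original Capacity
    }
}

# 7. Wait until the conversion has really finished; moving the Workspace too
#    early would leave the models in an "unresolved" state.
#    (The "Status"/"Completed" values below are assumptions.)
while ((Get-PowerBIWorkspaceMigrationStatus -Id $workspaceId).Status -ne "Completed") {
    Start-Sleep -Seconds 30
}
```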

 

Interaction with script

The script contains a lot of steps where feedback is sent to the end user in the terminal. This allows you to track the script’s progress and helps identify issues, if any. There are four types of messages:
  • Information – blue color
  • Warning – yellow color
  • Success – green color
  • Error – red color
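Color-coded feedback like this is easy to produce with Write-Host; a minimal sketch of the four message types (the helper function names are my own):

```powershell
# Four message types, matching the color scheme described above
function Write-Info    ($msg) { Write-Host $msg -ForegroundColor Blue }
function Write-Warn    ($msg) { Write-Host $msg -ForegroundColor Yellow }
function Write-Success ($msg) { Write-Host $msg -ForegroundColor Green }
function Write-Fail    ($msg) { Write-Host $msg -ForegroundColor Red }

Write-Info    "Processing workspace..."
Write-Success "Workspace migrated successfully."
```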
I will demonstrate how they work in the test scenario covered in the next section.

 

Test scenario

I created three workspaces for test purposes, expecting a different outcome for each of them:
  1. TEST Migration Failure: The most complex scenario. First of all, there is a Large Semantic Model in the Workspace that is bigger than 10 GB. Here I expect that the conversion will not be successful and the Workspace won’t be migrated to the new Capacity. Also, I don’t have access to this Workspace, therefore the script will have to grant this access to me and revoke it later.
  2. TEST Migration Success – Large Models: A simpler scenario. The Workspace contains Semantic Models converted to Large Storage Format; however, their size should allow the conversion to Small Storage Format, and in the end the Workspace should be migrated to the new Capacity.
  3. TEST Migration Success – Small Models: The simplest scenario. There are no Large Models, so I expect no conversion and a simple move of the Workspace to the new Capacity.
Let’s run the script and see how it goes:
Figure 4. Run “Test Migration Failure” Workspace.
  1. At the very top we see Moving workspaces to ‘…..’. This is the result of the Target Capacity validation.
  2. Below that, we clearly see which Workspace is being processed at the time.
  3. The script identified 4 Semantic Models with Large Storage Format.
  4. As expected, the script notices that I don’t have access to the Workspace and grants it to me, which is confirmed with a success message.
  5. Failed to update ‘large semantic model…’ informs us that one of the models couldn’t be converted.
  6. As the information below confirms, only 3/4 Semantic Models were converted. This means we have to wait for all the conversions to finish before we can roll back the change.
  7. While waiting for the conversion to finish, the script checks the conversion status at the Workspace level every 30 seconds. The Query Count: 1 part tells you how many attempts have been made already, just to give you a sense of control and confirm that the script is indeed still working.
  8. In the end we get a clear message that at least one model couldn’t be converted and we should check the results.
  9. Since only 3 Semantic Models were converted, the same number is converted back to LSMs.
  10. We see confirmation that the Semantic Models were successfully converted to the PremiumFiles format – Large Storage Format.
  11. Admin access is removed from the Workspace.
Figure 5. Run “TEST Migration Success – Large Models”.
  1. Now we process the Workspace where all conversions should succeed and the Workspace should be moved to the new Capacity.
  2. At the beginning you can see many of the same messages. The first difference is the Converted datasets 3/3 line, followed by a green message confirming that indeed all items in the Workspace have been converted. The Workspace is then moved to the new Capacity, as the next green message explains.
  3. The rest remains the same: the Semantic Models are converted back and the script ends.
Figure 6. Run “TEST Migration Success – Small Models”.
The last test is the shortest one, as you can see. No Large Semantic Models were detected, and the Workspace is simply moved to the new Capacity.

 

Conclusion

The script seems to work just fine: all three test scenarios passed as expected. If you find yourself in a situation where you need to move to a new / cheaper Capacity, I hope my script will help. Beyond its usefulness, it was pure fun creating this script, so I hope it can also serve educational purposes.
 
As a next step, I will try to create similar script but using Fabric Notebooks. Stay tuned 🙂
 
For now, as always, thank you for staying till the end, and see you in next article 🙂

Pawel Wrona

Lead author and founder of the blog | Works as a Power BI Architect in global company | Passionate about Power BI and Microsoft Tech
