diff --git a/README.md b/README.md index f805ac8..9837fa6 100644 --- a/README.md +++ b/README.md @@ -61,7 +61,6 @@ The main advantage of the module is the ability to publish all the Azure Data Fa - [How it works](#how-it-works) - [Step: Create ADF (if not exist)](#step-create-adf-if-not-exist) - [Step: Load files](#step-load-files) - - [Step: Pre-deployment](#step-pre-deployment) - [Step: Replacing all properties environment-related](#step-replacing-all-properties-environment-related) - [Column TYPE](#column-type) - [Column NAME](#column-name) @@ -215,6 +214,7 @@ $opt = New-AdfPublishOption * [Boolean] **DoNotStopStartExcludedTriggers** - specifies whether excluded triggers will be stopped before deployment (default: *false*) * [Boolean] **DoNotDeleteExcludedObjects** - specifies whether excluded objects can be removed. Applies when `DeleteNotInSource` is set to *True* only. (default: *true*) * [Boolean] **IncrementalDeployment** - specifies whether Incremental Deployment mode is enabled (default: *false*) +* [String] **IncrementalDeploymentStorageUri** - indicates the Azure Storage location where the latest deployment state file is stored (no default) * [Enum] **TriggerStopMethod** - determines which triggers should be stopped. Available values: `AllEnabled` (default) | `DeployableOnly` Find more about the above option in section [Step: Stoping triggers](#step-stoping-triggers) @@ -398,7 +398,8 @@ graph LR; You must have appropriate permission to create a new instance. The *Location* parameter is required for this action. -If ADF does exist and `IncrementalDeployment` is ON, the process gets Global Parameters to load latest **Deployment State** from ADF. +If ADF does exist and `IncrementalDeployment` is ON, the process loads the latest **Deployment State** from Storage. +Note: The above flag will be disabled when the related parameter (`IncrementalDeploymentStorageUri`) is empty.
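The guard described in the note above (incremental mode is switched off when no storage URI is provided) can be sketched as follows. This is an illustrative Python sketch of the documented behaviour, not the module's actual PowerShell implementation; the function name is hypothetical:

```python
def resolve_incremental_mode(incremental_deployment: bool, storage_uri: str) -> bool:
    """Illustrative sketch: incremental deployment requires a storage URI.

    Mirrors the documented behaviour: when IncrementalDeployment is requested
    but IncrementalDeploymentStorageUri is empty, the mode is disabled and a
    warning (message ADFT0033) is emitted.
    """
    if incremental_deployment and not storage_uri:
        print("WARNING: ADFT0033: Incremental Deployment Option DISABLED "
              "as Storage Uri is not provided.")
        return False
    return incremental_deployment
```

Because the disabled flag is resolved before any deployment step runs, the rest of the pipeline never needs to re-check the URI.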
## Step: Load files @@ -407,15 +408,6 @@ If ADF does exist and `IncrementalDeployment` is ON, the process gets Global Par This step reads all local (json) files from a given directory (`rootfolder`). -## Step: Pre-deployment - -💬 In log you'll see line: `STEP: Pre-deployment` - -It prepares new (empty) file in `factory` folder if such file doesn't exist. -The file is needed for further steps to keep Deployment State in Global Parameter. - -> This step is enable only when `IncrementalDeployment` is ON and `DeployGlobalParams` is ON. - ## Step: Replacing all properties environment-related 💬 In log you'll see line: `STEP: Replacing all properties environment-related...` @@ -640,11 +632,12 @@ The mechanism is smart enough to publish all objects in the right order, thence 💬 In log you'll see line: `STEP: Updating (incremental) deployment state...` -After the deployment, in this step the tool prepares the list of deployed objects and their hashes (MD5 algorithm). The array is wrap up in json format and stored as new global parameter `adftools_deployment_state` in factory file. +After the deployment, in this step the tool prepares the list of deployed objects and their hashes (MD5 algorithm). +The array is wrapped up in JSON format and stored as a blob file `{ADF-Name}.adftools_deployment_state.json` in the provided Storage. **Deployment State** speeds up future deployments by identifying objects that have been changed since last time. -> The step might be skipped when `IncrementalDeployment = false` OR `DeployGlobalParams = false` in *Publish Options*. -> You'll see warning in the console (log) when only `IncrementalDeployment = true`. +> The step might be skipped when `IncrementalDeployment = false` in *Publish Options*. +> You'll see a warning in the console (log) when `IncrementalDeployment = true` and `IncrementalDeploymentStorageUri` is empty. ## Step: Deleting objects not in source @@ -670,26 +663,29 @@ Since v.1.6 you have more control of which triggers should be started.
Use `Trig ## Incremental Deployment -> This is new feature (ver.1.4) in public preview. +> This is a new feature (ver. 1.4) in public preview. Since ver. 1.10 the process doesn't use an ADF Global Parameter to keep Deployment State data. You must provide a Storage URL instead. Usually the deployment process takes some time as it must go through all objects (files) and send them via REST API to be deployed. The more objects in ADF, the longer the process takes. In order to speed up the deployment process, you may want to use the new switch `IncrementalDeployment` (new in *Publish Options*) to enable a smart process that identifies and deploys only objects that have been changed since the last deployment. ### How it works? -It uses **Deployment State** kept in one of Global Parameters and is save/read to/from ADF service. +It uses a **Deployment State** kept as a JSON file which is written to and read from Azure Blob Storage. When the mode is ON, the process does a few additional steps across the entire deployment process: -1. Reads Global Parameters from ADF (when not newly created) to get previous **Deployment State** +1. Reads Deployment State (json file) from Storage to get the previous **Deployment State** 2. Identifies which objects are unchanged and excludes them from deployment 3. Calculates MD5 hashes of deployed objects and merges them into the previous **Deployment State** -4. Saves **Deployment State** as `adftools_deployment_state` global parameter +4. Saves **Deployment State** as `{ADFName}.adftools_deployment_state.json` file in Storage + +> Note: In order to use this feature, the following option parameters must be set: > - `IncrementalDeployment` = `True` > - `IncrementalDeploymentStorageUri` = `https://sqlplayer2020.file.core.windows.net/adftools` (example) ### Remember * Incremental Deployment assumes that no one changes ADF objects manually in the cloud -* You must deploy Global Parameters in order to save Deployment State * Objects' hashes are calculated after properties are updated.
If you change the config for an object - it will be deployed * If you want to redeploy all objects again, you've got two options: * Set `IncrementalDeployment = false` OR - * Delete manually `adftools_deployment_state` global parameter in target ADF service + * Delete the Deployment State (json) file manually from the provided Storage account location # Selective deployment, triggers and logic diff --git a/azure.datafactory.tools.psd1 b/azure.datafactory.tools.psd1 index 7e77169..c3aca01 100644 --- a/azure.datafactory.tools.psd1 +++ b/azure.datafactory.tools.psd1 @@ -12,7 +12,7 @@ RootModule = 'azure.datafactory.tools.psm1' # Version number of this module. - ModuleVersion = '1.10.0' + ModuleVersion = '1.10.1' # Supported PSEditions # CompatiblePSEditions = @() diff --git a/changelog.md b/changelog.md index 5fcc871..4e1de37 100644 --- a/changelog.md +++ b/changelog.md @@ -4,9 +4,14 @@ All notable changes to this project will be documented in this file. The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/). +## [1.10.1] - 2024-10-24 ### Fixed +* Incremental deploy feature causes payload limit issue #374 + ## [1.10.0] - 2024-08-06 ### Fixed * Trigger Activation Failure Post-Selective Deployment when TriggerStopMethod = `DeployableOnly` #386 +* Significantly improved performance of unit tests by mocking target ADF ## [1.9.1] - 2024-06-17 ### Fixed diff --git a/en-us/messages_index.md b/en-us/messages_index.md index 5157e44..ff07149 100644 --- a/en-us/messages_index.md +++ b/en-us/messages_index.md @@ -37,4 +37,5 @@ ADFT0028 | Expected format of name for 'FullName' input parameter is: objectType ADFT0029 | Unknown object type: *type*. ADFT0030 | AzType '$AzType' is not supported. ADFT0031 | Empty value in config file. Path: [*path*]. Check previous warnings. -ADFT0032 | The process is exiting the function. Do fix the issue and run again. \ No newline at end of file +ADFT0032 | The process is exiting the function. Do fix the issue and run again.
+ADFT0033 | Incremental Deployment Option DISABLED as Storage Uri is not provided. \ No newline at end of file diff --git a/private/!AdfPublishOption.class.ps1 b/private/!AdfPublishOption.class.ps1 index e0e32ce..e5736ff 100644 --- a/private/!AdfPublishOption.class.ps1 +++ b/private/!AdfPublishOption.class.ps1 @@ -23,6 +23,7 @@ class AdfPublishOption { [Boolean] $DoNotStopStartExcludedTriggers = $false [Boolean] $DoNotDeleteExcludedObjects = $true [Boolean] $IncrementalDeployment = $false + [String] $IncrementalDeploymentStorageUri = '' [TriggerStopTypes] $TriggerStopMethod = [TriggerStopTypes]::AllEnabled [TriggerStartTypes] $TriggerStartMethod = [TriggerStartTypes]::BasedOnSourceCode } diff --git a/private/DeploymentState.class.ps1 b/private/DeploymentState.class.ps1 index 861680e..9b53a7c 100644 --- a/private/DeploymentState.class.ps1 +++ b/private/DeploymentState.class.ps1 @@ -40,33 +40,66 @@ class AdfDeploymentState { } -function Get-StateFromService { +function Get-StateFromStorage { [CmdletBinding()] - param ($targetAdf) + param ( + [Parameter(Mandatory)] $DataFactoryName, + [Parameter(Mandatory)] $LocationUri + ) - $res = Get-GlobalParam -ResourceGroupName $targetAdf.ResourceGroupName -DataFactoryName $targetAdf.DataFactoryName - $d = @{} + $moduleName = $MyInvocation.MyCommand.Module.Name + $moduleVersion = (Get-Module -Name $moduleName).Version.ToString() + $Suffix = "adftools_deployment_state.json" + $ds = [AdfDeploymentState]::new($moduleVersion) + $storageAccountName = Get-StorageAccountNameFromUri $LocationUri + $storageContext = New-AzStorageContext -UseConnectedAccount -StorageAccountName $storageAccountName + $blob = [Microsoft.Azure.Storage.Blob.CloudBlockBlob]::new("$LocationUri/$DataFactoryName.$Suffix") + Write-Host "Ready to read file from storage: $($blob.Uri.AbsoluteUri)" - try { - $InputObject = $res.properties.adftools_deployment_state.value.Deployed - $d = Convert-PSObjectToHashtable $InputObject - } - catch { - Write-Verbose $_.Exception 
- } - - return $d + $storageContainer = Get-AzStorageContainer -Name $blob.Container.Name -Context $storageContext + $folder = $blob.Parent.Prefix + $FileRef = $storageContainer.CloudBlobContainer.GetBlockBlobReference("$folder$DataFactoryName.$Suffix") + if ($FileRef.Exists()) { + $FileContent = $FileRef.DownloadText() + #Write-Host $FileContent -BackgroundColor Blue + $json = $FileContent | ConvertFrom-Json + $ds.Deployed = Convert-PSObjectToHashtable $json.Deployed + $ds.adftoolsVer = $json.adftoolsVer + $ds.Algorithm = $json.Algorithm + $ds.LastUpdate = $json.LastUpdate + Write-Host "Deployment State loaded from storage." + return $ds + } + else { + Write-Host "No Deployment State found." + } + return $ds } +function Set-StateToStorage { + [CmdletBinding()] + param ( + [Parameter(Mandatory)] $ds, + [Parameter(Mandatory)] $DataFactoryName, + [Parameter(Mandatory)] $LocationUri + ) + $Suffix = "adftools_deployment_state.json" + $dsjson = ConvertTo-Json $ds -Depth 5 + Write-Verbose "--- Deployment State: ---`r`n $dsjson" -class AdfGlobalParam { - $type = "Object" - $value = $null + Set-Content -Path $Suffix -Value $dsjson -Encoding UTF8 + $storageAccountName = Get-StorageAccountNameFromUri $LocationUri + $storageContext = New-AzStorageContext -UseConnectedAccount -StorageAccountName $storageAccountName + $blob = [Microsoft.Azure.Storage.Blob.CloudBlob]::new("$LocationUri/$DataFactoryName.$Suffix") + $r = Set-AzStorageBlobContent -ClientTimeoutPerRequest 5 -ServerTimeoutPerRequest 5 -CloudBlob $blob -File $Suffix -Context $storageContext -Force - AdfGlobalParam ($value) - { - $this.value = $value - } + Write-Host "Deployment State saved to storage: $($r.BlobClient.Uri)" +} +# Function to get Storage Account name from URI +function Get-StorageAccountNameFromUri($uri) { + $accountName = ($uri -split '\.')[0].Substring(8) # Assumes URI starts with "https://" + return $accountName } + diff --git a/private/GlobalParam.ps1 b/private/GlobalParam.ps1 index 2064c39..dde71fd 
100644 --- a/private/GlobalParam.ps1 +++ b/private/GlobalParam.ps1 @@ -64,6 +64,7 @@ function Set-GlobalParam([Adf] $adf) } catch { Write-Error -Exception $_.Exception + $response = "" } return $response } diff --git a/public/Publish-AdfV2FromJson.ps1 b/public/Publish-AdfV2FromJson.ps1 index 65e4c2d..3566a27 100644 --- a/public/Publish-AdfV2FromJson.ps1 +++ b/public/Publish-AdfV2FromJson.ps1 @@ -142,6 +142,12 @@ function Publish-AdfV2FromJson { $opt = New-AdfPublishOption } + if ([string]::IsNullOrEmpty($opt.IncrementalDeploymentStorageUri) -and $opt.IncrementalDeployment) + { + Write-Warning "ADFT0033: Incremental Deployment Option DISABLED as Storage Uri is not provided." + $opt.IncrementalDeployment = $false + } + if (!$DryRun.IsPresent) { Write-Host "STEP: Verifying whether ADF exists..." @@ -149,8 +155,8 @@ function Publish-AdfV2FromJson { if ($targetAdf) { Write-Host "Azure Data Factory exists." if ($opt.IncrementalDeployment -and !$DryRun.IsPresent) { - Write-Host "Loading Deployment State from ADF..." - $ds.Deployed = Get-StateFromService -targetAdf $targetAdf + Write-Host "Loading Deployment State from Storage..." + $ds = Get-StateFromStorage -DataFactoryName $DataFactoryName -LocationUri $opt.IncrementalDeploymentStorageUri } } else { @@ -185,28 +191,28 @@ function Publish-AdfV2FromJson { Write-Debug ($adf | Format-List | Out-String) - Write-Host "==================================================================================="; - Write-Host "STEP: Pre-deployment" - if ($opt.IncrementalDeployment -and $opt.DeployGlobalParams) { - Write-Host "Incremental Deployment Mode: Preparing..." - Write-Debug "Incremental Deployment Mode: Checking whether factory file exist..." - if ($adf.Factories.Count -eq 0) { - Write-Debug "Creating empty factory file..." 
- $EmptyFactoryFileBody = '{ "name": "'+ $adf.Name +'", "properties": { "globalParameters": {} } }' - $o = New-Object -TypeName "AdfObject" - $o.Adf = $Adf - $o.Name = $DataFactoryName - $o.Type = 'factory' - $o.Body = $EmptyFactoryFileBody | ConvertFrom-Json - $o.FileName = Save-AdfObjectAsFile -obj $o - $adf.GlobalFactory.FilePath = $o.FileName - $adf.GlobalFactory.body = $EmptyFactoryFileBody - $adf.GlobalFactory.GlobalParameters = $o.Body.Properties.globalParameters - $adf.Factories.Add($o) | Out-Null - Write-Host ("Factories: 1 object created.") - } - Write-Host "Incremental Deployment Mode: Preparation Done" - } + # Write-Host "==================================================================================="; + # Write-Host "STEP: Pre-deployment" + # if ($opt.IncrementalDeployment -and $opt.DeployGlobalParams) { + # Write-Host "Incremental Deployment Mode: Preparing..." + # Write-Debug "Incremental Deployment Mode: Checking whether factory file exist..." + # if ($adf.Factories.Count -eq 0) { + # Write-Debug "Creating empty factory file..." + # $EmptyFactoryFileBody = '{ "name": "'+ $adf.Name +'", "properties": { "globalParameters": {} } }' + # $o = New-Object -TypeName "AdfObject" + # $o.Adf = $Adf + # $o.Name = $DataFactoryName + # $o.Type = 'factory' + # $o.Body = $EmptyFactoryFileBody | ConvertFrom-Json + # $o.FileName = Save-AdfObjectAsFile -obj $o + # $adf.GlobalFactory.FilePath = $o.FileName + # $adf.GlobalFactory.body = $EmptyFactoryFileBody + # $adf.GlobalFactory.GlobalParameters = $o.Body.Properties.globalParameters + # $adf.Factories.Add($o) | Out-Null + # Write-Host ("Factories: 1 object created.") + # } + # Write-Host "Incremental Deployment Mode: Preparation Done" + # } Write-Host "==================================================================================="; Write-Host "STEP: Replacing all properties environment-related..." 
@@ -283,26 +289,32 @@ function Publish-AdfV2FromJson { Write-Host "==================================================================================="; Write-Host "STEP: Updating (incremental) deployment state..." if ($opt.IncrementalDeployment) { - if ($opt.DeployGlobalParams -eq $false) { - Write-Warning "Incremental Deployment State will not be saved as publish option 'DeployGlobalParams' = false" - } else { - Write-Debug "Deployment State -> SetStateFromAdf..." - $ds.SetStateFromAdf($adf) - $dsjson = ConvertTo-Json $ds -Depth 5 - Write-Verbose "--- Deployment State: ---`r`n $dsjson" - $gp = [AdfGlobalParam]::new($ds) - $report = new-object PsObject -Property @{ - Updated = 0 - Added = 0 - Removed = 0 - } - Update-PropertiesForObject -o $adf.Factories[0] -action 'add' -path 'globalParameters.adftools_deployment_state' -value $gp -name 'type' -type 'factory' -report $report + # if ($opt.DeployGlobalParams -eq $false) { + # Write-Warning "Incremental Deployment State will not be saved as publish option 'DeployGlobalParams' = false" + # } else { + Write-Debug "Deployment State -> SetStateFromAdf..." + $ds.SetStateFromAdf($adf) + # $dsjson = ConvertTo-Json $ds -Depth 5 + # Write-Verbose "--- Deployment State: ---`r`n $dsjson" + #$gp = [AdfGlobalParam]::new($ds) + # $report = new-object PsObject -Property @{ + # Updated = 0 + # Added = 0 + # Removed = 0 + # } + # Update-PropertiesForObject -o $adf.Factories[0] -action 'add' -path 'globalParameters.adftools_deployment_state' -value $gp -name 'type' -type 'factory' -report $report + + #Write-Verbose "Redeploying Global Parameters..." + #$adf.Factories[0].Deployed = $false + #$adf.Factories[0].ToBeDeployed = $true + #Deploy-AdfObject -obj $adf.Factories[0] + # } - Write-Verbose "Redeploying Global Parameters..." 
- $adf.Factories[0].Deployed = $false - #$adf.Factories[0].ToBeDeployed = $true - Deploy-AdfObject -obj $adf.Factories[0] - } + # https://learn.microsoft.com/en-us/azure/storage/blobs/blob-powershell + # Set-Content -Path "adfdeploymentstate.json" -Value $dsjson -Encoding UTF8 + # $ctx = New-AzStorageContext -UseConnectedAccount -StorageAccountName "sqlplayer2020" + # Set-AzStorageBlobContent -Container "adftools" -File "adfdeploymentstate.json" -Context $ctx -Blob "$DataFactoryName.adfdeploymentstate.json" -Force + Set-StateToStorage -ds $ds -DataFactoryName $DataFactoryName -LocationUri $opt.IncrementalDeploymentStorageUri } else { Write-Host "Incremental Deployment State will not be saved as publish option 'IncrementalDeployment' = false" @@ -320,8 +332,9 @@ function Publish-AdfV2FromJson { $elapsedTime = new-timespan $script:StartTime $(get-date) Write-Host "=============================================================================="; Write-Host " ***** Azure Data Factory files have been deployed successfully. 
*****`n"; - Write-Host "Data Factory name: $DataFactoryName"; - Write-Host "Region (Location): $location"; + Write-Host " Data Factory name: $DataFactoryName"; + Write-Host "Resource Group name: $ResourceGroupName"; + Write-Host " Region (Location): $location"; Write-Host ([string]::Format(" Elapsed time: {0:d1}:{1:d2}:{2:d2}.{3:d3}`n", $elapsedTime.Hours, $elapsedTime.Minutes, $elapsedTime.Seconds, $elapsedTime.Milliseconds)) Write-Host "=============================================================================="; diff --git a/test/!RunAllTests.ps1 b/test/!RunAllTests.ps1 index 64977d8..4ffe0ef 100644 --- a/test/!RunAllTests.ps1 +++ b/test/!RunAllTests.ps1 @@ -12,7 +12,9 @@ Param( [Switch]$InstallModules ) +Write-Host " ========= ENVIRONMENT ==========" Write-Host "Host Name: $($Host.name)" +Write-Host "PowerShell Version: $($PSVersionTable.PSVersion)" $rootPath = Switch ($Host.name) { 'Visual Studio Code Host' { split-path $psEditor.GetEditorContext().CurrentFile.Path } @@ -25,8 +27,8 @@ $folder = Split-Path $rootPath -Parent Write-Host "Setting new location: $folder" Push-Location "$folder" -Get-Location | Out-Host - +Get-Location +Write-Host " ========= ENVIRONMENT ==========" # Add the module location to the value of the PSModulePath environment variable #$p = [Environment]::GetEnvironmentVariable("PSModulePath") diff --git a/test/Incremental-Deployment.Tests.ps1 b/test/Incremental-Deployment.Tests.ps1 index 7f1898a..c623798 100644 --- a/test/Incremental-Deployment.Tests.ps1 +++ b/test/Incremental-Deployment.Tests.ps1 @@ -4,6 +4,8 @@ BeforeDiscovery { $moduleManifestPath = Join-Path -Path $ModuleRootPath -ChildPath $moduleManifestName Import-Module -Name $moduleManifestPath -Force -Verbose:$false + $m = Get-Module -Name 'azure.datafactory.tools' + $script:verStr = $m.Version.ToString(2) + "." 
+ $m.Version.Build.ToString("000"); } InModuleScope azure.datafactory.tools { @@ -27,6 +29,14 @@ InModuleScope azure.datafactory.tools { $opt.IncrementalDeployment = $true $opt.StopStartTriggers = $false $script:gp = "" + $script:dstate = [AdfDeploymentState]::new($verStr) + $script:dstate.LastUpdate = [System.DateTime]::UtcNow + $script:dstateJson = $script:dstate | ConvertTo-Json + $script:StorageUri= "https://sqlplayer2020.blob.core.windows.net" + $StorageContainer = "adftools" + $StorageFolder = "folder2" + $script:uri = "$StorageUri/$StorageContainer/$StorageFolder" + # https://sqlplayer2020.file.core.windows.net/adftools $script:SrcFolder = "$PSScriptRoot\$($script:DataFactoryOrigName)" $script:TmpFolder = (New-TemporaryDirectory).FullName @@ -40,7 +50,30 @@ InModuleScope azure.datafactory.tools { Option = $opt } + + Describe 'When Incremental mode with storage provided' -Tag 'IncrementalDeployment', 'Integration' { + It 'Should return empty state when get for the first time' { + $script:ds1 = Get-StateFromStorage -DataFactoryName $DataFactoryName -LocationUri "$uri/notexist" + $ds1.GetType().Name | Should -Be 'AdfDeploymentState' + $ds1.Deployed | Should -BeNullOrEmpty + } + It 'Should save state to storage without an error' { + Set-StateToStorage -ds $dstate -DataFactoryName $DataFactoryName -LocationUri $uri + } + It 'Should return the same value for state when read again' { + $ds2 = Get-StateFromStorage -DataFactoryName $DataFactoryName -LocationUri $uri + $ds2.Deployed.Count | Should -Be $script:ds1.Deployed.Count + #$ds2.adftoolsVer | Should -Be $script:ds1.adftoolsVer + $ds2.Algorithm | Should -Be $script:ds1.Algorithm + } + It 'Should fails when Container doesn''t exist' { + { Set-StateToStorage -ds $dstate -DataFactoryName $DataFactoryName -LocationUri "$($script:StorageUri)/nocontainer997755/folder" } + | Should -Throw -ExceptionType ([Microsoft.Azure.Storage.StorageException]) + } + } + Describe 'When deploy ADF in Incremental mode' -Tag 
'IncrementalDeployment', 'Unit' { + BeforeAll { Mock Get-AzDataFactoryV2 { @@ -67,21 +100,34 @@ InModuleScope azure.datafactory.tools { $script:TargetAdf.DeployObject($newRes) } - Mock Get-GlobalParam { - $adfi = @{id='/.../ADF/globalParameters/default'; name='default'; type='Microsoft.DataFactory/factories/globalParameters'; properties='' } - $adfi.properties = $script:gp + # Mock Get-GlobalParam { + # $adfi = @{id='/.../ADF/globalParameters/default'; name='default'; type='Microsoft.DataFactory/factories/globalParameters'; properties='' } + # $adfi.properties = $script:gp + # if (IsPesterDebugMode) { + # Write-Host ($adfi | ConvertTo-Json -Depth 10) -BackgroundColor DarkGreen + # } + # return $adfi + # } + + # Mock Set-GlobalParam { + # $adf = $PesterBoundParameters.adf + # $script:gp = ($adf.GlobalFactory.body | ConvertFrom-Json).properties.globalParameters + # if (IsPesterDebugMode) { + # Write-Host ($script:gp | ConvertTo-Json -Depth 10) -BackgroundColor DarkRed + # } + # } + + Mock Set-StateToStorage { + $script:dstate = $PesterBoundParameters.ds if (IsPesterDebugMode) { - Write-Host ($adfi | ConvertTo-Json -Depth 10) -BackgroundColor DarkGreen + Write-Host ($script:dstate | ConvertTo-Json -Depth 10) -BackgroundColor DarkRed } - return $adfi } - - Mock Set-GlobalParam { - $adf = $PesterBoundParameters.adf - $script:gp = ($adf.GlobalFactory.body | ConvertFrom-Json).properties.globalParameters - if (IsPesterDebugMode) { - Write-Host ($script:gp | ConvertTo-Json -Depth 10) -BackgroundColor DarkRed - } + Mock Get-StateFromStorage { + if (IsPesterDebugMode) { + Write-Host ($script:dstate | ConvertTo-Json -Depth 10) -BackgroundColor DarkGreen + } + return $script:dstate } Mock Remove-AzDataFactoryV2LinkedService { @@ -97,13 +143,21 @@ InModuleScope azure.datafactory.tools { } } - It '"adftools_deployment_state" in GP should be created' { + It 'IncrementalDeployment should be ignored when StorageUri is not provided' { + $script:opt.IncrementalDeploymentStorageUri = 
"" + $script:opt.IncrementalDeployment = $true + Publish-AdfV2FromJson @params + Should -Invoke -CommandName Set-StateToStorage -Times 0 + } + It '"adftools_deployment_state" should be created in storage' { + $script:opt.IncrementalDeploymentStorageUri = $script:uri + $script:opt.IncrementalDeployment = $true Publish-AdfV2FromJson @params - Should -Invoke -CommandName Set-GlobalParam -Times 1 + Should -Invoke -CommandName Set-StateToStorage -Times 1 } It 'New GP "adftools_deployment_state" should exist' { - Write-Host ($gp | ConvertTo-Json -Depth 10) -BackgroundColor DarkBlue - $script:ds1 = $gp.adftools_deployment_state.value + Write-Host ($dstate | ConvertTo-Json -Depth 10) -BackgroundColor DarkBlue + $script:ds1 = $ds } It '"adftools_deployment_state" should contain empty "Deployed"' { $ds1.Deployed | Should -BeNullOrEmpty @@ -118,15 +172,17 @@ InModuleScope azure.datafactory.tools { $ds1.Algorithm | Should -BeExactly 'MD5' } - It 'After redeployment of 1 object "adftools_deployment_state" should contain "Deployed" with 1 item' { + It 'After redeployment of 2 objects "adftools_deployment_state" should contain "Deployed" with 2 items' { Write-Host "*** DEPLOY FIRST TIME ***" -BackgroundColor DarkGreen Copy-Item -Path "$SrcFolder" -Destination "$TmpFolder" -Filter "BlobSampleData.json" -Recurse:$true -Force + Copy-Item -Path "$SrcFolder" -Destination "$TmpFolder" -Filter "LS_AzureKeyVault.json" -Recurse:$true -Force Publish-AdfV2FromJson @params - $ds2 = $gp.adftools_deployment_state.value + #$ds2 = $gp.adftools_deployment_state.value + $ds2 = $dstate Write-Host ($ds2 | ConvertTo-Json -Depth 5) -BackgroundColor Green $ds2.Deployed | Should -Not -BeNullOrEmpty - $ds2.Deployed.Count | Should -Be 1 - Should -Invoke -CommandName New-AzResource -Times 1 + $ds2.Deployed.Count | Should -Be 2 + Should -Invoke -CommandName New-AzResource -Times 2 } It 'After redeployment: no deployment for untouched object' { @@ -140,12 +196,12 @@ InModuleScope azure.datafactory.tools { 
Remove-Item -Path "$TmpFolder" -Include "BlobSampleData.json" -Recurse:$true -Force $opt.DeleteNotInSource = $true Publish-AdfV2FromJson @params - $ds3 = $gp.adftools_deployment_state.value - #$ds3.Deployed.Count | Should -Be 0 - $ds3.Deployed | Should -BeNullOrEmpty + #$ds3 = $gp.adftools_deployment_state.value + $ds3 = $dstate + Write-Host ($ds3 | ConvertTo-Json -Depth 5) -BackgroundColor Green + $ds3.Deployed.Count | Should -Be 1 } } - } diff --git a/test/Publish-AdfV2FromJson-2.Tests.ps1 b/test/Publish-AdfV2FromJson-2.Tests.ps1 index fdbbbf7..54c7d05 100644 --- a/test/Publish-AdfV2FromJson-2.Tests.ps1 +++ b/test/Publish-AdfV2FromJson-2.Tests.ps1 @@ -66,10 +66,11 @@ InModuleScope azure.datafactory.tools { -DataFactoryName "$DataFactoryName" ` -Location "$Location" -Option $opt } - It 'New GP "adftools_deployment_state" should exist' { - $f = Get-AzDataFactoryV2 -ResourceGroupName $t.ResourceGroupName -DataFactoryName $t.DataFactoryName - $f.GlobalParameters.Keys.Contains("adftools_deployment_state") | Should -Be $true - } + # This is no longer valid as new version keep state in Storage, not in ADF + # It 'New GP "adftools_deployment_state" should exist' { + # $f = Get-AzDataFactoryV2 -ResourceGroupName $t.ResourceGroupName -DataFactoryName $t.DataFactoryName + # $f.GlobalParameters.Keys.Contains("adftools_deployment_state") | Should -Be $true + # } It 'Should run successfully even when no Global Params are in target (exists) ADF' { Publish-AdfV2FromJson -RootFolder "$RootFolder" ` @@ -79,6 +80,36 @@ InModuleScope azure.datafactory.tools { } } + Describe 'Publish-AdfV2FromJson' -Tag 'Integration', 'IncrementalDeployment' { + + BeforeEach { + $VerbosePreference = 'Continue' + Mock Deploy-AdfObject { + param ($obj) + if ($obj.Type -eq 'factory') { + if ($obj.Body.properties.globalParameters | Get-Member -MemberType NoteProperty -Name 'adftools_deployment_state') { + $ds = (Get-Content -Path "test\misc\adftools_deployment_state.json" -Raw -Encoding 'utf8') | 
ConvertFrom-Json + for ($i = 1000; $i -lt 3001; $i++) { + Add-ObjectProperty -obj $ds -path "Deployed.pipeline$i" -value "00000000000000000000000000000000" + } + $obj.Body.properties.globalParameters.adftools_deployment_state.type = "object" + $obj.Body.properties.globalParameters.adftools_deployment_state.value = $ds + Save-AdfObjectAsFile -obj $obj + } + Deploy-AdfObjectOnly -obj $obj + } + } + } + It 'Should deploy successfully even big size of global parameters' { + $opt.StopStartTriggers = $false + Publish-AdfV2FromJson -RootFolder "$RootFolder" ` + -ResourceGroupName "$ResourceGroupName" ` + -DataFactoryName "$DataFactoryName" ` + -Location "$Location" -Option $opt + } + + } + Describe 'Publish-AdfV2FromJson' -Tag 'Integration', 'IncrementalDeployment' { BeforeEach { @@ -99,6 +130,7 @@ InModuleScope azure.datafactory.tools { It 'Should disable and delete trigger when TriggerStartMethod = KeepPreviousState' { Remove-Item -Path "$RootFolder\trigger\*" -Filter "*.json" -Force $opt.DeleteNotInSource = $true + $opt.StopStartTriggers = $true $opt.TriggerStopMethod = 'DeployableOnly' $opt.TriggerStartMethod = 'KeepPreviousState' Publish-AdfV2FromJson -RootFolder "$RootFolder" ` diff --git a/test/Test-AdfCode.Tests.ps1 b/test/Test-AdfCode.Tests.ps1 index 3461c6b..5a641be 100644 --- a/test/Test-AdfCode.Tests.ps1 +++ b/test/Test-AdfCode.Tests.ps1 @@ -60,9 +60,16 @@ InModuleScope azure.datafactory.tools { Describe 'Test-AdfCode' -Tag 'Unit' { - It 'Should not throw an error when wrong path to DF is provided' { + It 'Should throw an error when wrong path to DF is provided' { $DataFactoryName = "nullPathFactory" $RootFolder = Join-Path -Path $PSScriptRoot -ChildPath $DataFactoryName + { + $script:res = Test-AdfCode -RootFolder $RootFolder -ConfigPath $null + } | Should -Throw + } + It 'Should not throw an error when correct path to DF is provided but ConfigPath is null' { + $DataFactoryName = "adf2" + $RootFolder = Join-Path -Path $PSScriptRoot -ChildPath $DataFactoryName 
{ $script:res = Test-AdfCode -RootFolder $RootFolder -ConfigPath $null } | Should -Not -Throw diff --git a/test/TestHelper/TestHelper.psm1 b/test/TestHelper/TestHelper.psm1 index e2515a8..27c4a74 100644 --- a/test/TestHelper/TestHelper.psm1 +++ b/test/TestHelper/TestHelper.psm1 @@ -215,11 +215,13 @@ function Get-TargetEnv { [String] $AdfOrigName ) + $rootPath = Get-RootPath $target = @{ ResourceGroupName = 'rg-devops-factory' DataFactoryOrigName = $AdfOrigName DataFactoryName = "" Location = "UK South" + SrcFolder = "$rootPath\$AdfOrigName" } $c = Get-AzContext $guid = $c.Subscription.Id.Substring(0,8) diff --git a/test/Triggers_template.ps1 b/test/Triggers_template.ps1 index 2d0975b..36982a7 100644 --- a/test/Triggers_template.ps1 +++ b/test/Triggers_template.ps1 @@ -1,5 +1,5 @@ -$file = Join-Path $RootFolder "trigger" "$triggerName.json" +$file = (Join-Path -Path $RootFolder -ChildPath "trigger") | Join-Path -ChildPath "$triggerName.json" #The function below doesn't execute mocked functions (Get-AzDataFactoryV2Trigger), so we have to call them directly #Publish-TriggerIfNotExist -Name $triggerName -FileName $file @script:CommonParam #begin Publish-TriggerIfNotExist diff --git a/test/misc/adftools_deployment_state.json b/test/misc/adftools_deployment_state.json new file mode 100644 index 0000000..aa5effb --- /dev/null +++ b/test/misc/adftools_deployment_state.json @@ -0,0 +1,37 @@ +{ + "LastUpdate": "2023-07-10T17:04:48.8551772Z", + "Deployed": { + "credential.credential1": "568A15318596B19347BADDD8093AB54A", + "linkedService.LS_DataLakeStore": "80C469974F511111BC8525F748940363", + "linkedService.LS_AzureSqlDb_AW2014_SqlAuth": "ACC3183845ED96786BAA1B8A2CA9909F", + "linkedService.BlobSampleData": "30DF2FF4F1F9CBFF0B4222104B83815A", + "trigger.TR_TumblingWindow": "3657D1F21C1DF2ADC49B8510559794FA", + "pipeline.PL_StoredProc": "A01CA1D66D19064E0E3545D6A9E9F4C1", + "pipeline.PL_Wait_Dynamic": "15D5FAEBCB8E471EFB5DA1DD5CB416CE", + "dataflow.MovieDemo": 
"A754194A57DDCE73641A084DFC8F4EEF", + "dataset.taxi_trip_data_input": "657B874F01BEC4B750380ACD83A19912", + "dataset.TaxiDemoDayStatsSink": "93FB9990EF000AC14B88A15ABA52C445", + "pipeline.MovieDemoPipeline": "10274B3366BF6FEAF9CA973381B8A46D", + "dataflow.TaxiDemo": "E2325AA4198C2B4E1DF68D0A31377472", + "linkedService.LS_AzureKeyVault": "9283E57914E9002ADADE0A035A4A93DE", + "pipeline.Currency Converter": "329C7A4BD22FB0377EF77C10A1D2F5E6", + "dataset.movie_dataflow_sink": "13FF043939EA9CE6B29E8A692177EDCF", + "linkedService.LS_AzureDatabricks": "75E8C1A5F0396EDF1A0B85EBB9E6DF35", + "dataset.taxi_trip_fare_input": "8F8B88C2154B6BB5AD722F9BA5D67E49", + "trigger.TR_RunEveryDay": "347CE2AE8D04523AA92F302B864F0E90", + "dataset.movie_dataflow_source": "B70E2EF65DB64ED2E782FA2835F82DB1", + "dataset.CurrencyDatasetUSD": "8485541DF554D78D3CE7943662E74169", + "pipeline.PL_Wait5sec": "39E26D9DBA96F34D3696657B037297DB", + "pipeline.Multiple Waits": "1A1BC01009571B269B12C03F8BBA296E", + "dataset.TaxiDemoTotalByPaymentType": "E92C422B2A098309E00E07995AA5E9FD", + "pipeline.TaxiDemo": "3D81FF7DB6F57B6112B9FDE8BD8CBC6C", + "dataset.CADOutput1": "B463B6A652BDDBAF57BC1CB7F75D8862", + "trigger.TR_AlwaysDisabled": "32F51DF65DB66E4A0D30A90CE00E4225", + "dataset.CurrencyDatasetCAD": "B055E7F3BEDC9D13369C1A3508451851", + "dataset.USDOutput": "59CD41DECAFBCAE87AA90364E7D51F6F", + "dataflow.Currency Converter": "50007B5E6C6845E90776BEF595A24B9E", + "dataset.TaxiDemoVendorStatsSink": "1C6BA8594C092FAA4397F38B4B9151FF" + }, + "adftoolsVer": "1.6.000", + "Algorithm": "MD5" +} \ No newline at end of file