I have an Azure DevOps repo with a build pipeline that fires on merges into master containing *.sql files. The pipeline downloads those SQL files, along with any dependencies, to my on-prem agents and iterates over the objects contained in each file to automate script backups from production. The PowerShell script that runs on-prem creates a BACKUPS directory inside my artifact if one does not already exist.

How can I use git to push those objects (i.e. my SQL backups) back into the master branch? One thing I am struggling to wrap my head around is that this build is fired off a merge into master, so if I am also pushing another commit from my build agent, is it bad practice for those changes to overlap? To clarify, the initial PR will not contain the BACKUPS directory with my SQL backup files, only the new or modified SQL scripts submitted by the user.
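For reference, the push-back I have in mind is something like the following, run from the repo root on the agent after the backups are written. This is only a rough sketch: it assumes the project's build service identity has been granted Contribute permission on the repo, the commit name/email are placeholders, and $(System.AccessToken) would have to be exposed to the script as an environment variable.

    # Placeholder identity for the automated commit
    git config user.email "build@myorg.example"
    git config user.name "Azure DevOps Build"

    git add BACKUPS/
    # [skip ci] keeps this commit from retriggering the CI pipeline
    git commit -m "Automated SQL script backups [skip ci]"

    # Authenticate the push with the pipeline's OAuth token
    git -c http.extraheader="AUTHORIZATION: bearer $env:SYSTEM_ACCESSTOKEN" push origin HEAD:master

Since the pipeline only ever adds files under BACKUPS/, which user PRs never touch, I am hoping the two kinds of commits cannot actually conflict, but I would like confirmation that this is a sane pattern.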
#Only execute build pipeline when merge to master branch contains .sql file additions/modifications
trigger:
  branches:
    include:
    - master
  paths:
    include:
    - '*.sql'

pool:
  vmImage: windows-latest
jobs:
- job: get_changed_files
  pool:
    name: 'DBDevOps_Stage'
  steps:
  #Pulls down any sql files that have been added or modified along with any dependency files such as .ps1 scripts
  - task: PowerShell@2
    inputs:
      targetType: 'inline'
      script: |
        $targetfolder = "$(Build.StagingDirectory)" + "/"
        Write-Host "target folder: $(Build.StagingDirectory)"

        # Copies a repo-relative file into the staging folder, creating any missing parent directories
        function CopyFiles {
            param( [string]$source )
            $target = $targetfolder + $source
            New-Item -Force $target | Out-Null
            Copy-Item $source $target -Force
        }

        # .sql files added (A) or modified (M) by the merge commit
        $changes = git diff --name-only --relative --diff-filter=AM HEAD^ HEAD *.sql
        Write-Host "printing changes variable:"
        $changes
        Write-Host "Change variable ended."

        # @() normalizes git's output, which is a bare string for one file and an array for several
        foreach ($change in @($changes)) {
            if ($change) { CopyFiles $change }
        }

        # Dependency files tracked in the repo: PowerShell scripts, SMO DLLs, and config CSVs
        foreach ($PowershellFile in @(git ls-files *.ps1)) { CopyFiles $PowershellFile }
        foreach ($DLLFile in @(git ls-files *.dll)) { CopyFiles $DLLFile }
        foreach ($ConfigFile in @(git ls-files *Config.csv)) { CopyFiles $ConfigFile }
  - task: PublishBuildArtifacts@1
    inputs:
      pathToPublish: $(Build.StagingDirectory)
      artifactName: MyChangedFiles
#Downloads the above files/artifacts to the default working directory
#Powershell scripts iterate over the files and back up their scripts to a BACKUPS directory using SMO
- job: backup_sql_files
  dependsOn: get_changed_files
  pool:
    name: 'DBDevOps_Stage'
  steps:
  - checkout: self
    clean: true
  - task: DownloadBuildArtifacts@0
    displayName: 'Download Build Artifacts'
    inputs:
      artifactName: MyChangedFiles
      downloadPath: $(System.DefaultWorkingDirectory)
  - task: PowerShell@2
    inputs:
      targetType: 'filePath'
      #SMODependencyChecker.ps1 must run first; once the dependencies are loaded it calls PowershellBackupSQLScripts.ps1
      filePath: '$(System.DefaultWorkingDirectory)/PowershellScripts/SMODependencyChecker.ps1'
      arguments: '-FilePath $(System.DefaultWorkingDirectory)'
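For context on that last step, the backup script's core is SMO's Scripter writing each object's current production definition into BACKUPS. A rough sketch of the idea, with hypothetical server/database/object names, and assuming the SMO assemblies are already loaded (that is SMODependencyChecker.ps1's job):

    # Connect to production via SMO (server and database names are made up)
    $server   = New-Object Microsoft.SqlServer.Management.Smo.Server 'PRODSQL01'
    $database = $server.Databases['MyDatabase']

    # Create the BACKUPS directory inside the working directory if it does not exist
    New-Item -ItemType Directory -Force "$PSScriptRoot\BACKUPS" | Out-Null

    # Script one object's definition to a file; the real script loops over the
    # objects parsed from each changed .sql file
    $scripter = New-Object Microsoft.SqlServer.Management.Smo.Scripter $server
    $scripter.Options.ToFileOnly = $true
    $scripter.Options.FileName   = "$PSScriptRoot\BACKUPS\dbo.usp_MyProc.sql"
    $scripter.Script(@($database.StoredProcedures['usp_MyProc', 'dbo']))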
My repo: [screenshot]
On-prem working directory with the new backup files: [screenshot]
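And to close the loop, this is where I picture the push-back living: one more step at the end of backup_sql_files, with persistCredentials on the checkout so plain git commands can authenticate with the job's OAuth token instead of passing it by hand. Again an untested sketch, with a placeholder commit identity:

    - job: backup_sql_files
      dependsOn: get_changed_files
      pool:
        name: 'DBDevOps_Stage'
      steps:
      - checkout: self
        clean: true
        persistCredentials: true  # keep the OAuth token configured so git push works later in the job
      # ...existing DownloadBuildArtifacts and SMODependencyChecker steps...
      - task: PowerShell@2
        displayName: 'Commit SQL backups back to master'
        inputs:
          targetType: 'inline'
          script: |
            git config user.email "build@myorg.example"
            git config user.name "Azure DevOps Build"
            git add BACKUPS/
            # [skip ci] prevents this push from retriggering this same pipeline
            git commit -m "Automated SQL script backups [skip ci]"
            git push origin HEAD:master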