Problem
Changes to tables, views, stored procedures, and user-defined functions are made routinely in SQL Server. Is it possible to use a tool like Azure DevOps to version and track these changes?
Solution
The approach proposed below provides secure, centralized storage for all database code, both in a local folder and in an Azure DevOps Git repository. This enables quick access and recovery in case of data loss or corruption, and keeping a history of modifications makes it easy to track changes over time. Regularly scripting out the database code and versioning it in Git within DevOps is crucial for safeguarding the evolution of your SQL code.
Azure DevOps
Open Azure DevOps in your browser. Choose a Collection and click +New Project.
Name your project, type a description, and click Create.

In your new project, choose Repos and click the “Copy push commands to clipboard” icon.
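As an alternative to the browser, the same project can be created from the command line with the Azure CLI and its azure-devops extension, assuming both are installed and the organization/collection URL below is replaced with your own (it is only a placeholder):
az extension add --name azure-devops
az devops project create --name SQL_Scripts --org https://dev.azure.com/YourOrg
# A new project comes with a default Git repository of the same name; list it
# to find the clone URL (remoteUrl) used in the git remote add step below
az repos list --project SQL_Scripts --org https://dev.azure.com/YourOrg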

File Explorer
Open File Explorer, go to a location of your preference, and create a new folder. In my case, I gave it the same name as my project: SQL_SCRIPTS.
Right-click the new folder and choose Open in Terminal, or open PowerShell and cd to the directory. Execute the code below, changing the path value and replacing the URL in the last line with the push command you copied above.
cd "path\to\SQL_Scripts"
git init
git add .
git commit -m "Initial commit"
git remote add origin https://xxxx/xxx/SQL_Scripts/_git/SQL_Scripts
Now your local Git repository is created.
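To verify the setup, you can check that the folder is now a Git repository and that the remote points at your Azure DevOps project:
# Confirm the repository was initialized and the remote URL is the one you copied
git status
git remote -v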
PowerShell
Using a text editor, create a file named ScriptAll.ps1 in your backup folder; in my case, the SQL_Scripts folder. Change the path assigned to the variable $mainOutputDirectory.
The following code is equivalent to using SQL Server Management Studio (SSMS) and right-clicking on a database name and choosing Tasks > Generate Scripts…
I deliberately decided not to script any system database or any database whose name contains the words Azure or Report, which excludes the AzureDevOps and ReportServer databases.
# Load the SQL Server PowerShell module (SMO)
Import-Module SQLPS -DisableNameChecking

# Connect to the SQL Server instance and set the output folder for the scripts
$server = New-Object Microsoft.SqlServer.Management.Smo.Server "DAIO"
$mainOutputDirectory = "path\to\SQL_Scripts"
if (!(Test-Path -Path $mainOutputDirectory)) {
    New-Item -ItemType Directory -Path $mainOutputDirectory
}

# Configure the SMO Scripter: include indexes, triggers, and all declarative
# referential integrity (keys and constraints), but no DROP statements and no
# dependency scripting
$scripter = New-Object Microsoft.SqlServer.Management.Smo.Scripter ($server)
$scripter.Options.ScriptDrops = $false
$scripter.Options.WithDependencies = $false
$scripter.Options.Indexes = $true
$scripter.Options.Triggers = $true
$scripter.Options.FullTextIndexes = $false
$scripter.Options.DriAll = $true

foreach ($db in $server.Databases) {
    # Skip any database whose name contains "Azure" or "Report"
    if ($db.Name -match "(?i)azure|report") {
        Write-Output "Skipping database: $($db.Name)"
        continue
    }
    # Skip system databases
    if ($db.IsSystemObject -eq $false) {
        Write-Output "Processing database: $($db.Name)"

        # One subfolder per database
        $dbOutputDirectory = Join-Path -Path $mainOutputDirectory -ChildPath $db.Name
        if (!(Test-Path -Path $dbOutputDirectory)) {
            New-Item -ItemType Directory -Path $dbOutputDirectory
        }

        # Script tables, views, stored procedures, and user-defined functions
        $objectTypes = @("Tables", "Views", "StoredProcedures", "UserDefinedFunctions")
        foreach ($type in $objectTypes) {
            foreach ($object in $db.$type) {
                if ($object.IsSystemObject -eq $false) {
                    $outputFile = Join-Path -Path $dbOutputDirectory -ChildPath "$($type)@$($object.Schema)_$($object.Name).sql"
                    $script = $scripter.Script($object)
                    $script | Out-File -FilePath $outputFile -Force -Encoding UTF8
                }
            }
        }

        # Script synonyms as well
        foreach ($synonym in $db.Synonyms) {
            $outputFile = Join-Path -Path $dbOutputDirectory -ChildPath "$($synonym.Schema)_$($synonym.Name).sql"
            $script = $scripter.Script($synonym)
            $script | Out-File -FilePath $outputFile -Force -Encoding UTF8
        }
    }
}

# Commit the generated scripts and push them to the Azure DevOps repository,
# using the current date and time as the commit message
Set-Location $mainOutputDirectory
$env:GCM_PROVIDER = "generic"   # tell Git Credential Manager which host provider to use
git add .
$FileName = Get-Date -Format "yyyyMMddHHmm"
git commit -m $FileName
git push -u origin master
Once this file is created, right-click it and choose Run with PowerShell. This will take some time to process.
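If the right-click option is not available, or your execution policy blocks unsigned scripts, the same script can be run from a PowerShell prompt; a minimal example, reusing the placeholder path from above:
# Bypass a restrictive execution policy for this single run (adjust the path)
powershell.exe -ExecutionPolicy Bypass -File "path\to\SQL_Scripts\ScriptAll.ps1"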

Once it finishes, your SQL_Scripts folder is pushed to the Azure DevOps repository.

Now, you only need to execute the ScriptAll.ps1 file in PowerShell whenever you want to update your repository.
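To keep the repository up to date automatically, the script could also be registered as a Windows scheduled task (or run from a SQL Server Agent job). A minimal sketch, assuming a daily 2:00 AM run, the placeholder path used earlier, and an arbitrary task name:
# Assumption: daily schedule at 02:00 and the placeholder path from above;
# run from an elevated PowerShell session if registration is denied.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" -Argument '-ExecutionPolicy Bypass -File "path\to\SQL_Scripts\ScriptAll.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At "02:00"
Register-ScheduledTask -TaskName "SQL_Scripts backup" -Action $action -Trigger $trigger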
Next Steps