dbops v0.5 – new features

Yet another release of the dbops Powershell module is finally out in the wild! A fresh v0.5.1 version is available now and can be downloaded from PSGallery and GitHub. A brief list of features:

  • MySQL support
  • PostgreSQL support
  • Cross-platform ad-hoc query command: Invoke-DBOQuery
  • Full Linux & PS Core support
    • Compatible with .Net Standard 2.0
    • Module is now tested on both Windows and Linux
  • Missing connectivity libraries can now be downloaded and installed interactively
  • Core deployment libraries – DbUp – were upgraded to v4.2.0
  • Hundreds of new tests for every supported RDBMS:
    • SQL Server
    • Oracle
    • MySQL
    • PostgreSQL
  • New connection string parser that supports all database types and allows for deep customization

Ad-hoc queries via Invoke-DBOQuery

The new function Invoke-DBOQuery provides a cross-platform querying experience for all four supported RDBMS. Some of the most notable features:

  • Highly customizable connection parameters
  • Running queries from files or variables
  • JSON configuration files and native dbops configs are supported
  • Interactive console: -Interactive (or simply -i)

Some real-life examples:

# Get current database from a PostgreSQL server
Invoke-DBOQuery -Type PostgreSQL -Server localhost:5432 -Database postgres -Query 'SELECT current_database()' -As SingleValue
# Run a query against MySQL database
Invoke-DBOQuery -Type MySQL -Server localhost -Database mysql -Query 'SELECT Name from Users'
# Run multiple files from a folder in a specific SQL Server database
Get-ChildItem .\Scripts | Invoke-DBOQuery -SqlInstance 'SqlServer1:1433' -Database MyDB
# Connect to an Oracle database interactively as SYSDBA
Invoke-DBOQuery -Type Oracle -SqlInstance localhost -Interactive -Username sys -Password $mySecurePassword -ConnectionAttribute @{ 'DBA Privilege' = 'SYSDBA' }

MySQL and PostgreSQL support

The module now fully supports interactions with MySQL and PostgreSQL, while continuing to support SQL Server and Oracle. You can choose which connection library to use by specifying the -Type parameter.

# install a DBOps package to the MySQL server
Install-DBOPackage -Type MySQL -Server localhost -Database mydb -Path .\package.zip
# invoke a query against a remote PostgreSQL server
Invoke-DBOQuery -Type PostgreSQL -SqlInstance pgsql1:5432 -Database mydb -Query 'SELECT current_database();'
# change the default connection -Type (SQLServer) to PostgreSQL:
Set-DBODefaultSetting -Name rdbms.type -Value 'PostgreSQL'

Linux & PS Core support

The module is fully compatible with Linux (and probably macOS? I have no way to verify) and can be used on Powershell Core 6.1 and higher.

All the existing unit and integration tests now run on both Linux and Windows.

Automating SQL Server patching

Howdy everyone, today I’m gonna talk about SQL Server patching.

While this operation is pretty straightforward and very rarely causes issues by itself, there is nothing less entertaining than going from server to server, RDP to RDP, launching the same executable over and over. Which opens an automation opportunity for us!

Let me introduce a new tool from the dbatools toolset: Update-DbaInstance. It will take care of SQL Server patching – and all the activities associated with it. It supports updating to the latest or a specific version of SQL Server, allows you to restart the computer after the upgrade is finished, and, most importantly, lets you run it against multiple servers in parallel.

Prerequisites

Elevated session

This command is designed to be run in an elevated session. That is a must for local installations, when you're running the command against localhost, but, strangely enough, it's also required when you're running it against remote machines and your patch repository is on a network drive (which I'm guessing is the case 99% of the time). Why? See the next prerequisite:

Powershell remoting and Credentials

Unsurprisingly, performing remote patching through Powershell requires Powershell remoting to be already configured. There is one more requirement, though, for running sessions remotely: if your patch repository is a network folder, a default remoting session will not be able to access it. It would run into the infamous double-hop issue and fail to delegate your current credentials when accessing that folder.

The solution is to pass credentials to the remote server, which will authenticate you on that remote server properly. This is why specifying an explicit -Credential parameter helps in such scenarios. In doing so, you are implicitly switching to the CredSSP authentication protocol, which allows you to securely pass credentials to the remote host.

Using CredSSP protocol without configuring it

The command will then attempt to configure the CredSSP protocol on both client and server – this is why an elevated session is required even on the client machine. Such dynamic connection configuration is only supported on Powershell 3.0 or higher: both client and server should comply with this requirement.

Properly configured CredSSP with Credentials passed through the remote session

If, for any reason, the CredSSP configuration and/or connection has failed, the command will ask you whether you want to try a less secure protocol (Default) to pass credentials to the remote server.

With all that said, if the server is able to reach the patch file without authentication, you won’t need to specify credentials at all.
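As a sketch of what passing explicit credentials might look like (the server name and share path below are hypothetical placeholders):

```powershell
# Hypothetical example: patch a remote server from a network share,
# passing explicit credentials to work around the double-hop issue.
# Per the explanation above, -Credential implicitly switches to CredSSP.
$cred = Get-Credential -Message 'Account with admin rights on the target server'
Update-DbaInstance -ComputerName SqlServer1 -Path '\\fileserver\SQLUpdates' -Credential $cred
```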

Patch repository

You’ll need to have your SQL Server update files stored somewhere the server can reach them: a network drive or a local folder. The command will determine the relevant KB based on the execution parameters and look for the file name recursively in the specified path(s).

An important thing about the patch repository is that all the KB files should keep their original names, which in all cases follow a predefined file name template:

SQLServer*-KB*-*x*.exe

Having such a file stored in the folder will ensure that the update command finds it among the files in the folder, but keep in mind that folders with a huge number of files might slow down the search. You can specify this folder by using the -Path parameter.
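Before running the update, you can sanity-check that the repository actually contains files matching the expected template (the share path is just a placeholder):

```powershell
# List update files in the repository that match the expected naming template
Get-ChildItem -Path '\\fileserver\SQLUpdates' -Recurse -Filter 'SQLServer*-KB*-*x*.exe'
```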

Executing the command

There are a few ways you can execute the command, and most of them are covered in the examples you can get from running Get-Help Update-DbaInstance -Examples.

Never forget about the power of Get-Help

Some of the features worth mentioning:

  • By default the command will try to update the server to the latest-and-greatest service pack and cumulative update
  • You can specify -Type to choose between installing service packs, cumulative updates or both
  • -Restart will restart the computer after the installation is successful. Very helpful when installing multiple updates at once – each of them will require a restart. It will also restart the computer prior to installing the updates in case a pending restart is already queued in the system.
  • -Version will allow you to target a specific version of SQL Server on the machine, as well as define the update level you want to reach. Some of the examples:
    • SQL2008SP4 – upgrade all SQL2008 instances to SP4
    • 2012SP3CU11 – upgrade all SQL2012 instances to SP3 CU11
    • 2016 – upgrade all SQL2016 instances to the latest version available
    • SP4 – upgrade all instances found on the computer to SP4, regardless of their version
  • Specifying more than one computer in -ComputerName will initiate a parallel update once all the update paths have been discovered and confirmed
  • This command is considered a high-risk command, which means it will ask you to confirm each update path before executing it. To disable that behavior, use -Confirm:$false
  • It can be a good idea to run the command with -WhatIf first to assess its impact.
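To tie the features above together, here are a couple of hedged invocations (computer names and the share path are placeholders):

```powershell
# Update all instances on two servers in parallel to the latest available build,
# restarting the machines as needed and skipping the confirmation prompts
Update-DbaInstance -ComputerName Server1, Server2 -Path '\\fileserver\SQLUpdates' -Restart -Confirm:$false

# Bring all SQL Server 2012 instances on a server to SP3 CU11, previewing the impact first
Update-DbaInstance -ComputerName Server1 -Version 2012SP3CU11 -Path '\\fileserver\SQLUpdates' -WhatIf
```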

A simple execution of the command will go through the following phases:

  • Search for existing installations of SQL Server
  • Check if there are any eligible updates for each chosen installation
  • Configure CredSSP for remote connections
  • Search for the required SQL Server KB file(s) in the repository path defined in -Path
  • Ask user’s permission to continue with the installation
  • Extract and install the update(s): service pack, then cumulative update.
  • Restart the computer if needed and -Restart is specified
Update process in all its glory

Conclusion

This command is designed to turn a mundane patching process into a simple task, which can now be performed often enough to keep up with the pace of SQL Server cumulative update releases. Hope you’ll find a use for it in the upcoming 2019.

Merry Christmas and Happy New Year!

New features in dbops 0.4.3

A new version of dbops (0.4.3) has been released, introducing a few brand-spanking-new configuration and QoL features:

  • Reset-DBODefaultSetting – resets all custom configurations made by a user to their original values defined by the module.
  • Export-DBOConfig – you can now export configurations from packages and configuration objects into a json file and use it later for deployment.
  • Send-DBOMailMessage – sends an informative email after the deployment is done. Supports custom email templates and configurations.
  • New-DBOConfig – creates a new configuration object with current defaults. Previously, this functionality was covered by Get-DBOConfig, which was slightly confusing.

Some examples on how to use all the new features:


# Generating config files and using it later as a deployment template
New-DBOConfig -Configuration @{ DeploymentMethod = 'SingleTransaction'; SqlInstance = 'devInstance'; Database = 'MyDB' } | Export-DBOConfig '.\dev.json'
Get-DBOConfig -Path '.\dev.json' -Configuration @{ SqlInstance = 'prodInstance' } | Export-DBOConfig '.\prod.json'
Install-DBOPackage Deploy.zip -ConfigurationFile .\dev.json
Install-DBOPackage Deploy.zip -ConfigurationFile .\prod.json
# Extract current config from a package
Get-DBOPackage Deploy.zip | Export-DBOConfig '.\prod.json'
# Use existing configuration from a package and apply it to a different package
$config = Get-DBOPackage Deploy.zip | Get-DBOConfig
$config.Database = 'MyOtherDB'
MyOtherDeploy.zip | Update-DBOConfig -Configuration $config


#Get current configuration
Get-DBODefaultSetting
#Reset a particular configuration
Reset-DBODefaultSetting -Name ConnectionTimeout
#Reset all configurations to defaults
Reset-DBODefaultSetting -All



#Send notification after the deployment is completed using default configuration settings
Set-DBODefaultSetting -Name mail.SmtpServer -Value 'smtp.local'
Set-DBODefaultSetting -Name mail.To -Value 'DBA@local'
Set-DBODefaultSetting -Name mail.From -Value 'dbops@local'
Install-DBOPackage Deploy.zip -SqlInstance MYServer | Send-DBOMailMessage

Hopefully, these changes will help you fine-tune your deployments.

Happy Halloween!

Introducing dbops Powershell module – DevOps for databases

DevOps for databases

It’s been a long while since DevOps became a hot topic, yet every time we talk about DevOps for databases, every solution tends to force users into some kind of framework, which usually has a set of rules binding them to a certain way of writing scripts, deploying code, and even following a specific naming convention. All of this exists only to control the flow of the deployment internally, but it makes the deployment more of a black box to the end users: us.

Ever since I tried to deploy a set of scripts using the infamous Invoke-SqlCmd (that was a terrible idea, by the way), I have felt there was a gap in the community-driven Powershell world: a SQL code deployment module. A module that would be simple to use, enable users to implement Continuous Deployment and Continuous Integration scenarios, and be absolutely transparent in the way it works.

Eventually, in my searches, I stumbled upon the DbUp project: a pure .Net cross-platform deployment solution which, however, required a fair amount of coding before you could start using it. And more often than not, only .Net developers would risk venturing into the complications of its internal classes, building a deployment solution as a part of their application. The idea behind the project, however, was perfect: a fully customizable deployment library that can be wrapped into any kind of code and be used the way you like it. After some testing, I realized that it was exactly what I needed: an open-source engine, on which I could build a 100% customizable Powershell SQL code deployment module.

Continuous Integration and Continuous Deployment

DbUp performs deployments using so-called migrations: sequential transformations of the database schema that eventually lead to a fully deployed state. The scripts are supposed to reside in a single folder and be properly named to maintain the deployment order. To maintain deployment integrity, the DbUp engine uses a database versioning approach, where each executed script is recorded in a schema versions table, which tracks the progress of the deployment and prevents migrations from being executed twice.

But what if people re-use the same scripts when deploying the database? Or the development process already has a defined naming convention and consists of multiple similarly named scripts and/or folders? I wanted the module to be able to deploy literally any kind of solution, as long as the scripts are simple plain-text SQL files. And not only deploy them, but also enable CI/CD operations for that code!

With that in mind, I came up with the idea of Builds, mimicking the application deployment pipeline: the code goes through a build phase first, where it is packed into a deployment package, ensuring the same deployment experience across the enterprise. To let databases follow that same path, the module uses Builds: each build represents a set of scripts that bring the database to a specific state after a migration.

DbOps package
dbops builds deployed onto databases with different states

Each build consists of one or more scripts deployed in a predefined order; once all the scripts from the first build are deployed, the next-in-line build is picked up for the deployment. Each script is recorded to a schema versions table inside the deployment database, ensuring that the same script is never executed twice.

Such an approach opens doors to CI/CD scenarios, where any following version of the package can be deployed on top of any database state, even though it’s just good old plain-text SQL code behind the scenes. It also allows users to use any file names and any deployment order they need.
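A quick sketch of that build-and-deploy workflow (folder names and the version strings are hypothetical; check Get-Help New-DBOPackage and Get-Help Add-DBOBuild for the exact parameter sets):

```powershell
# Build 1.0: pack the initial set of scripts into a deployment package
New-DBOPackage -Name 'Deploy.zip' -ScriptPath '.\sql\v1.0' -Build '1.0'
# Build 2.0: add the next set of scripts on top of the existing package
Add-DBOBuild -Path 'Deploy.zip' -ScriptPath '.\sql\v2.0' -Build '2.0'
# Deploy: scripts already recorded in the schema versions table are skipped
Install-DBOPackage 'Deploy.zip' -SqlInstance 'localhost' -Database 'MyDB'
```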

Since all previous SQL files are stored inside the package, it is quite easy to use that catalog when adding new SQL files, as there is now a way to distinguish new files in the source code folder, which enables users to leverage CI/CD features. The only catch is that the package now has to be stored somewhere so that it can easily be retrieved during the next build phase.

dbops flow

DBOps module

The module takes care of all aspects of the deployment and grants full control over the deployment process. The code is distributed under the MIT license and was recently made available on GitHub: https://github.com/sqlcollaborative/dbops

To install this module run the following statement to get it directly from PSGallery:

Install-Module dbops

As of the date of this post, the module supports two major RDBMS: SQL Server and Oracle, but since DbUp’s list of supported platforms is huge, it’s only a matter of time (and demand!) until all the other platforms are supported.

Basic help is available on the front page; however, the documentation is still a work in progress. The same goes for the module itself: while its features are becoming more prominent, its development is far from over, and a whole set of features is planned for future releases, such as rollbacks, tSQLt integration, sqlcmd/sqlplus script parsing, etc.

Command list

Deployment

  • Install-DBOSqlScript – script deployments from a set of files/folders
  • Install-DBOPackage – deployments of the previously built packages

Package management

  • New-DBOPackage – creates a new package from an existing set of scripts
  • Get-DBOPackage – returns the package object with a multitude of fine-tuning scenarios
  • Add-DBOBuild – adds a new build to the existing package
  • Remove-DBOBuild – removes a build from the existing package

Configuration management

  • Get-DBOConfig – loads configuration from a package or a configuration file
  • Update-DBOConfig – updates configuration of the package
  • Get-DBODefaultSetting – get a list of default deployment settings
  • Set-DBODefaultSetting – modify default deployment settings

CI/CD pipelining capabilities

  • Invoke-DBOPackageCI – integrate recent changes in the code into the package using incremental versioning

Artifact management

  • Copy-DBOPackageArtifact – copy a specified version of the package from the repository
  • Get-DBOPackageArtifact – retrieve a path to the specified version of the package from the repository
  • Publish-DBOPackageArtifact – publish package into the repository

Cross-platform support

  • Install-DBOSupportLibrary – download and install third-party libraries for other database types, if needed

Thank you for support

Open-source projects are fueled by many people working towards the same goal. I really appreciate the support of the #dbatools team who made this project possible. Special thanks to:

  • @mobileck – for testing and on-point feedback
  • @cl – for dbatools groove, mentoring and bright ideas
  • @sqldbawithbeard – for Test-Driven Development mentality
  • @FredWeinmann – for showing me the ways of PSFramework thus improving multiple aspects of this module


Using dacpac packages to clone SQL Server databases

Many SQL Server DBAs know that there is a very convenient way of delivering a full copy of a database structure to a different SQL Server: it’s called dacpac, or a Data-tier Application package. You can extract a database at any time using the SQL Server Management Studio context menu: Tasks -> Extract Data-tier Application, which will guide you through the extraction options and output a compressed package file containing all the necessary schema definitions, as well as table data. This package can then be deployed on a completely different server using the Deploy Data-tier Application context menu item.

What people do not widely know, however, is how much fine-tuning a simple deployment might require, and how difficult it sometimes is to force the deployment operation to behave in a certain way. The good news is that there is now a convenient way of automating this process using the dacpac-related functions from the dbatools Powershell module.

This article will explain how to:

  • Extract dacpac from a database
  • Choose whether you want to include data or not
  • Deploy the dacpac package on any version of SQL Server higher than 2008 SP2
  • Exclude object types from the deployment process
  • Generate deployment scripts
  • Create a Jenkins job that will take care of this whole process for you

Extracting dacpac package

Let’s use this simple script to extract a dacpac from an existing database:


Param (
    $Server,
    $Database,
    $Path,
    [switch]$IncludeData
)
# Prepare the extraction properties string
$exportProperties = "/p:IgnorePermissions=True /p:IgnoreUserLoginMappings=True" # Ignore permissions definition
if ($IncludeData) {
    $exportProperties += " /p:ExtractAllTableData=True" # Extract data
}
Export-DbaDacpac -SqlInstance $Server -Database $Database -Path $Path -ExtendedProperties $exportProperties

The export properties here are defined to control the extraction process:

  • Ignore database permissions, as there is rarely a need to add permissions into the mix
  • Define an option to include data into the package

The Export-DbaDacpac function will take care of the extraction process and return all the details about the extraction, including the .Path property that points us to the extracted file:

output from Export-DbaDacpac

You can check all other available properties (which are basically command line parameters) in a corresponding article: https://msdn.microsoft.com/en-us/library/hh550080.aspx

Deploying dacpac package

There is one important note about dacpac deployment: it requires a so-called Publish Profile. A publish profile is an XML file that contains the deployment parameters you want to enable during the deployment. You can create such a profile using one of the following options:

  • Run New-DbaPublishProfile, specifying the connection parameters. The resulting file will enable you to run the deployment, but it is highly recommended that you modify it according to your needs
  • From a Visual Studio SSDT project, select Project -> Publish. This opens a dialog that allows you to load, save, and modify the publish profile.
  • Manually, using the documentation from the Publish parameters section of the same article
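The first option might look like this (server and database names are placeholders):

```powershell
# Generate a template publish profile for the target connection
# in the current folder; edit the resulting XML to suit your needs
New-DbaPublishProfile -SqlInstance TargetServer -Database MyDB -Path .
```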

The parameters that we’re going to use during deployment are as follows:

  • AllowIncompatiblePlatform – enables deployment to an earlier version of SQL Server.
  • CreateNewDatabase – will re-create the database every time
  • ExcludeObjectTypes – semicolon-separated list of objects that will be excluded from the deployment.
  • ExcludeLogins and ExcludeUsers – somehow, these two ended up being independent parameters, not included in the previously mentioned ExcludeObjectTypes. Excludes logins and/or users from the deployment.
  • IgnorePermissions – ignores object permissions during the deployment.
  • DropObjectsNotInSource – boolean value, which is True by default, that controls whether existing objects in the database are going to be dropped if they are not present in the package. This is a very important parameter if you are planning to deploy dacpac to a non-empty database.
  • IgnoreRoleMembership – ignores role membership

This is an example of the Publish profile file that we’re going to use in this exercise:


<?xml version="1.0" ?>
<Project ToolsVersion="14.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <TargetConnectionString></TargetConnectionString>
    <ProfileVersionNumber>1</ProfileVersionNumber>
    <AllowIncompatiblePlatform>True</AllowIncompatiblePlatform>
    <CreateNewDatabase>False</CreateNewDatabase>
    <ExcludeObjectTypes>Permissions;RoleMembership</ExcludeObjectTypes>
    <ExcludeLogins>True</ExcludeLogins>
    <IgnorePermissions>True</IgnorePermissions>
    <DropObjectsNotInSource>False</DropObjectsNotInSource>
    <IgnoreUserSettingsObjects>True</IgnoreUserSettingsObjects>
    <IgnoreLoginSids>True</IgnoreLoginSids>
    <IgnoreRoleMembership>True</IgnoreRoleMembership>
  </PropertyGroup>
</Project>


Once we have a publish profile ready, we can start the deployment by using the Publish-DbaDacpac function, specifying:

  • SqlInstance – target server
  • Database – target database
  • Path – path to the dacpac package
  • PublishXml – path to the publish profile
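Put together, the call might look like this (paths and names are placeholders):

```powershell
# Deploy the extracted package to the target database
# using the publish profile defined above
Publish-DbaDacpac -SqlInstance TargetServer -Database MyDB -Path '.\MyDB.dacpac' -PublishXml '.\publish.xml'
```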


Other parameters of Publish-DbaDacpac that you might find useful:

  • GenerateDeploymentScript – will generate a deployment script prior to deployment
  • ScriptOnly – will not perform the deployment, generating a deployment script instead

Full script

The script below will extract the dacpac from an existing database and deploy it to a different server using the publish.xml file we created above. Feel free to modify it to your liking!


[CmdletBinding()]
Param (
    $SourceServer,
    $TargetServer = $SourceServer,
    $SourceDatabaseName,
    $TargetDatabaseName,
    $Path,
    $PublishXml = '.\publish.xml',
    [switch]$IncludeData
)
# Stop on any error by default
$ErrorActionPreference = 'Stop'
# Construct export parameters
$exportProperties = "/p:IgnorePermissions=True /p:IgnoreUserLoginMappings=True"
if ($IncludeData) {
    $exportProperties += " /p:ExtractAllTableData=True"
}
# Export the database to the specified path
Write-Verbose "Starting the export from $SourceServer.$SourceDatabaseName to $Path"
$exportFile = Export-DbaDacpac -SqlInstance $SourceServer -Database $SourceDatabaseName -Path $Path -ExtendedProperties $exportProperties -EnableException
Write-Verbose "Export completed`: $exportFile"
# Publish the dacpac using the defined publish xml file
Write-Verbose "Starting the publication from $($exportFile.Path) to $TargetServer.$TargetDatabaseName"
$xml = (Get-Item $PublishXml).FullName
Publish-DbaDacpac -SqlInstance $TargetServer -Database $TargetDatabaseName -PublishXml $xml -Path $exportFile.Path -EnableException
# Remove the dacpac file
if (Test-Path $exportFile.Path) {
    Write-Verbose "Removing dacpac file $($exportFile.Path)"
    Remove-Item $exportFile.Path
}

Creating a Jenkins job

In order to create an interface and a scheduler at the same time, I would definitely recommend making this scenario available on your Jenkins instance, the same way we did before in Refresh database & keep permissions – SQL Server Jenkins labs (Part 2). The parameters are going to be almost the same, so feel free to use the examples from the Jenkins labs to create a job that uses the dacpac deployment script instead.

sample Jenkins job

Conclusion

Dacpac packages are a convenient way of copying a full database when other methods are unavailable for whatever reason. The format has a learning curve associated with it, but it can be a great help when used in an automated manner.

It is also a perfect workaround for copying a database when the source server has a higher SQL Server version than the target: this will work even between 2017 and 2008, as long as all the objects are compatible with the target database version.

It also works pretty well as an alternative to snapshot replication, being much less restrictive in its approach.

UPDATE: Updated the CreateNewDatabase parameter with a new description; apparently it re-creates the database every time instead of just creating a new database when needed.

ZipHelper Powershell module

I’ve recently published the ziphelper Powershell module on PSGallery, out of my constant frustration with the existing Expand-Archive/Compress-Archive commands in Powershell 5.1. No, they work fine, but when there’s a need to work with a single item out of the whole archive, there is no option other than to extract/compress everything at once.

This new module works on a per-item basis, introducing four new commands:

  • Add-ArchiveItem – adds one or more items to the archive with an option to specify an InnerFolder within the archive where the files will be stored.
  • Expand-ArchiveItem – extracts one or more archived files from the archive.
  • Remove-ArchiveItem – removes one or more items from the archive.
  • Get-ArchiveItem – returns a list of details about items in the archive. When returning a specific item, will also return its contents as binary.

The module uses native zip streams and comes without any third-party libraries. Feel free to download it from PSGallery:

Install-Module ziphelper
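A hedged usage sketch follows. The parameter names are my assumptions based on the command descriptions above (per-item operations, an InnerFolder option) and may not match the module exactly; check Get-Help for each command after installing:

```powershell
# Assumed parameter names – verify with Get-Help Add-ArchiveItem etc.
# Add a single file into a subfolder inside the archive
Add-ArchiveItem -Path '.\archive.zip' -Item '.\script.sql' -InnerFolder 'scripts'
# List the archive contents
Get-ArchiveItem -Path '.\archive.zip'
# Extract just that one item, then remove it from the archive
Expand-ArchiveItem -Path '.\archive.zip' -DestinationPath '.\out' -Item 'scripts\script.sql'
Remove-ArchiveItem -Path '.\archive.zip' -Item 'scripts\script.sql'
```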

Refresh database & keep permissions – SQL Server Jenkins labs (Part 2)

Welcome back to the SQL Server Jenkins labs! In the first part of the labs we created a Jenkins job that restored a database based on historical data. This time, let’s try something very similar, but add some flavor to it.

We’re going to set up a job that copies a single database in its current state from one server to another. It is very similar to the previous lab exercise, except this one will take care of many more things at once. Let’s add into the mix most of the activities that accompany a database restore, such as:

  • Changing database owner after restore is completed;
  • Cleaning up existing permissions on the restored database;
  • Restoring previous permissions, in case the database is being replaced with a fresh copy.

On top of that, let’s make some of them optional, increasing the flexibility of the solution.

Writing the code

Once again, there is no reason not to use dbatools commands for this purpose, just as most of us wouldn’t want to reinvent the wheel. The scenario is going to be quite simple:

  1. Define parameters – connections, database names, paths, etc.
  2. Backup the source database to a network share with COPY ONLY
  3. Export permissions from the target database if needed
  4. Restore the target database
  5. Change database owner
  6. Remove all users from the database if needed
  7. Re-apply old permissions if needed

The restore process, when it comes to replacing an existing database, is a very delicate operation, so let’s make sure the code stops whenever an error occurs. This is achieved by using the -EnableException switch of the dbatools functions, which throws an error instead of a simple warning (the default behavior). Let’s also define $ErrorActionPreference = 'Stop' to ensure that the script does not execute any further once an error has occurred. Lastly, it is a good idea to save the existing permissions to the disk drive in case a manual intervention is required.

Most of the output is hidden in the Verbose stream, which allows us to control the amount of output we want to see. Just add the -Verbose switch if you need to see more!

Here is the script we are going to use this time. It has slightly more action inside, but it still consists of the same building blocks, most of which are regular dbatools commands. Let’s name it Copy-Database.ps1:

[Cmdletbinding()]
Param (
    [Parameter(Mandatory)]
    [string]$SourceServer,
    [Parameter(Mandatory)]
    [string]$SourceDatabase,
    [string]$TargetServer,
    [Parameter(Mandatory)]
    [string]$TargetDatabase,
    [Parameter(Mandatory)]
    [string]$BackupFilePath,
    [string]$DBOwner,
    [switch]$KeepPermissions,
    [switch]$DropUsers
)

#Stop on any error by default
$previousErrorActionPreference = $ErrorActionPreference
$ErrorActionPreference = 'Stop'

#Temporary permissions will be stored here
$permissionsFile = ".\Permissions-$TargetDatabase.sql"

#If target server is not specified, it's assumed that it's going to be the same server
if (!$TargetServer) { $TargetServer = $SourceServer }

if (Test-Path $BackupFilePath) {
    #Removing old temporary backup if it still exists for some reason
    Write-Verbose "Removing old backup file $BackupFilePath"
    Remove-Item $BackupFilePath
}
#Run copy-only backup
Write-Verbose "Initiating database backup`: $SourceServer.$SourceDatabase to $BackupFilePath"
$backup = Backup-DbaDatabase -SqlInstance $SourceServer -Database $SourceDatabase -BackupFileName $BackupFilePath -CopyOnly -CompressBackup -Checksum -EnableException

if (!$backup.BackupComplete) {
    throw "Backup to $BackupFilePath was not completed successfully on $SourceServer.$SourceDatabase"
}

#Record and store permissions
if ($KeepPermissions -or $DropUsers) {
    $permissions = Export-DbaUser -SqlInstance $TargetServer -Database $TargetDatabase
    Write-Verbose "Exported permissions from $TargetServer.$TargetDatabase`: $permissions"
    Write-Verbose "Storing permissions of $TargetServer.$TargetDatabase in a file $permissionsFile"
    $permissions | Out-File -FilePath $permissionsFile
}

#Restore the database on top of existing DB
Write-Verbose "Initiating database restore`: $BackupFilePath to $TargetServer.$TargetDatabase"
$backup | Restore-DbaDatabase -SqlInstance $TargetServer -DatabaseName $TargetDatabase -WithReplace -ReplaceDbNameInFile -EnableException

#Update database owner, stop throwing errors at this point, as further actions are not critical
$ErrorActionPreference = $previousErrorActionPreference

if ($DBOwner) {
    Write-Verbose "Updating database owner to $DBOwner"
    Set-DbaDatabaseOwner -SqlInstance $TargetServer -Database $TargetDatabase -TargetLogin $DBOwner
}

#Drop users if requested
if ($DropUsers) {
    $users = Get-DbaDatabaseUser -SqlInstance $TargetServer -Database $TargetDatabase -ExcludeSystemUser
    foreach ($user in $users) {
        Write-Verbose "Dropping user $($user.Name) from $TargetServer.$TargetDatabase"
        try {
            $user.Drop()
        }
        catch {
            # No need to throw, maybe a user owns a schema of its own
            Write-Warning -Message $_
        }
    }
}

#Restore permissions
if ($KeepPermissions) {
    Write-Verbose "Restoring permissions of $TargetServer.$TargetDatabase from a file $permissionsFile"
    Invoke-DbaSqlQuery -SqlInstance $TargetServer -Database $TargetDatabase -Query $permissions
}

#Remove backup file
if (Test-Path $BackupFilePath) {
    Write-Verbose "Removing backup file $BackupFilePath"
    Remove-Item $BackupFilePath
}

#Remove permissions file
if (Test-Path $permissionsFile) {
    Write-Verbose "Removing permissions file $permissionsFile"
    Remove-Item $permissionsFile
}
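For reference, a manual run of the finished script (outside Jenkins) might look like the sketch below. The server, database and share names are placeholders; the parameter names match the ones declared in the script and used in the Jenkins build step later in this post.

```powershell
# Hypothetical manual invocation; substitute your own server, database and share names
.\Copy-Database.ps1 -SourceServer 'SqlProd1' `
    -SourceDatabase 'MyDB' `
    -TargetServer 'SqlDev1' `
    -TargetDatabase 'MyDB' `
    -BackupFilePath '\\FileShare\Backups\MyDB.bak' `
    -DBOwner 'sa' `
    -KeepPermissions `
    -Verbose
```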

Creating a new Jenkins job

Configuration check

Let’s reiterate and ensure that our Jenkins instance has all the necessary tools to implement the job we have in mind.

Plugins list

The following Jenkins plugins are going to be used in this job:

Agent setup

Ensure that the Jenkins executor(s) you intend to use run under an account that has been granted sysadmin privileges on both the source and target SQL Servers. Make sure to confine the job to those particular executor(s) using labels.

Also, let’s ensure that the executor node(s) have the latest version of dbatools installed:

Install-Module dbatools

Finally, let’s copy our Copy-Database.ps1 script to a local folder using the exact same path across all the Jenkins nodes, and ensure that the job executor has read/write access to that folder. If you want to use a remote folder (file share) instead, make sure that your Execution Policy is set to RemoteSigned on each node: Set-ExecutionPolicy -ExecutionPolicy RemoteSigned.

For advanced Jenkins users, I would recommend storing your code in a Source Control Management repository and retrieving it using the SCM section of the job definition.

Creating the job

Finally, it’s time to create the job! Start by clicking the ‘New Item’ link on the left-hand side of the Jenkins window:


Input a new project name and make sure to choose Freestyle project, which is the most generic Jenkins job type:


In the job configuration window, set the project to be parameterized (click the corresponding checkbox). Add four String parameters (server and database names for both source and destination) and two Boolean parameters (these will represent our optional switches):


The name of each added parameter will become an environment variable, so make sure not to use special characters in parameter names.


Provide a descriptive definition for the parameters:

  • String parameter 1
    • Name: SourceServer
    • Default value:
    • Description: Full server name of the source instance
    • Trim the string: Yes
  • String parameter 2
    • Name: SourceDatabase
    • Default value:
    • Description: Name of the source database to copy
    • Trim the string: Yes
  • String parameter 3
    • Name: TargetServer
    • Default value:
    • Description: Full server name of the target instance
    • Trim the string: Yes
  • String parameter 4
    • Name: TargetDatabase
    • Default value:
    • Description: Name of the target database. Will be replaced if it exists.
    • Trim the string: Yes
  • Boolean parameter 1
    • Name: KeepPermissions
    • Default value: Unchecked
    • Description: Check if you want to re-apply all previously existing permissions on top of the restored database
  • Boolean parameter 2
    • Name: DropUsers
    • Default value: Unchecked
    • Description: Check if you want to drop all the users from the target database upon restore.

If you want this project to run only on specific agent nodes, click Restrict where this project can be run and provide a space-separated list of labels in the field:


Now that we have our job parameterized and running only on specific nodes, the only thing left is to pass those parameters to the PowerShell script. In the Build section, click Add build step and select Windows PowerShell:


In the Command section, let’s put a call to our script file, which should already be accessible to the agent nodes, and pass all the parameters from the job:

& D:\Scripts\Copy-Database.ps1 -SourceServer $env:SourceServer `
  -TargetServer $env:TargetServer `
  -SourceDatabase $env:SourceDatabase `
  -TargetDatabase $env:TargetDatabase `
  -KeepPermissions:([bool]::Parse($env:KeepPermissions)) `
  -DropUsers:([bool]::Parse($env:DropUsers)) `
  -BackupFilePath "\\Some\Path\$($env:SourceDatabase).bak" `
  -DBOwner sa `
  -Verbose

Several things to note about this script:

  • $env:SourceServer is a reference to the environment variable defined by the job parameter of the same name.
  • [bool]::Parse($env:KeepPermissions) converts the job’s parameter to [bool]. All job parameters are strings under the hood, and Boolean parameters carry one of two values: ‘true’ or ‘false’. We simply convert them to [bool] using the static Parse method of the [bool] class.
  • -DropUsers:([bool]::Parse($env:DropUsers)) passes a boolean value to a switch parameter, which requires the colon notation.
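To see the string-to-switch conversion and the colon notation in isolation, here is a minimal, self-contained sketch; Test-Switch is a throwaway function invented purely for this demonstration:

```powershell
# Jenkins exposes a Boolean job parameter as the string 'true' or 'false'
$env:KeepPermissions = 'true'

# Throwaway function made up for this demo; it only echoes its switch
function Test-Switch {
    param([switch]$KeepPermissions)
    "KeepPermissions is $KeepPermissions"
}

# The colon notation binds an explicit boolean value to the switch parameter
Test-Switch -KeepPermissions:([bool]::Parse($env:KeepPermissions))
# Outputs: KeepPermissions is True
```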

Save the job by pressing the Save button on the bottom of the screen.

Running the job

Let’s start the job now! Click Build with Parameters from the job overview screen:


Input all the parameters and start the job:


In the Build History section of the page, you will see the progress of the job. Clicking on it reveals the details, such as the Console Output:


Conclusion

There are multiple ways of introducing automation to your environment, and PowerShell is definitely one of them. Yet it is sometimes hard for people to embrace automation without a simple UI to help them take the first step.

Jenkins is one such tool: it enables anyone to use automated solutions with little to no knowledge of the code that is executed behind the curtain. It can be just a frontend for your PowerShell scripts, or it can become the ultimate tool in your hands, controlling multiple aspects of your day-to-day tasks and integrating with your whole enterprise.

Cheers! And stay tuned – there is more to come!

Restore database job – SQL Server Jenkins labs (Part 1)

Hello there, and welcome to SQL Server Jenkins labs!

Some time ago I talked about using Jenkins as a robust scheduling tool, and now it is high time to follow up on that and demonstrate some of the solutions that can be achieved by combining PowerShell, dbatools and Jenkins.

The first example to go down that pipe is a simple backup restore job. The job will get the backup history from the source server, build a proper restore sequence and perform the restore on a server of your choosing. The restore step will also:

  • Change file paths of the database files to a default data/log directory
  • Rename database files to match the new database name
  • Optionally, perform a point-in-time recovery

Preparations

First, let’s ensure that our Jenkins instance has all the necessary tools to implement this job.

Plugins list

The following Jenkins plugins are going to be used in this job:

You can install plugins by clicking Manage Jenkins in the left-hand side menu and choosing Manage Plugins in the list. On the Plugin Manager page, switch to the Available tab and use the Filter textbox to search for the plugins you want to install.


After you have selected the plugins, click Install without restart. Most of the plugins do not require a restart; however, some of them will not work properly until the whole Jenkins instance is restarted.

Agent setup

Ensure that the Jenkins executor(s) you intend to use run under an account that has been granted sysadmin privileges on both the source and target SQL Servers. Depending on your agent type, this can be configured in Manage Jenkins -> Manage Nodes (Windows service agent) or in the Services console on your server (Java agent).

If you have more than one agent, make sure that all of the agents are properly configured; otherwise, set up agent labels that allow you to confine the job to particular executor(s).

Also, let’s ensure that the executor node(s) have the latest version of dbatools installed:

Install-Module dbatools

Creating a new Jenkins job

Start by clicking the ‘New Item’ link on the left-hand side of the Jenkins window:


Input a new project name and make sure to choose Freestyle project, which is the most generic Jenkins job type:


In the job configuration window, set the project to be parameterized (click the corresponding checkbox). Add five String parameters by clicking the Add Parameter button; these parameters will represent server and database names for both source and destination, as well as a restore time parameter for point-in-time recovery:


The name of each added parameter will become an environment variable, so make sure not to use special characters in parameter names.


Provide a descriptive definition for the parameters:

  • String parameter 1
    • Name: SourceServer
    • Default value:
    • Description: Full server name of the source instance
    • Trim the string: Yes
  • String parameter 2
    • Name: SourceDatabase
    • Default value:
    • Description: Name of the source database to copy
    • Trim the string: Yes
  • String parameter 3
    • Name: TargetServer
    • Default value:
    • Description: Full server name of the target instance
    • Trim the string: Yes
  • String parameter 4
    • Name: TargetDatabase
    • Default value:
    • Description: Name of the target database. Will be replaced if it exists.
    • Trim the string: Yes
  • String parameter 5
    • Name: ToDate
    • Default value:
    • Description: Specify a date/time if you want the database restored to a certain point in time, or leave blank to restore the latest backup
    • Trim the string: Yes

If you want this project to run only on specific agent nodes, click Restrict where this project can be run and provide a space-separated list of labels in the field:


Now that we have our job parameterized and running only on specific nodes, let’s add a PowerShell build step that will perform the actual restore. In the Build section of the job creation page, click Add build step and select Windows PowerShell:


In the Command textbox of the new Windows PowerShell item, insert the following code:

$p = @{
    SqlInstance = $env:TargetServer
    Database = $env:TargetDatabase
}
if ($env:ToDate) {
    $p += @{ RestoreTime = $env:ToDate}
}

Get-DbaBackupHistory -SqlInstance $env:SourceServer -Database $env:SourceDatabase | Restore-DbaDatabase @p -WithReplace -ReplaceDbNameInFile -TrustDbBackupHistory


This code calls two functions from the dbatools module:

  • Get-DbaBackupHistory – retrieve backup history from the source server. Parameters:
    • SqlInstance – source SQL Server name
    • Database – source database name, which would be used to search for backup records
  • Restore-DbaDatabase – out of all the backup history records, restore the most relevant files, starting with full and followed by diff and log backups. Parameters:
    • SqlInstance – target SQL Server name
    • Database – desired database name on the target server
    • RestoreTime – will restore the database to the specified date/time. This parameter is added dynamically, based on whether the ToDate job parameter was specified.
    • WithReplace – replace the target database if it exists
    • ReplaceDbNameInFile – replace the part of the filename that contains the source database name with a new name
    • TrustDbBackupHistory – skip reading backup headers from the files, relying instead on the information from the source server

The parameters of the job persist through the execution as environment variables, which is why we have to use the $env:VariableName notation to access them.
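The hashtable technique in the build step above is PowerShell splatting. As a self-contained illustration of the conditional pattern, with placeholder values standing in for real job parameters:

```powershell
# Conditional splatting sketch; 'TargetSql1' and 'MyDB' are placeholder names
$env:ToDate = ''                  # an empty string simulates a blank job parameter

$p = @{
    SqlInstance = 'TargetSql1'
    Database    = 'MyDB'
}
if ($env:ToDate) {                # empty or missing strings are falsy in PowerShell
    $p += @{ RestoreTime = $env:ToDate }
}

# @p expands the hashtable into named parameters, e.g. Restore-DbaDatabase @p
"Parameter count: $($p.Count)"    # 2 when ToDate is blank, 3 when it is set
```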

Save the job by pressing the Save button on the bottom of the screen.

Running the job

Let’s start the job now! Click Build with Parameters from the job overview screen:


Input all the parameters and start the job:


In the Build History section of the page, you will see the progress of the job. Clicking on it reveals the details, such as the Console Output:


Conclusion

In this example we used very few moving parts, yet they enabled us to create a simple and effective tool that, from now on, is going to save us hours when it comes to restoring databases.

Stay tuned for future updates!

Job schedule at a glance

I have often asked myself: could some of my SQL Server jobs be the reason for the database slowness users are currently experiencing? On multiple occasions that was indeed the case, and I had to reschedule some of them because too many were running at the same time. For example, re-indexing and full backups should really never run together, because that creates a huge strain on the server (and also makes your differential backups bigger). In order to see a clear picture of the jobs running on the server during the week, I created a report that helped me bring order (and justice) to my job schedules.

I present to you Job Schedule Matrix:


This is an SSRS report that utilizes a matrix object to display the frequency of the jobs running in your environment. It comes with an ETL script to extract data from your SQL Servers and upload it to the database of your choice. It can also support other job types, as long as you write your own ETL script and specify an appropriate Server Type.

Download it from here: https://github.com/nvarscar/schedulematrix-report

It might be a little slow when you tell it to show all of your jobs from a multitude of servers, but otherwise it works relatively quickly. There is also a chart-based version that looks slightly uglier, but is overall even faster.

Installation instructions are in the readme; let me know if you have any problems with it!

P.S. If you are using other server types (besides SQL Server), make sure to publish instance names even if there is no such concept for that server type, as that is what you will eventually see in the report body.

Managing SQL Server login passwords with #dbatools

Just had a fantastic use case for one of the dbatools commands, Get-DbaLogin.

Imagine that you need to change the SQL password for a single login on all of the production machines in your environment. You could write a query and run it on multiple servers, but for that you would have to have all of your production servers registered under the same Registered Servers folder in SSMS. If that’s not the case, you’ll have to run it multiple times. Let’s see how easy it is to do in the PowerShell console:

#using CMS - getting all the servers with 'prod' in the urn path
$servers = (Get-DbaRegisteredServer -SqlInstance MyCMSInstanceName | Where { $_.Urn -match 'prod' } ).ServerName
#or manually: $servers = 'server1,server2,server3'.Split(',')
#Get login objects from each server
$logins = Get-DbaLogin -SqlInstance $servers -Login accountname
#Set passwords
foreach ($l in $logins) { $l.ChangePassword('MySecureP@ssw0rd'); $l.Alter(); Write-Host "Changed password on" $l.SqlInstance }
#Check status
$logins.Refresh()
$logins | Select SqlInstance, DateLastModified, IsLocked

What if some of the logins became locked out because of apps constantly hitting the servers in the background? In this case, SSMS would force you to change the password when unlocking the account (or to disable the password policy). If we check the ChangePassword method, we will see that it has several overloads, and with one of them we can change the passwords again and unlock the accounts at the same time:

$logins[0].ChangePassword

OverloadDefinitions
-------------------
void ChangePassword(string newPassword)
void ChangePassword(securestring newPassword)
void ChangePassword(string oldPassword, string newPassword)
void ChangePassword(securestring oldPassword, securestring newPassword)
void ChangePassword(string newPassword, bool unlock, bool mustChange)
void ChangePassword(securestring newPassword, bool unlock, bool mustChange)

Let’s ensure that all the accounts are unlocked and that a password change is not required:

foreach ($l in $logins) { $l.ChangePassword('MySecureP@ssw0rd', $true, $false); $l.Alter(); Write-Host "Changed password on" $l.SqlInstance }

Enjoy!