Wednesday, March 3, 2010

Using Powershell in a SQL Server 2008 job step


I have been using Powershell since I first got wind of the beta and realized that all of my Windows Scripting Host skills would become obsolete. 

However, I've never had the opportunity to try out the Powershell job step type for SQL Agent jobs until today. I know that the Powershell that "comes with SQL Server" isn't the same as the regular Powershell but, other than some fiddling around with the SQL Server file system provider, I hadn't bumped up against those differences before. It's just different enough to be frustrating, so I'm blogging about it. It's a bit of a rant, but maybe it will save someone some time, or someone can point out something that I've overlooked.


My goal is to have a simple script that I can use in a SQL Agent job to automatically discover and back up any SSAS database that exists on the same computer as the SQL Agent. (This is an improvement over my old method, which used hard-coded XMLA queries to back up specific databases. That is OK if you aren't adding SSAS databases very often, and it will remain my default method on SQL Server 2005 servers because SQL Server 2005 doesn't support Powershell in job steps. I suppose that I might be able to code up something using VBScript and COM, but it's boring and I don't have much call for it.)


Hunting around the web, I found an interesting script authored by Idera. Idera's script isn't exactly what I wanted, and it is designed to be called from a Powershell shell/console, so I read their script to see how they did it and then wrote my own. It's not rocket science. In the process of getting my script working, I ran into a few quirks of how Powershell job steps work, and I figured that it was worth a blog entry.


Issue #1:

The first interesting thing is that you can't just use write-host "Hello, World". This is because the SQL Agent doesn't implement a console. There are details on that here. For simple text output to the job step output log, use something like write-output "Hello, World" or write-output $host.
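
Put another way, the commented-out line below is the one that fails under the SQL Agent, while the write-output lines land in the job step output log just fine:

# write-host "Hello, World"   <-- fails under the SQL Agent because there is no console

write-output "Hello, World"

write-output $host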


I like to use a liberal sprinkling of write-verbose in my scripts. It provides a good troubleshooting tool and, if I provide enough write-verbose statements in my script, I find that I rarely have to resort to write-debug or a real debugger with breakpoints when I have problems with a script.


The thing is that write-verbose doesn't work in a job step. In other words, this won't work:

$verbose = $true

write-verbose "this is verbose"


This won't work, either:

$verbose = $true

write-verbose "this is verbose" -verbose:$verbose


So, to sum up, write-host, write-verbose, write-debug and their ilk are out. Those commands want to write to a console, and there is no console when running under a job step.
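
If you really want verbose-style tracing in a job step, one workaround is to funnel the messages through write-output in a small helper function. This is only a sketch of the idea; Write-JobVerbose is a name I made up, not a built-in cmdlet:

# a home-grown stand-in for write-verbose that writes to the job step output log
function Write-JobVerbose([string] $message) {
# only emit the message when the script-level $verbose flag is set
if ($verbose) { write-output ("VERBOSE: {0}" -f $message) }
}

$verbose = $true
Write-JobVerbose "this is verbose"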


Issue #2:

The next thing was that the assemblies that I wanted to use are not pre-loaded by the environment. That doesn't seem like such a bad thing in itself, but this is the error that the job step reports when the assembly isn't loaded:


A job step received an error at line 5 in a PowerShell script. The corresponding line is '[Microsoft.AnalysisServices.Server] $server = New-Object ([Microsoft.AnalysisServices.Server]) '. Correct the script and reschedule the job. The error information returned by PowerShell is: 'Unable to find type [Microsoft.AnalysisServices.Server]: make sure that the assembly containing this type is loaded.'


My solution for that was to call LoadWithPartialName().
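
In the script, that boils down to these two lines (they appear again in the full script at the end of this post):

# load the Analysis Services (AMO) assembly before referring to any of its types
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null

# with the assembly loaded, the type from the error message above can be found
[Microsoft.AnalysisServices.Server] $server = New-Object ([Microsoft.AnalysisServices.Server])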

Issue #3:

Inside of a job step, Powershell seems fussier about enumerating a collection directly from a property. For example, this doesn't work in a job step, but it does run in regular Powershell 2.0:

foreach ($d in $server.Databases) { Do-Stuff }

But this does work:

$databases = $server.Databases

foreach ($d in $databases) { Do-Stuff }

Issue #4:

There is no developer support for writing Powershell scripts inside of SSMS. Basically, you just get a simple edit control to type or paste your script into. You can't set breakpoints or watch variables, and you don't get IntelliSense or even syntax highlighting. There is no "Parse" button to check your syntax. This matches what you get with the "ActiveX" scripting step.

The lack of debugging support led me to many iterations of what we old-timers used to call "debug by printf" or "debug by lprint", now resurrected as "debug by write-output", and cycles of edit-save-run-fail-look.at.output.log.

I could have developed the script in notepad++ and then copied-and-pasted it into the job step, but then the syntax quirks would pop up. It is a lot easier to fix one syntax quirk when you add one line at a time than when you try to add a whole 30-line script in one go, particularly if the syntax rules are not quite what you are used to.

Issue #5:

The "syntax error" messages can be misleading. The little editor window that SSMS provides does not show line numbers. If you copy-and-paste the code into an editor, it seems that the line numbers given for errors do not correspond to the location of problematic lines within the script.

Issue #6:

An innocent-looking thing like the following line causes a syntax error:

# $($causeAnError)

I think that the SQL Agent thinks that there is some sort of embedded job step token in there, even though the line is commented out. If you remove the first $ and the parentheses, the syntax error disappears.
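
For the record, this is the version of the comment that the job step accepts without complaint:

# $causeAnError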

Issue #7:

It seems that it is harder to put ad-hoc strings together with inline "$()" interpolation. So, instead of this:

foreach ($d in $databases ) {

write-output "$($d.Name)"
}

I find myself doing things like this:

foreach ($d in $databases ) {

$msg = "{0}" -f $d.Name
write-output $msg
}

Issue #8:

I could not call Get-Date or [DateTime]::Now. I don't know why; they are two different ways of doing the same thing: getting the current time, which is nice to have in auditing/logging output.
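
For reference, this is the sort of thing I would normally write, and neither form worked for me in a job step:

# both of these are fine in a regular Powershell console, but failed for me in a job step
$now = Get-Date
$now = [DateTime]::Now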


FWIW, my final script looks like the following. There are no national secrets here, but there are two caveats:

  • Watch out for any word-wrap
  • I'm still testing this, so I wouldn't rely on it for a production system without vetting and testing it first.

Here is the script:

$msg = "Starting "

write-output $msg


# The SSAS assembly isn't loaded for you

[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null


# create an instance of the backup object

[Microsoft.AnalysisServices.BackupInfo] $serverBackup = New-Object ([Microsoft.AnalysisServices.BackupInfo])


# create an instance of the Server object and connect to the local, default SSAS instance

[Microsoft.AnalysisServices.Server] $server = New-Object ([Microsoft.AnalysisServices.Server])

$server.Connect(".")


# dump out some stuff to show that we are actually connected

write-output $server


# where do the backups go?

$backupDestination = $server.ServerProperties.Item("BackupDir").Value

if (-not $backupDestination.EndsWith("\")) {

$backupDestination += "\"
}

write-output "The backup destination folder is '$BackupDestination'."


# Set the standard attributes of our backups.

$serverBackup.AllowOverwrite = $true

$serverBackup.ApplyCompression = $true

$serverBackup.BackupRemotePartitions = $true


$databases = $server.Databases

$msg = "There are {0} databases to back up." -f $databases.Count

write-output $msg

foreach ($d in $databases) {

$serverBackup.file = "{0}{1}.abf" -f $backupDestination, $d.name
$msg = "Backing up '{0}' to '{1}'" -f $d.Name, $serverBackup.file
write-output $msg
$d.Backup($serverBackup)
}

$server.Disconnect()


$msg = "Ending "

write-output $msg