Anyone who has used sqlio knows that its output files are not particularly readable. I used to run sqlio tests manually, then retype the output into a spreadsheet so that I could more easily analyze the data. This proved time-consuming, error-prone and boring, so I wrote a script that parses all of the data that I want from the sqlio output file. This post provides that code, and some background on how I use it. I will return to my series on my Powershell set up next time.
The first thing to do is to run your tests with sqlio and get its results into files. This is left as an exercise for the reader. I have a .cmd file that I use to do this; I use a .cmd file because most of the machines that I test do not have Powershell installed on them. That file is included in the SQLIO.scripts.zip file.
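The .cmd file does roughly the equivalent of the following loop. I am showing it here in Powershell only because it is easier to read than batch; the real file is plain batch for the reason above, and the test file name, duration and flag choices below are illustrative rather than the exact ones I use.

# Parameter sweep: read/write x random/sequential x block size x queue depth.
# Each combination writes its raw sqlio output to its own text file.
$target = 'E:\sqlio_testfile.dat'   # hypothetical test file

foreach ($rw in 'R', 'W') {
    foreach ($pattern in 'random', 'sequential') {
        foreach ($block in 8, 64, 256) {
            foreach ($queue in 1, 2, 4, 8) {
                $out = "sqlio_$($rw)_$($pattern)_$($block)k_q$($queue).txt"
                & sqlio.exe -k$rw -s120 -f$pattern -b$block -o$queue -t4 -LS -BN $target > $out
            }
        }
    }
}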
The reason that I break the running of the tests and the parsing of the output into two separate processes is to decouple them. It can take quite a while for sqlio to run, especially if you are running tests at multiple block sizes or varying between random and sequential access or reads and writes. The idea is to get all the data first; then I can parse at my leisure. If something goes wrong with the parsing, I don't have to rerun my entire test. This also allows me to tweak the parsing script as I go along, without having to re-run the data collection phase.
Running tests is tedious. If all you want to do is try the parsing script, I have included some sample output that you can use in the sqlio.results.zip file.
The second thing to do is to run the script that parses a raw output file and stores the parsed data in a Powershell object.
That command might look something like this:
$result = ls "*.txt" | %{Parse-SQLIOOutput -file $_}
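If you just want to see the general approach before downloading anything, here is a minimal sketch of the kind of parsing the real script does. The property names and regular expressions here are my own illustration of the technique, not necessarily what the downloadable version uses.

function Parse-SQLIOOutput {
    param ([string] $file)

    # Read the whole output file as one string so we can regex across lines.
    $text = Get-Content $file | Out-String

    $result = New-Object System.Management.Automation.PSObject
    $result | Add-Member NoteProperty File $file

    # sqlio echoes the parameters it ran with, e.g. "using 64KB random IOs",
    # "with 8 outstanding" and "4 threads reading for 120 secs from file E:\testfile.dat".
    if ($text -match '(\d+)KB (random|sequential)') {
        $result | Add-Member NoteProperty BlockSize ([int] $matches[1])
        $result | Add-Member NoteProperty RandomOrSequential $matches[2]
    }
    if ($text -match 'with (\d+) outstanding') {
        $result | Add-Member NoteProperty QueueDepth ([int] $matches[1])
    }
    if ($text -match 'threads? (reading|writing)') {
        $result | Add-Member NoteProperty ReadOrWrite $matches[1]
    }
    if ($text -match 'from file ([A-Za-z]):\\') {
        $result | Add-Member NoteProperty Drive $matches[1]
    }

    # The cumulative throughput and latency summary.
    if ($text -match 'IOs/sec:\s+([\d\.]+)') {
        $result | Add-Member NoteProperty IOsPerSec ([double] $matches[1])
    }
    if ($text -match 'MBs/sec:\s+([\d\.]+)') {
        $result | Add-Member NoteProperty MBsPerSec ([double] $matches[1])
    }
    if ($text -match 'Avg_Latency\(ms\):\s+(\d+)') {
        $result | Add-Member NoteProperty AvgLatencyMS ([int] $matches[1])
    }

    $result
}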
The last thing that we need to do is to analyze the data. Storing the data in a Powershell object allows us to use all of Powershell's features for producing reports, or to funnel the data into another tool for analysis.
For example, we can use where (Where-Object) to restrict the output, or we could store the data in an Excel file or a SQL Server table. A command to look at the data on a command line might look something like this:
$result | sort Drive,RandomOrSequential,ReadOrWrite,BlockSize,QueueDepth | format-list *
Or we could just sort it and send it to Microsoft Excel:
$result | sort Drive,RandomOrSequential,ReadOrWrite,BlockSize,QueueDepth | Out-Excel
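As a concrete example of restricting the output, here is one way to pull out just the 8 KB random reads and save them to a CSV for later. The property names below assume the ones used in the sketch above.

# Keep only the 8 KB random read results, ordered by queue depth.
$result |
    where { $_.BlockSize -eq 8 -and $_.RandomOrSequential -eq 'random' -and $_.ReadOrWrite -eq 'reading' } |
    sort QueueDepth |
    Export-Csv 'sqlio_8k_random_read.csv' -NoTypeInformation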
That's all there is to my script. I hope that you download the files and give it a try.
It is worth mentioning that Linchi Shea has a perl script that does more-or-less the same thing as my script. I haven't tried it myself, but Linchi does good stuff and you should definitely check it out if you are into perl.
Friday, May 30, 2008
Wednesday, May 28, 2008
My Powershell set up, Part 2 of 2
Last time, I explained that my profile consists of several files, with various functions (and associated aliases) broken out by functionality.
Each of these "subprofiles" is loaded when my standard Powershell profile runs.
. "$env:winbin\psh\profile.Security.ps1"
. "$env:winbin\psh\profile.RemoteService.ps1"
. "$env:winbin\psh\profile.SQLServer.ps1"
. "$env:winbin\psh\profile.dstrait.ps1"
. "$env:winbin\psh\profile.$($env:userdomain).ps1"
The last line will load a client-specific file, depending on the credentials that I am currently using.
Some of the functions in these files are mine and some were borrowed from various web sites. When I borrow a function, I add a small comment header that describes where it came from. That lets me go back and look for more interesting scripts from that author, look for improved versions, and give credit where credit is due. Some of these functions are heavily modified versions of the originals; some have been modified so much that they are probably unrecognizable.
Here is a listing of functions, grouped by Filename. For my next posting in this series, I will pick one of these functions and explain what I use it for and how it works.
profile.dstrait.ps1
-------------------
Change-VolumeLabel
Convert-WMIDateToSystemDate
Find-File
Get-DiskUsage
Get-Guihelp
Get-Hardware
Get-LiveHelp
Get-LogicalDisk
Get-Range
Get-RecyclerSize
Get-RemoteDrives
Get-Uptime
Manage-Computer
Out-Excel
Ping-Computer
prompt
Report-ODBCDriver
Report-ODBCSystemDSN
Should-Process
tree
Watch-Command
whoami
profile.Legion.ps1
------------------
Archive-SQLAgentJob
Archive-SQLDatabase
Archive-SQLDTSPackage
Archive-SQLStoredProcedure
Copy-SQLDTSPackage
Document-Ticket
Export-WinbinToLegion.line
Fix-SQLAgentJobStep
Fix-SQLDatabaseOwner
Get-DomainSQLServerList.line
Get-Ticket
Report-LatestBackups
Report-SQLAgentJobsWithNonDefaultOwners
Report-SQLServerRolesWithUnusualMembers
Report-SQLServerSpaceUsage
ScriptOut-SQLAgentJob
profile.RemoteService.ps1
-------------------------
List-RemoteService
Pause-RemoteService
Start-RemoteService
Stop-RemoteService
profile.Security.ps1
--------------------
Add-ComputerGroupMember
Check-IsDomainComputer
Check-IsDomainGroup
Check-IsDomainUser
List-ComputerGroupMembers
List-DomainComputers
List-DomainGroupMembers
List-DomainGroupsForUser
Remove-ComputerGroupMember
Search-ActiveDirectory
profile.SQLServer.ps1
---------------------
Execute-SQLAgentJob
Execute-SQLDBCCUpdateUsage
Execute-SQLQuery
Get-SQLServerList
Get-SQLStatus
Get-SQLVirtualNodes
Invoke-SQLQuery
Out-SQLDatabase
Query-ServerCatalog
Report-SQLAgentJob
Report-SQLAgentJobHistory
Report-SQLDatabase
Report-SQLDatabaseObject
Report-SQLDatabaseTable
Report-SQLDTSPackage
Report-SQLLinkedServer
Report-SQLOfflineDatabase
Report-SQLServerMemory.line
Report-SQLServerSearch
Wednesday, May 21, 2008
My Powershell set up, Part 1 of 2
This posting gives an overview how I have set up Powershell on my workstation.
Any heavy shell user normally customizes his (or her) environment. These environments are (and will likely always be) a work in progress. As you learn things and write additional functions to help you with your tasks, your environment will grow and change. I find that I am tweaking things even as I write this posting.
Aside: One of the things that I am forever tweaking is function names. I try to follow the Powershell verb-noun convention, but there are lots of spots where I have had to make judgment calls. It's not hard to find inconsistencies. For example, I vacillate between "Report", "List" and "Get". I'm not crazy about Get. My issue with Get is that it seems that Get should return an object and be used like this:
$result = Get-Thingy
$result.MemberFunction $param
$result = $null
whereas I run a lot of stuff that just returns information.
A 'Report' is usually more complicated than a simple 'List', which is often a single column.
What's better "Invoke-SQLQuery" or "Get-SQLResults"?
Anyway, let's get back on track.
Firstly, I am still running Powershell V1.0. I haven't spent any time with the Powershell 2.0 CTP, though I have read a few introductory articles on the new features. I have had issues with other Microsoft CTPs in the past and I'm loath to install something on a 'production' computer that I might have a hard time removing. I am also dismayed by the decision not to support remote execution under Windows XP or Windows Server 2003. None of my clients have deployed Vista on more than a handful of workstations, and none of them have even started looking at Windows Server 2008. They will not be doing so just to run Powershell.
Secondly, I use Powertab. Powertab provides pop-up help for functions and files, just by hitting the tab key.
Thirdly, I use the Powershell Community Extensions (PSCX). PSCX provides lots of small, handy functions.
Lastly, my profile consists of several files, with various functions (and associated aliases) broken out by functionality. I have two reasons for breaking them out. The first reason is that my original profile was getting very large. My main profile is over 79 KB right now. The second reason is that I try to break out stuff that is specific to a client and not useful anywhere else.
Next time, I will go into some detail on how my profile works.
Sunday, May 18, 2008
First Post
So, belatedly, I become a blogger.
It's not that I don't like to write. I like to write very much. Too much, really. Anyone who has been the victim of one of my emails or read one of my postings on the old Swynk SQL DBA list can vouch for that.
The primary thing that has always held me back from blogging is a simple question: "What can I say about SQL Server that others have not already said?". There is a wealth of great blogs and articles discussing SQL Server, and there is little point in piling on.
The one thing that I know better than any other DBA that I have come across is Powershell. (It's true that I don't get out much.) I have had a few people comment on how quickly I can slap together reports, push changes to databases and do other tasks. A key reason that I can do this is Powershell.
So, we'll push on with powershell for a while and see how the page hits work out. There may be a few side trips along the way, as I have a few useful scripts written in Transact-SQL and in VBScript and my main 'thing' is actually performance engineering.
First, a little background. I am a long-time (ten years) SQL Server DBA but, let's face it, Transact-SQL isn't appropriate for everything. In prior lives, I worked as a developer and wrote in a variety of languages (C and various dialects of BASIC being the most famous, PL/1 and HPL possibly being the least famous). Transact-SQL has always felt like a second-class language to me. The most obvious places where TSQL falls down are file manipulation, exporting data, error handling, program automation through COM, WMI, use of SQL-DMO, coupling and the like. Sometimes TSQL isn't appropriate simply because there isn't any SQL Server in the picture: using TSQL to write a report of free disk space on a group of servers, for example.
(I have also written a fair amount of code in "batch language", the native shell language of DOS and Windows. Such scripts are often syntactically mindbending, in a Lovecraftian sense; they are hard to debug and should not be mentioned in polite company.)
In the late nineties, I started using VBScript in conjunction with Windows Scripting Host (WSH). This allowed me to write scripts to do things like search directory hierarchies to delete old files, script out SQL Server objects, fix job step output file paths, write change scripts, check disk space and do all sorts of other useful things. WSH is a good way to write small scripts that can manipulate files, perform simple-but-repetitive tasks, and automate programs through COM. WSH still has its uses because powershell is not widely deployed yet.
A few years ago, with the advent of .Net and the tendency towards deprecation of COM, I started looking around for a new language. I've tried several different things. For me, Jscript isn't any more useful than VBscript. I do not think that I have ever written anything longer than 10 lines in bash, though I was a long-time user of cygwin's bash and I maintain a couple of linux systems at home. I have worked on a few small things in perl, but I just never felt comfortable with it. Lots of smart people use python (and ruby), but it's more non-Microsoft stuff to install on a new computer.
I started using powershell about two years ago. At first, I used it simply as a replacement for the old Windows command shell. As I learned more of the language, I found myself using it more and more for my daily tasks. It was very easy to get started and to increase my skills as time went on and I discovered how to exploit powershell's features. In short order, I was writing scripts. Then I started using WMI, then COM, then .net objects. It's easy to get started and just go along at your own rate.
I'm not going to spend much time cheerleading for powershell. In fact here are the bullet points for my whole speech:
I believe that powershell is the best thing Microsoft has done in a long while.
I think that if you are a SQL Server admin, or any sort of network or Active Directory administrator, and you have any tendencies towards scripting (as opposed to point-and-click use of GUI-based admin tools), you should be at least competent in powershell.
If you are working in Microsoft's universe, fighting Microsoft's tide is usually unsuccessful.
Microsoft seems to be trying its damnedest to get you to use powershell.
I am not going to try to teach powershell basics because there are plenty of good resources online. I don't own any powershell books, so I can't really make any suggestions. I'm used to teaching myself things, often under pressure, but a book or books might work well for you.
I am not going to show you lots of gee-whiz syntax or how to write terse and incomprehensible one-liners because I don't see much use in that, as far as trying to teach goes. It's fun as an intellectual exercise. It's pretty easy to turn everything into a one-liner, if you hide it in a function.
What am I going to do? A few examples:
I have a script that can report information on storage (sizes, free space, percent free, etc.) quickly, from a command line. With some powershell trickery, I can easily run this report on one server or dozens. The great thing about this is that the output from each server is coalesced into a single collection of queryable objects. I can dump the output to the screen, or stick it in an Excel file or a table.
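To give a flavor of what I mean (this is a stripped-down sketch, not the script itself, and the server names are made up), the core of such a storage report can be as simple as a WMI query per computer, with each row tagged with the computer it came from:

# Computer names could come from a text file, a database table or another function.
$computers = 'SERVER01', 'SERVER02'

$report = $computers | foreach {
    $computer = $_
    # DriveType = 3 restricts the query to local fixed disks.
    Get-WmiObject Win32_LogicalDisk -ComputerName $computer -Filter 'DriveType = 3' | foreach {
        $disk = New-Object System.Management.Automation.PSObject
        $disk | Add-Member NoteProperty Computer $computer
        $disk | Add-Member NoteProperty Drive $_.DeviceID
        $disk | Add-Member NoteProperty SizeGB ([math]::Round($_.Size / 1GB, 1))
        $disk | Add-Member NoteProperty FreeGB ([math]::Round($_.FreeSpace / 1GB, 1))
        $disk | Add-Member NoteProperty PercentFree ([math]::Round(100 * $_.FreeSpace / $_.Size, 1))
        $disk
    }
}

# One queryable collection for all of the servers.
$report | sort PercentFree | format-table -AutoSize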
I have a script that can run an ad-hoc SQL query on any instance or group of instances I choose. Similarly, I can coalesce the results into a single collection. This script can be fed instance names through a pipeline, so the list of instances can live in any form of permanent storage.
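Here is a sketch of the idea, with made-up function, server and query names; the real script in my profile has more error handling and options than this:

function Invoke-AdHocQuery {
    param ([string] $instance, [string] $query)

    # Plain ADO.NET; nothing beyond the .Net Framework is required on the client.
    $connection = New-Object System.Data.SqlClient.SqlConnection
    $connection.ConnectionString = "Server=$instance;Database=master;Integrated Security=SSPI"
    $adapter = New-Object System.Data.SqlClient.SqlDataAdapter($query, $connection)
    $table = New-Object System.Data.DataTable
    [void] $adapter.Fill($table)

    # Tag each row with the instance it came from, so the coalesced
    # collection still tells you where each row originated.
    foreach ($row in $table.Rows) {
        $row | Add-Member NoteProperty Instance $instance -PassThru
    }
}

# Instance names can come from any pipeline source: a file, a table, a hard-coded list.
$rows = 'SERVER01', 'SERVER01\INST2' | foreach { Invoke-AdHocQuery $_ 'select @@version as Version' }
$rows | format-table Instance, Version -AutoSize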
I intend to split my postings between functions in my profile and scripts that do more complicated things. I also intend to keep the length of posts down to something shorter than this one.