Posts

Showing posts from 2008

Finding Partition Offsets Made Easier (with Powershell)

A while back, I read Linchi Shea's blog posting on finding partition offsets. I recently found some time to rework his C# file plus perl script solution into a single Powershell script. The upsides to my solution are: It's Powershell-only, so there is nothing to compile and you don't need to install perl. Using Powershell provides some added bling, for no real cost. It's easy to construct a pipeline that will interrogate many servers or will provide sorting or filtering. The major downside to my solution is: You must install Powershell. This shouldn't be a problem on Windows Vista or Windows Server 2008. Please be careful of the formatting. While embedding the code here is more convenient than a zip file, I'm still learning blogger and things may not wrap the way that I would like them to. I've also looked for a way to post syntax-highlighted Powershell code. There are solutions for other languages, but nothing for Powershell is jumping out at me so I will...
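The core of the idea can be sketched in a few lines of Powershell. This is a minimal illustration, not the script from the post: the function name and the property selection are my own, and it assumes WMI access to the target servers.

```powershell
# Hypothetical sketch: read partition starting offsets via WMI.
function Get-PartitionOffset {
    param([string[]] $ComputerName = @('.'))
    foreach ($computer in $ComputerName) {
        Get-WmiObject -Class Win32_DiskPartition -ComputerName $computer |
            Select-Object SystemName, Name, StartingOffset,
                @{Name = 'OffsetKB'; Expression = { $_.StartingOffset / 1KB }}
    }
}

# Pipeline-friendly usage: interrogate several servers, then sort or filter.
# Get-Content C:\sqlserver-list.txt |
#     ForEach-Object { Get-PartitionOffset $_ } |
#     Sort-Object OffsetKB | Format-Table -AutoSize
```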

I'm attending the upcoming Philly Code Camp

A bit of a plug for the local user groups: The next Philly Code Camp  will be held on October 11, 2008 at DeVry in Fort Washington.  I don't make it to the regular Philly SQL Server User Group meetings anymore because I'm working downtown and the drive to Malvern is painful, but I do try to make it to the Code Camps. I always learn something. I tend to bounce from track to track, depending on my interest level in a particular topic and how crowded the room is, but I'm always in line for lunch. ;-)

Back from the dead

I haven't had much going on lately, at least not anything of interest to anyone out there. It's been a long time between posts, but here are some new posts, albeit less technical than my earlier postings.

"Data Dude" Licensing changed; better development environments for all

One way or another, I've been fighting the lack of integration between "backend" and middle-tier (or client-tier) development environments for decades now. About a year ago, I was fairly thrilled with the introduction of the "Database Edition" of Visual Studio, A/K/A "Data Dude". I was immediately disappointed by the restrictive licensing and the cost. I've been using Data Dude tools for nearly a year. I think that they can be an absolute boon for developers of new applications or ones that are in "maintenance mode". Unfortunately, any discussion of Data Dude always involves its cost. It often seems like the cost of everything in the computing universe drops except for bandwidth and Microsoft products. It wasn't always this way. Some of my earliest professional IT work involved writing programs with QuickBasic 3.0 (and, later, with Microsoft BASIC using the "Professional Development System" A/K/A "PDS Version 7...

Legacy Connectivity in a 64 bit world, Part 5 of 5

And now, the final boot in the teeth... As I detailed in my last post, it was decided to segregate access to Visual FoxPro data by running OPENQUERY statements on a 32 bit SQL Server 2005 instance. I did all of my initial tests on my laptop, using a local SQL Server instance to access the FoxPro files stored on network file shares. Before I deployed this to a server, my initial thinking was that this wouldn't be too bad once we put support for delegation into place. Aside: For the uninitiated, delegation allows a SQL Server to use the Active Directory credentials (AKA "domain logins") of the currently-connected user to perform tasks. Essentially, you need to create an Active Directory object called a Service Principal Name or "SPN". This is not hard to set up. For me, the bigger chore was getting over the heebie-jeebies caused by the idea of having a service impersonate users. In our case, one server would be doing remote queries to another SQL Server. By putti...

Legacy Connectivity in a 64 bit world, part 4 of 5

This post continues my five part saga on connecting to legacy data formats. The third data source type (of four types) is Visual FoxPro. I haven't seen very much Fox since 1992, when I was a C programmer and worked with dBase, FoxPro and Clipper for a few months. I thought that everyone left FoxPro and dBase behind in the mid-1990's. Not so. Visual FoxPro was dead-ended by Microsoft a while back. Microsoft states that the ODBC drivers are deprecated and that the 32 bit OLEDB drivers should be used. I can pretty reliably bring down an instance with the ODBC drivers and I am very interested in using something besides these drivers. There are 32 bit OLEDB drivers freely available from Microsoft but it looks like there won't ever be any 64 bit drivers. All of the projects which I found on Sourceforge looked too shaky to consider running "for real", and I'm not in a position to start writing or supporting open-source OLEDB drivers. No third-party has stepped...

Legacy Connectivity in a 64 bit world, Part 3 of 5

The first hint of real trouble came when we started looking at the existing DB/2 connectivity. The second data source type (of four types) is DB/2. Currently, the application uses 32 bit ODBC drivers dating from the late 1990s. They were written by Wall Data, which was acquired by NetManage in 1999. NetManage's web site doesn't seem to acknowledge that these drivers existed or that NetManage supplies any sort of database connectivity software. One hint as to the age of the drivers: The drivers refer to AS/400, which was renamed to "iSeries" about, oh, forever ago. It turns out that there are a number of 64 bit OLEDB drivers for DB/2 available, including Microsoft's own DB/2 driver, which used to be part of its "Host Integration Server 2004" product, but is now freely downloadable. IBM's got drivers too. My initial problem wasn't finding drivers, it was getting the linked server to talk to the DB2 instance. No one involved understood...

Legacy Connectivity in a 64 bit world, Part 2 of 5

To continue from the last posting, a project that I am associated with needs to deal with four non-SQL Server data sources. I'll start with the easy one first... The first data source type is text files. We can get the data in these files into SQL Server with BULK INSERT (which seems deprecated in SQL Server 2005, but seems to still be in SQL Server 2008) or OPENROWSET using the BULK option. This is all native to SQL Server, so I'm not counting this as a major issue. Exporting to text is a problem, but there are only a few very simple examples of that in the codebase. We are trying to avoid use of xp_cmdshell for all of the usual reasons. For most of our issues, we can probably use SSIS or even a custom vbscript in a job. The worst-case scenario is a small CLR procedure. (Why-o-why doesn't SQL Server have a BULK EXPORT?) I'd prefer to keep these OPENQUERY statements in place because they are mostly in views and procedures stored in user databases. It's...

Legacy Connectivity in a 64 bit world, Part 1 of 5

One of my projects involves moving some very old applications running against a 32 bit SQL Server 2000 instance to a 64 bit SQL Server 2005 instance. When the project was initiated, I thought my largest problems would be: Differences in behavior between 2000 and 2005. Porting a couple of hundred DTS packages to SSIS. Wrong. It turns out that my biggest problem is a lack of 64 bit OLEDB drivers. I have had reason to spend a good deal of time researching the availability of 64 bit solutions for certain legacy data formats. This is a summary of that research and I hope that it saves someone some pain. Anyone could do what I have done here by spending time searching with Google. I don't know anything unique, nor do I have any insider knowledge. Still, I was surprised by the situation on the ground. An early version of this originally appeared on the Swynk email lists, specifically the sql2k5 list. It's been edited to tighten up some of my informal writing style. I have also ...

Powershell functions: whoami and Prompt

Two quick ones for this post, before the US 4th of July holiday. Both functions have one thing in common: use of environment variables. In order to see the environment variables that are available, just type: dir env: (This listing could be extensive. (dir env:).Count returns 51 on my laptop.) Environment variables aren't as popular as they used to be, with settings located in registry entries, .ini files and .xml files. For a few things, it's still easier to find their values through an environment variable than to figure out where they hide in the registry or google for the proper .net calls. When I set up a new machine, I add a path to the environment that points to my local copy of my scripts. This can be very handy since I still have some .vbs and .wsf scripts lurking around. whoami simply displays the name of the currently logged-in user. It's very simple, but it is also a good, simple example of how to use environment variables in a script. Here i...
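Both functions boil down to a line or two of environment-variable plumbing. A minimal sketch, assuming the usual USERDOMAIN/USERNAME variables are set; the exact prompt format here is my guess, not the one from the post:

```powershell
# whoami: display the currently logged-in user from environment variables.
function whoami {
    "$env:USERDOMAIN\$env:USERNAME"
}

# Prompt: a custom prompt built from the same environment variables.
function Prompt {
    "$env:USERNAME@$env:COMPUTERNAME $(Get-Location)> "
}
```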

Powershell function: Get-Uptime

For this posting, I present an uptime command. This is the sort of thing that most scripters wind up writing, I suppose. For me, the neat thing was how just a few lines of script produced a program that was more flexible and usable than an old, compiled EXE program that I had been carting around with me on a USB stick for years. A little bit of this code is a refactoring of code posted at BSOnPosh, but that page seems to have been taken down during a site re-arrangement and my search didn't find anything. IIRC, I wanted an easier way to get uptime information for more than one server, so I refactored some of his code (the WMIDate conversion, which I wanted to be able to re-use for other things, is sort of fiddly) and added my own code to loop over the pipeline input, to produce the TextDescription and to make the select-object code more readable by using the $fields variable to hold all of the necessary script blocks. Anyway, I present what I've got for your perusal. Persona...
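Since the BSOnPosh page is gone, here is a hedged reconstruction of the heart of such a function, including the fiddly WMI date conversion mentioned above. This is my own sketch, not the original code, and it assumes WMI access to the target machine:

```powershell
# Sketch of an uptime function built on Win32_OperatingSystem.
function Get-Uptime {
    param([string] $ComputerName = '.')
    $os = Get-WmiObject -Class Win32_OperatingSystem -ComputerName $ComputerName
    # The fiddly WMIDate conversion: turn a DMTF date string into a DateTime.
    $boot   = [System.Management.ManagementDateTimeConverter]::ToDateTime($os.LastBootUpTime)
    $uptime = (Get-Date) - $boot
    New-Object PSObject |
        Add-Member NoteProperty ComputerName $ComputerName -PassThru |
        Add-Member NoteProperty LastBoot $boot -PassThru |
        Add-Member NoteProperty TextDescription `
            ('{0} days, {1} hours, {2} minutes' -f $uptime.Days, $uptime.Hours, $uptime.Minutes) -PassThru
}

# Pipeline usage: Get-Content C:\sqlserver-list.txt | ForEach-Object { Get-Uptime $_ }
```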

A (simplistic) implementation of a watch command

Today's posting covers a function which mimics the 'watch' command, which is popular on Unix and Linux systems. The watch command will repetitively run a command line and display the output. This is useful for watching the progress of systems writing a series of files or waiting for a server to come online. Just start a shell, run the command and park the shell so it is still visible, but out of the way. Even though my function does not do a particularly thorough job mimicking the Unix watch command, I still get a lot of mileage out of it. This one is so simple that I am just going to paste it in here and hope that the blogging software doesn't drop the formatting. I'm still learning the quirks of blogger, so please bear with me. Here goes: ############################################################################## # Purpose: More-or-less mimic the popular unix "watch" command function Watch-Command { param ( [string] $command, [int] $i...
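The listing above is cut off by the excerpt, so here is a minimal sketch of the same idea. The second parameter name and the header line are my guesses; only $command appears in the truncated original:

```powershell
# More-or-less mimic the popular unix "watch" command (simplified sketch).
function Watch-Command {
    param(
        [string] $command,
        [int]    $interval = 2   # seconds between runs, like unix 'watch -n'
    )
    while ($true) {
        Clear-Host
        "Every ${interval}s: $command    $(Get-Date)"
        Invoke-Expression $command    # run the command line and show its output
        Start-Sleep -Seconds $interval
    }
}

# Example: watch a directory fill up with files.
# Watch-Command 'dir C:\temp | Measure-Object | Select-Object Count' 5
```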

Powershell Function: Get-LogicalDisk

In this post, I will discuss Get-LogicalDisk. Get-LogicalDisk is a function that retrieves disk information from remote servers via WMI. This allows for quick and flexible reporting on storage usage and availability. This is a bit of a departure from the "SQL Server"-oriented functionality that I have been describing in recent posts. In some shops, free disk space (or, more likely, the lack of it) is a daily battle. Every morning, I run a script that checks for many different kinds of "negative events" that might have happened overnight. A "negative event" might be a failed job, database growth or a server running low on disk space. The latter check calls Get-SQLServerList and Get-LogicalDisk to do all of the heavy work. The script is straightforward, so I'll just continue with my tradition of a brief overview with some examples. (This is a blog, after all, and not really suited to voluminous detail. FWIW, it is my opinion that detail on what a confusing p...
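The WMI call at the heart of a function like this is short. A minimal sketch, assuming WMI access; the property names come from the standard Win32_LogicalDisk class, the rest is illustrative:

```powershell
# Sketch: report local fixed disks (DriveType = 3) on a remote server.
function Get-LogicalDisk {
    param([string] $ComputerName = '.')
    Get-WmiObject -Class Win32_LogicalDisk -ComputerName $ComputerName -Filter 'DriveType = 3' |
        Select-Object SystemName, DeviceID, VolumeName,
            @{Name = 'SizeGB'; Expression = { [math]::Round($_.Size / 1GB, 1) }},
            @{Name = 'FreeGB'; Expression = { [math]::Round($_.FreeSpace / 1GB, 1) }}
}
```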
In my last post, I introduced Get-SQLServerList. In this post, I am going to describe some more ways to leverage the Get-SQLServerList and Invoke-SQLQuery functions. Basically, I am suggesting that you follow this formula: Step 1. Put your server names in a text file called "sqlserver-list.txt" and put the file in the root of your C drive. Step 2. dot-source the functions that I've given you. Step 3. Profit. (I'd like to remind you that I normally alias Get-SQLServerList as "ssl" and Invoke-SQLQuery as "iq". This is to save on typing. It's also worth pointing out that I normally use the "ft" alias instead of typing format-table. On with the show...) Let's say that your boss wants to know which 64 bit servers aren't at SP2. One way to do this would be to spend a lot of time manually connecting to each system, running a query and putting the results into a spreadsheet. Another way to do this is to go to the shared Word or Excel ...
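Following the formula above, that 64 bit/SP2 question becomes a one-liner. This pipeline is illustrative: the query aliases and the Where-Object test are mine, and it assumes the ssl/iq aliases described in the text are in place:

```powershell
# Which 64 bit servers aren't at SP2? (ssl = Get-SQLServerList, iq = Invoke-SQLQuery)
ssl | iq "SELECT SERVERPROPERTY('ServerName')   AS Server,
                 SERVERPROPERTY('Edition')      AS Edition,
                 SERVERPROPERTY('ProductLevel') AS SP" |
    Where-Object { $_.Edition -like '*64*' -and $_.SP -ne 'SP2' } |
    ft -AutoSize
```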

Powershell Function: Get-SQLServerList

In my last post, I described Invoke-SQLQuery. In this post, I describe Get-SQLServerList, which retrieves a list of SQL Servers. Get-SQLServerList leverages Invoke-SQLQuery and provides an easy way to provide server names to a pipeline. This time, the zip file for Invoke-SQLQuery contains two functions. The first function is Query-ServerCatalog. This function builds a simple SELECT statement and then uses Invoke-SQLQuery to run it. This is useful as an example of the sort of thing that you can do with Invoke-SQLQuery. Query-ServerCatalog will return a result set full of server names, assuming that you have a particular table in a particular database on a particular server that holds these names. Setting that up and keeping it current is a bit of work and probably is not worth doing unless you work in a large environment with a group of DBAs. The more interesting function is Get-SQLServerList. Some of the highlights of Get-SQLServerList are: It reads a text file to find the names of se...
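The text-file mode is easy to picture. A simplified sketch, using the C:\sqlserver-list.txt convention from these posts; the comment-skipping behavior is my own addition, not necessarily in the real function:

```powershell
# Sketch: read server names from a text file, one name per line.
function Get-SQLServerList {
    param([string] $Path = 'C:\sqlserver-list.txt')
    Get-Content $Path |
        Where-Object { $_ -and ($_ -notmatch '^\s*#') } |   # skip blanks and comments
        ForEach-Object { $_.Trim() }
}

# Get-SQLServerList | ForEach-Object { "checking $_" }
```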

Powershell Function: Invoke-SQLQuery and Ping-Computer

In this posting, I am going to (briefly) discuss Invoke-SQLQuery, which will run a SQL Server query against a computer or a group of computers and then return the results as a Powershell object. This object can be fed to another element in a pipeline, where it can be sliced, diced and stored using Powershell's intrinsic capabilities. About a year ago, I was looking for a way to return a result set to Powershell. It seemed that I was forever logging into a SQL Server instance, running one query, then logging out and repeating the whole thing on a different instance. One day, I read a posting on Powershell and SMO by Muthusamy Anantha Kumar. I noticed some ADO.NET code towards the bottom of the page and I saw the proverbial light: Powershell is to .Net objects what VBScript is to COM objects. Armed with that, I started working on a function that ultimately evolved into Invoke-SQLQuery. I'd like to point out the following facets of Invoke-SQLQuery: This function directly accept...
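The ADO.NET core of such a function is only a few lines. This is a bare-bones sketch under my own assumptions (integrated security, master database); the real Invoke-SQLQuery has more features than this:

```powershell
# Sketch: run a query via ADO.NET and return the rows as objects.
function Invoke-SQLQuery {
    param(
        [string] $ServerName,
        [string] $Query
    )
    $connection = New-Object System.Data.SqlClient.SqlConnection(
        "Server=$ServerName;Database=master;Integrated Security=SSPI")
    $adapter = New-Object System.Data.SqlClient.SqlDataAdapter($Query, $connection)
    $table   = New-Object System.Data.DataTable
    [void] $adapter.Fill($table)    # Fill opens and closes the connection
    $table.Rows                     # each DataRow flows down the pipeline
}

# Invoke-SQLQuery 'MYSERVER' 'SELECT @@VERSION AS Version'
```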

Powershell Function: Change-VolumeLabel

This posting continues with my series of articles that describes my Powershell environment by describing a function that I use to change volume labels on remote machines. A file with the function is available. I wrote this function because I needed a way to change labels associated with volumes on remote machines. Earlier versions of this function were actually written in VBScript. I rewrote that old code in Powershell because I wanted more consistency with the "standard" Powershell command line switches and I am (slowly) moving away from all of my old VBScript programs. Before I had a script, I would use RDP to connect to the remote machine, log in and then use Windows Explorer to change the volume labels. This was a lot of effort for a small change. I use the volume label at one of my clients as a quick way to discern between SAN and non-SAN volumes by prepending "san" or "local" on the volume labels. When this convention was adopted, there were a fair n...
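The remote-label change is a natural fit for the Win32_Volume WMI class (available on Windows Server 2003 and later). A hedged sketch; the parameter names are mine and the real function's handling may differ:

```powershell
# Sketch: change a volume label on a remote machine via WMI.
function Change-VolumeLabel {
    param(
        [string] $ComputerName,
        [string] $DriveLetter,   # e.g. 'E:'
        [string] $NewLabel       # e.g. 'san-data01'
    )
    $volume = Get-WmiObject -Class Win32_Volume -ComputerName $ComputerName `
        -Filter "DriveLetter = '$DriveLetter'"
    $volume.Label = $NewLabel
    [void] $volume.Put()   # write the changed property back through WMI
}
```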

Parsing SQLIO output with Powershell

Anyone who has used sqlio knows that its output files are not particularly readable. I used to manually run sqlio tests, then retype the output into a spreadsheet so that I could more easily analyze the data. This proved time-consuming, error-prone and boring, so I wrote a script that parses all of the data that I wanted from the sqlio output file. This post provides that code, and some background on how I use it. I will return to my series on my Powershell set up next time. The first thing to do is to run your tests with sqlio and get its results into files. This is left as an exercise to the reader. I have a .cmd file that I use to do this. I use a .cmd file because most of the machines that I test do not have Powershell installed on them. I am including this file in the SQLIO.scripts.zip file. The reason that I break the running of the tests and the parsing of the output into two separate processes is to decouple them. It can take quite a while for sqlio to run, especial...
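The parsing itself comes down to a few regexes over the summary lines sqlio writes (IOs/sec, MBs/sec, latency). An illustrative sketch, not the script from the zip file; the patterns match the sqlio output format I've seen, and your version may differ:

```powershell
# Sketch: pull the summary numbers out of a sqlio output file.
function Parse-SqlioOutput {
    param([string] $Path)
    $result = New-Object PSObject
    foreach ($line in Get-Content $Path) {
        switch -regex ($line) {
            'IOs/sec:\s*([\d\.]+)'        { Add-Member NoteProperty IOsPerSec    ([double] $matches[1]) -InputObject $result }
            'MBs/sec:\s*([\d\.]+)'        { Add-Member NoteProperty MBsPerSec    ([double] $matches[1]) -InputObject $result }
            'Avg_Latency\(ms\):\s*(\d+)'  { Add-Member NoteProperty AvgLatencyMs ([int]    $matches[1]) -InputObject $result }
        }
    }
    $result
}

# dir *.out | ForEach-Object { Parse-SqlioOutput $_.FullName } | Export-Csv results.csv
```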

My Powershell set up, Part 2 of 2

Last time, I explained that my profile consists of several files, with various functions (and associated aliases) broken out by functionality. Each of these "subprofiles" is loaded when my standard Powershell profile runs:

. "$env:winbin\psh\profile.Security.ps1"
. "$env:winbin\psh\profile.RemoteService.ps1"
. "$env:winbin\psh\profile.SQLServer.ps1"
. "$env:winbin\psh\profile.dstrait.ps1"
. "$env:winbin\psh\profile.$($env:userdomain).ps1"

The last line will load a client-specific file, depending on the credentials that I am currently using. Some of the functions in these files are mine and some of them were borrowed from various web sites. When I borrow a function, I add a small comment header that describes where the function came from. This allows me to go back and look for more interesting scripts from that author, to look for improved versions, or to give credit where credit is due. Some of these functions ...

My Powershell set up, Part 1 of 2

This posting gives an overview of how I have set up Powershell on my workstation. Any heavy shell user normally customizes his (or her) environment. These environments are (and will likely always be) a work in progress. As you learn things and write additional functions to help you with your tasks, your environment will grow and change. I find that I am tweaking things even as I write this posting. Aside: One of the things that I am forever tweaking is function names. I try to follow the Powershell verb-noun convention, but there are lots of spots where I have had to make judgment calls. It's not hard to find inconsistencies. For example, I vacillate between "Report", "List" and "Get". I'm not crazy about Get. My issue with Get is that it seems that Get should return an object and be used like this: $result = Get-Thingy $result.MemberFunction $param $result = $null whereas I run a lot of stuff that just returns information. A 'Report' is usu...

First Post

So, belatedly, I become a blogger. It's not that I don't like to write. I like to write very much. Too much, really. Anyone who has been the victim of one of my emails or read one of my postings on the old Swynk SQL DBA list can vouch for that. The primary thing that has always held me back from blogging is a simple question: "What can I say about SQL Server that others have not already said?". There is a wealth of great blogs and articles discussing SQL Server and there is little point in piling on. The one thing that I know better than any other DBA that I have come across is Powershell. (It's true that I don't get out much.) I have had a few people comment on how quickly I can slap together reports, push changes to databases and do other tasks. A key reason that I can do this is Powershell. So, we'll push on with Powershell for a while and see how the page hits work out. There may be a few side trips along the way, as I have a few useful scripts writt...