Thursday, January 26, 2012

A new version of SQL Server is almost upon us, and Microsoft is hosting an online "launch event" on March 7th. Go here to register.


Question: Why can't I use my LiveID to register?

Wednesday, January 25, 2012

"Did You Read It?" for late January, 2012


This blog post mentions some of the most interesting articles that I have read recently, with a little bit of my commentary on each article. 

I read this article on pricing in the virtualization market. Everyone I speak to about virtualization scoffs at Hyper-V when comparing it to VMware. But here's the thing: SQL Server took a healthy chunk of the RDBMS space away from Oracle and DB2 even though those products were arguably "better". Can Microsoft win a portion of the virtualization market with Hyper-V even though VMware may have the better product? IT decisions always seem to boil down to some combination of price and familiarity. Hyper-V is easy to get onto a Windows server, and people have been grumbling about VMware pricing lately.

I read this very well-written article on Microsoft Word versus Evernote (and versus a bunch of other things) by a person who spends a lot of time writing. I'm not a guru-level genius with Word, but I first used Word 5.5 on a Macintosh back in the mid-1980s. I used to chuck all kinds of things into Word, basically anything I couldn't do in a plain text editor like CodeWright or vi. Ever since the ribbon, I've drifted away from using Word unless I have to. I would only use Word for a large project. For smaller things, a few pages here and there and "daily journal" type stuff, I gravitate towards OneNote and Evernote. I do these blog postings mainly in Evernote, and sometimes directly in Blogger. I'm not concerned with complicated formatting or styles, so there is no benefit to using Word.

I read this article on managing the truth when it comes to non-IT people and the superstitious. Coincidentally, I actually wrote "Don't you trust us?" last week, in an email conversation about DBAs and obfuscation.

I read this TechNet article on different ways to create custom objects in PowerShell. Flexibility often breeds complexity, and complexity breeds holy wars over the "best way" to do things. Most of it (I'm looking at you, Mr. "Coding Standards") just falls to the personal preferences of the first person who worked on a particular project. I've never liked the Add-Member method; it seems bulky to me, and I am not known for being terse. Going forward, I think that the hashtable tactics are cleanest.
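For reference, the Add-Member approach I find bulky looks something like this (a minimal sketch; the property names are just for illustration):

# Build the object one property at a time with Add-Member.
$result = New-Object -TypeName PSObject
$result | Add-Member -MemberType NoteProperty -Name "File" -Value "foo.txt"
$result | Add-Member -MemberType NoteProperty -Name "Computer" -Value "hal9000"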

I have been using code like this for years, possibly since before PowerShell 1.0 was released, and it is a hard habit to break:

$result = "" | Select-Object "File", "Computer"
$result.File = "foo.txt"
$result.Computer = "hal9000" 

That is very VBScript-like, I think, and betrays my scripting roots. I normally don't build a lot of custom objects, and I'm not motivated enough to go back through my existing body of code and rework it, so I don't have many opportunities to break the habit.
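For comparison, here is the same object built with the hashtable tactic (a sketch of the approach the TechNet article covers; this form works in PowerShell 2.0 and later):

# Pass a hashtable of property names and values to New-Object.
# Note that the resulting property order is not guaranteed.
$result = New-Object -TypeName PSObject -Property @{
    File     = "foo.txt"
    Computer = "hal9000"
}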

This article, along with comments made at the PSSUG meetings in Malvern, PA, makes me feel like Extended Events (XEvents) are the future. Hm. More and more of the stuff I know is obsolete. Good thing I've got a new workstation to build VMs with; now I just need to find the time to do that.



Monday, January 9, 2012

What's on the Internet lately?

This is a short collection of articles that I have found particularly interesting in the last few weeks.

This article discusses benchmark scaling as the number of cores continues to grow on servers. 

This article discusses Etsy's growing pains. It states something that I've felt for a long, long time: "If you are doing something clever, it's probably wrong."

This (old) article discusses discipline in software development. It has me thinking that I should read McConnell's Code Complete again. I know I've boxed, moved, and unboxed my copy several times in the last 10 years, and it seems that there is now a second edition. Tempus fugit.
 
This article discusses refactoring tables with no downtime. This isn't for every database. There is a good deal of work here (in terms of development, testing, and DBA work) that you would not need to do in an 8x5 environment, where you can easily take the database offline for a while to do a migration or upgrade. This strategy is only for databases that must stay up 24x7. It also seems that a team could only work on a small number of schema changes in any particular migration. (This goes hand-in-hand with one of the author's conclusions, which points out that limiting the system to a small number of high-value features was important.) That works if you are agile and following a "release early, release often" strategy with users who expect to see steady but gradual changes. If your users expect to see a bunch of changes all at once, it might not work as well. You might have to be more fussy about when you surface new features to the users, you might have to drift back to a more traditional waterfall model, or you might be stuck taking outages to do migrations.
This article discusses using the "database project" features in Visual Studio. I have used this feature, with varying degrees of success, since Visual Studio 2005.
I particularly like the "DeployLog table" idea. I used a similar but different tactic over a decade ago. In that case, deployment was done by running a series of TSQL scripts via a custom-built utility. The name of every script run via the utility was recorded in a table, and based on the file name, the utility would not run the same script twice. There was a scheme that allowed a script to insist that certain other scripts had been run as prerequisites, and the utility was smart enough to deal with it all. During development and testing, there was a lot of back-and-forth checking for objects that differed from the 'official' code base.
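A bare-bones PowerShell sketch of that run-once logic might look like the following (the DeployLog table, column names, and paths here are hypothetical, just to illustrate the idea; the prerequisite checking is omitted):

# Run each script in name order, skipping any already recorded in DeployLog.
$server   = "hal9000"
$database = "MyAppDb"

$alreadyRun = Invoke-Sqlcmd -ServerInstance $server -Database $database `
    -Query "SELECT ScriptName FROM dbo.DeployLog" |
    ForEach-Object { $_.ScriptName }

Get-ChildItem -Path .\scripts -Filter *.sql | Sort-Object Name | ForEach-Object {
    if ($alreadyRun -contains $_.Name) {
        Write-Host "Skipping $($_.Name); it has already been run."
    }
    else {
        Invoke-Sqlcmd -ServerInstance $server -Database $database -InputFile $_.FullName
        Invoke-Sqlcmd -ServerInstance $server -Database $database `
            -Query "INSERT INTO dbo.DeployLog (ScriptName) VALUES ('$($_.Name)')"
    }
}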
 
Looking back at all of that, it seems over-complicated, but we did not have sophisticated build tools or version control like TFS, Subversion, or Git, and we did not have good control over the (several) sites that would be running the scripts.
 
I haven't been able to work in a CI environment, but most of the suggestions in the blog post seem to make sense. I am not so sure about the "Use 'Not in build'" suggestion. I can easily see how this would be valuable if you do not have version control in place, but you should have version control in place. People should be able to see the history of the code by looking through the VCS, provided they have a good grasp of how the code has been branched.