Tuesday, April 1, 2014

What are DACPACs and how do I use them?

I had someone ask me about DACPACs recently.

DACPAC technology had fallen off my radar after I saw demos of the feature at a SQL Saturday many years ago.

In short, a DACPAC is a file that contains definitions for the objects of a database. This file can be used to create new databases or update old databases to a new version. For the most part, existing data in the updated database should be preserved. DACPAC technology is intended to replace the bundles of .SQL scripts and the giant "Hail Mary" scripts that are often used to update databases.

At the time of that SQL Saturday demo, which was probably in the SQL Server 2008 timeframe, DACPAC technology was new and there were a lot of gotchas. IIRC, it was presented as a way to create databases in SQL Azure.

A few weeks ago, I noticed that SSDT was creating DACPAC files. I've long been a user of the database comparison tool provided by SSDT and other "Data Dude" descendants, but I didn't give the "freebie" DACPAC file that SSDT generated for me a second thought.

Since someone asked, I thought that I would spend some time researching this and write it up for the blog.

Things in the DACPAC universe seem to have improved substantially, and I found a number of links useful in my research.

According to the wiki, the DACPAC scheme still makes a copy of the target database and subsequently deletes the old one. That might be a showstopper for many large databases and was the main reason that I banished DACPAC to the back of my brain. However, I don't see many complaints on the web about this and that made me suspicious. In some very limited testing of a single-table database with four columns, I did not see any creation of mysterious databases. Perhaps the situation has changed with SQL Server 2012 and the wiki is out of date?

(While performing my limited tests, I noticed that a certain amount of downgrade-ability might be feasible. By simply applying the 1.0.0.0 DACPAC to my 1.0.0.1 database, I could remove a column. I'm not sure how I can exploit this, or if it would work on a non-trivial database, but this is the sort of thing I like to keep in the back of my head as a possible future trick.)

In short, DACPAC seems to have matured into a viable deployment strategy. I intend to look for situations where I can use this technology to improve the speed and quality of deploying new database versions.
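For reference, the comparisons and deployments I tested are driven by the SqlPackage.exe utility that ships with SSDT. A minimal sketch, assuming a local SQL Server instance; the server and database names are placeholders, and the available flags can vary by SqlPackage version:

```shell
rem Extract a .dacpac from an existing database (names are placeholders):
SqlPackage.exe /Action:Extract ^
    /SourceServerName:"localhost" /SourceDatabaseName:"MyDb" ^
    /TargetFile:"MyDb_1.0.0.0.dacpac"

rem Publish that .dacpac to create a new database, or upgrade an old one in place:
SqlPackage.exe /Action:Publish ^
    /SourceFile:"MyDb_1.0.0.0.dacpac" ^
    /TargetServerName:"localhost" /TargetDatabaseName:"MyDb"
```

By default, Publish refuses changes that could lose data; the /p:BlockOnPossibleDataLoss publish property controls that behavior, which is presumably what gates the column-dropping "downgrade" trick described above.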


Saturday, January 4, 2014

SugarSync Is No Longer Quite So Sweet

Over the last few years, I've spent more time following business issues in the IT industry than I had when I was younger. Perhaps it's just another mark of getting older.

Over the holidays, a news item caught my eye: SugarSync has stopped giving away free storage.

It's the hallmark of any new technology or developing area of business:

  • Someone comes up with something new. 
  • Many other players pile into the new space.
  • Some time passes.
  • Players that can't make money (or generate the growth numbers they want) leave the new field and the space consolidates.
  • The consolidation can go down to just a few companies. Those companies do not always include the one that created the space.


We have seen this with cars (in the first part of the 20th century, there were hundreds of small manufacturers in the US alone; now there are only two behemoths headquartered here and a few dozen firms based overseas). Other technologies (radio, television, ISPs, RDBMSs, hard drives, and personal computers, to name a few) and business areas (department stores, malls, convenience stores, hardware stores) have gone through the same evolution.

In the case of consumer-oriented, internet-based, replicated storage, I believe that DropBox was first and SugarSync followed. (I'm ignoring things like rcp, unison and such. Even though they have been available for decades in some cases, they never caught on past the sysadmin or power *nix user crowd.) With SugarSync changing its business model, I can't help but wonder if we are seeing more evidence of "the internet" turning away from the free-for-all model (so to speak) to the same old charge-'em-at-the-door model that we've seen for generations. Periods of consolidation can presage increased profitability for the survivors.

A few years ago, I was an avid SugarSync user. I used their Windows and Android clients. I chose SugarSync over DropBox, Box, Skydrive and Mesh because SugarSync had an Android client and the most liberal policy with respect to free space. IOW, they gave away more GB than their competitors.

At some point, the Android client stopped syncing my photos properly. At that time, both Google Drive and Microsoft Skydrive became viable alternatives. I need Google and Microsoft for other reasons, so SugarSync was the odd man out. I retired my SugarSync clients.

With industry heavyweights giving away storage space, SugarSync has a hard row to hoe. Microsoft and Google can afford to give away space as a sort of loss leader. In the long run, I think that the online editing provided by Microsoft and Google will become increasingly attractive to users. DropBox has name recognition and is widely supported by popular apps. Box has the enterprise orientation that DropBox doesn't (at least not yet). I could see a larger company buying up DropBox or Box. I am not sure where SugarSync fits into the market, five years down the road.

I would say that the best path for any of these small firms would be an acquisition by a larger player that already provides some sort of SaaS and supports multiple platforms. This leaves Apple and Microsoft out, but a consumer-focused organisation like Yahoo might do nicely. Amazon might find it easy to integrate SugarSync into their S3 storage offering and it might be worth something to them if the costs are right.


Tuesday, September 10, 2013

Entering the Brave New World of Google+

It's been a long time since I've done anything interesting with the blog, and it seems a little stale.

With some prompting from Blogger's "Buzz" blog, I'm hooking this blog up to a Google+ account. This is a bit of a brave new world for me, since I've confined myself to LinkedIn, Twitter and a few other sites. 

At the least, I hope that this will help comments on the blog be more dynamic and that a few more people might find these posts. Perhaps we can get a few more conversations going as well.

Friday, September 6, 2013

A Short Tale of Troubleshooting a PS/2 keyboard and a "Legacy Free" System

I recently found an IBM Model M keyboard at my local Goodwill store. There is rarely any computer hardware in there worth having, so this was quite a surprise.
As far as I can tell, this Model M has never been used even though it was made in 1993. It came in what appears to be the original box, with a styrofoam insert. The coiled, un-stretched cable still has plastic wrap on it. There is no visible wear on the keys. There are no finger grease stains. It is a virgin, IBM-branded, Lexmark-manufactured beige beauty. Even the box doesn't look 20 years old.


The Model M was built before USB was a thing, so it has a PS/2 port. My main rig has a "legacy free" motherboard, so it doesn't have PS/2 ports. It only has USB ports.
The keyboard I have been using lately came from a long-gone Micron computer. I bought a computer from Micron in the 1990s and the only remaining evidence of that is that keyboard. (Micron was a popular computer vendor at that time, though there isn't much evidence of that anymore.) Normally, I use a PS/2 to USB converter to connect my Micron keyboard to my computer. That has worked great for several years.

When I unplugged the old Micron and plugged in my new Model M, the keyboard was unresponsive. No lights, no nothing. Worse, the mouse that is plugged into the same PS/2->USB converter also stopped working.

In the spirit of the IT Crowd, I turned it off and turned it on again. No change in behavior. I plugged my old keyboard back into the computer. Everything works. This wasn't the best way to start the week.

I thought that my siren-like keyboard was dead, but I carried on and kept experimenting. It turns out that if I plug the Model M into the PS/2 port on my Thinkpad dock, the keyboard works.
The PS/2->USB converter I was using is an Adesso Ez-PU21 "Smart Adaptor". It's got a snazzy logo, a serial number and some other information written on it. I've been using it for so long I don't even remember where I got it from.

While researching the problem, I found a detailed explanation of how the PS/2 interface works. It has links to a history lesson on keyboards and detailed information on the communications protocols used for keyboards and mice, keyboard scan codes and more. There is a more approachable article describing the PS/2 interface on Wikipedia.

The new Model M has four pins in its connector; my Micron PS/2 keyboard has six pins, and so does my mouse. I have another, older and grungier Model M that also has six pins. The two pins that are missing are shown as "not implemented" on all of the PS/2 connector diagrams that I can find. The two "extra" pins were sometimes used to implement non-standard mouse-and-keyboard combination cables. Those missing pins shouldn't make a difference, yet they do.

I dug out the other, no-name, beige-grey converter that I own. I had thrown it into the bottom of my parts pile years ago, last using it with my Pentium III/933 desktop. There is no writing on it other than "Made in China" and the date it passed Quality Control in June, 2006. I've got no idea who made it. It works. No problem.

Once again, persistence wins over genius. I've got a great, "new" keyboard from 1993. 

Tuesday, April 9, 2013

What do I do to stay involved in IT?


In my opinion, one of the things that separates a "professional" from a "worker" is a demonstrated desire to learn about IT. The people who excel are always the ones who are reading about new developments, techniques and strategies.

In my case, I've always enjoyed learning about computers. I could keep my finger on the pulse of most of the IT universe 30 years ago. There was much less to keep track of and things moved much more slowly. As time has gone on, keeping track has become harder and harder to do. I still retain the simple desire to learn, but I don't always choose the most lucrative things to learn about. I'd estimate that 75% of the things that I've ever learned have not been put to good use.

In the 21st century, it's impossible to really "keep up" with IT. There is such a breadth of systems, applications, tools, languages, environments, hardware and software being created and maintained by millions of people that you literally can't know it all. Just keeping up with a particular area of concentration can be hard enough, and with the demands of "real life", I have come to believe that it's just a question of how quickly (or slowly) you fall behind on most of it.

I still read a number of blog articles. I follow about 25 blogs that are specific to SQL Server and about 170 sites that are IT related in some way. In the mid-oughties, IT blogs were a staple of my daily diet of news and information. I'd probably spend 1 to 2 hours a day reading posts. It seems that people are updating them less and less frequently and I find that there is a lot of rehashing of introductory subjects. (This is particularly true of things posted to LinkedIn.)

Since Google has announced the pending shutdown of its Reader application, I've switched to Feedly. I'm less efficient with Feedly than I was with Reader, and Bloglines seems worse. The number of things I'm reading is down. In short, it feels like this area is dying out. I've decided to take this as an opportunity to find new sources of information.

I spend a lot of time on The Enterprise Cloud Site. There is a lot of comment there on how best to use "cloud" technologies (or perhaps how to avoid being run over by them). The site has a particular focus, and that focus requires me to stretch beyond my core competencies. The exciting part of cloud technology is that it potentially allows SMBs access to technologies that they would not otherwise be able to afford. The not-so-exciting part is that cloud technology puts pressure on the careers of administrators.

I listen to podcasts. Nothing really new here; podcasts are a great way to fill dead time. The interactive questioning between a host (or hosts) and a guest speaker allows podcasts to flesh out a topic in ways that a blog posting cannot. My favorites are:

  • The "SQL Down Under" podcast, which you can find here, discusses a wide range of SQL Server issues. It tends to run a bit over an hour.
  • The "RunAs Radio" podcast, which you can find here, focuses on a wide range of Windows administration issues. In addition to episodes dealing directly with SQL Server, there are also very interesting episodes discussing the virtualization of domain controllers, problems with passwords, Hadoop on Azure, SharePoint, clusters, security issues and so on. These shows are about 30 minutes long and it is easy to listen to them even if they aren't 100% in your area of expertise.
  • The ".Net Rocks" podcast, which you can find here, focuses primarily on Windows development. Listening to IT podcasts that are out of one's particular area of concentration can be useful. It's always good to see something through a different perspective. It's good to know what the programmers are going to be interested in and how they might want to use (or abuse) the servers under a DBA's care. (LINQ is the classic example.) The .Net podcasts can get into some fairly arcane programming issues, so sometimes I'll cut them short or just skip them. They do pump out a good number of episodes, so it can still be hard to keep up with them.

All of these podcasts are available for free through the iTunes store.

I'm still (very, very slowly) working my way through sessions from TechEd North America 2012. You can find those at this RSS feed, via Channel 9. There are many presentations, often given by people involved in the development of the product, feature or service. These presentations can get into very arcane things that are entirely out of my scope. Even when they are within my scope, they may describe situations that I will never see. I have found a few gems in there, like a presentation on the new features in Windows Server 2012 file sharing.

It's sort of old-fashioned, but I attend my local SQL Server User's Group. In my case, that is PSSUG. I will be teaching a class for the next few months. This leads to a scheduling conflict with the regular PSSUG meetings. I plan to switch over to the Philly Business Intelligence User's Group because it meets on a different night.

In the future, I'm looking towards refocusing towards more formal learning in the form of MOOCs and, after 15 years of database administration and 25 years of IT experience, I might actually get around to taking those MCSE exams.

Thursday, March 14, 2013

What to do about Google terminating Reader?

So, at about 1 am this morning, I found out that Google is shutting down its widely-used Reader application. I read four different takes on it (I think I saw it on BGR first) and only then headed off to Reddit. (It's all over the Internet now. Apparently, other writers recovered from the shock more quickly than I did.)
 
Ironically, I found out that Google was smothering Reader while using Reader. 
 
This is particularly frustrating to me. I use Reader more than any other single thing. Reader channels probably 95% of my Internet interaction. I use it for a few hours every day. I use it at my desk. I use it during commercial breaks on TV. I use it when my wife falls asleep on the TV couch and when I can't sleep. I use it while waiting on line at McDonald's. I use it on commuter trains and on long car trips. It's a constant companion, I've spent more time with it than any pet and nearly any human. If my phone is charged and I'm not doing something else, I'm reading articles. Between my two Google accounts, I subscribe to about 225 feeds and I've read more than 315,000 items since the summer of 2010. I had so many feeds that I moved my "business stuff", which I had organically/non-strategically added to my personal account, off of my personal account and onto my business account a couple of years ago as an organizational tactic.
 
I've been using RSS feeds for so long, I'm not sure when I started. Eight to ten years ago, before I started using RSS feeds, I used several folders of bookmarked websites and went through them manually. I'd have to look at each website and remember what I had read and what I had not. Some folders were checked daily, others were checked weekly or monthly. This became unwieldy, as lots of sites do not update regularly and I spent more time scanning for things to read than actually reading. At some point, I made a strategic decision to stop reading web sites that had no RSS feed even though they often had valuable content. (I'm looking at you, Storage Review.)
 
To read feeds, I first used Akregator when I used KDE/Linux for my main rig. After moving away from Linux, for a short time I tried Windows fat clients. That stopped when I wasn't allowed to plug my laptop into my client's corporate networks. I needed something I could access with a large number of machines that did not require an install, so that I could read during lunch and on breaks. Obviously, that called for a web site. I tried iGoogle and Bloglines. Neither really clicked with me, and I gave them up when I moved to Google Reader. I never even looked back. When I started using smartphones, the Reader web site was among the first that I bookmarked. When I moved to Android from S60, the Reader app was one of the first apps I installed.
 
This is a great opportunity for Yahoo, Bing, BlogLines and any other vendor. (Microsoft has kilotons of programming talent and megatons of infrastructure. They should be able to write and put up a replacement for Reader in a few weeks. They'd probably make the mistake of somehow making it Windows 8-only.) Google is making a mistake by abandoning a piece of the chess board. They were the de facto standard. Little "Add this to Google Reader" buttons are all over the Internet. They will be disappearing. Now someone else is going to get those users and those eyeballs. The upside is that we might get some innovation in the RSS space. First, we will simply have to replace what we are losing.
 
I presume that this is somehow related to Google's push to remake themselves as "Google Plus". Reader was just about the only reason I ever posted anything to G+. When I realized it was easier to share things via email than G+ and there was no way to monetize things I sent to G+, I stopped posting to G+. Now I will have absolutely no reason to go anywhere near G+. 
Why kill Reader when G+ is a ghost town? Surely, Reader can't consume very much in the way of resources. Why not kill Orkut? If Google is "tightening up" its offerings, why do I see so much hype around driverless cars and those X-Ray glasses? Where is the revenue generation for those services? They are clearly spending tens of millions of dollars (or more) on these ever-forthcoming products. Does Google expect me to stare at ads while I walk down the street wearing their glasses, or drive down the road with their cameras on the top of my car? Does Google expect everyone to act as roaming data collectors for their databases?
In the post announcing the shutdown of Reader, Google made reference to the usage of Reader dropping. Every change they made to Reader in the last three years made the service worse, not better. If usage has dropped, Google has only themselves to blame. 
 
It's as if Microsoft and Google have entered into a contest to see who can make the biggest mistakes.
 
I'm often asked if anyone can rely on SaaS. Not if it is a free service, clearly. If such a popular service is dropped with little or no warning, less popular services must be even more prone to evaporation. Things are likely to get worse as the behemoths try to put up walls around their gardens.
 
(As a semi-related side note, when did Reader stop supporting OPML export? They support OPML import and OPML is the Lingua Franca of RSS aggregators. Google likes to brag about letting you take your data with you, but any data professional will know that the format of the data is almost as important as the data itself. If you can't import your data into another system, it's useless until you write a conversion tool. The careers of many programmers have been built on writing such tools. The vast majority of Reader users will not be able to write such a tool and they will have to rely on the kindness of strangers. I suspect that many of them will migrate their feeds manually, with many feeds being left behind and many blogs seeing a drop in pageviews.)
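To make the format point concrete: feed lists travel as OPML, which is just XML, and the heart of any migration tool is walking its outline elements. A minimal sketch in Python; the sample document and feed URL are made up for illustration:

```python
import xml.etree.ElementTree as ET

# A tiny OPML fragment of the kind aggregators import (URLs are placeholders).
OPML_SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<opml version="1.0">
  <head><title>My subscriptions</title></head>
  <body>
    <outline text="Business">
      <outline text="Example Feed" type="rss"
               xmlUrl="http://example.com/feed.rss"
               htmlUrl="http://example.com/"/>
    </outline>
  </body>
</opml>"""

def feed_urls(opml_text):
    """Return the xmlUrl of every feed outline, including nested ones."""
    root = ET.fromstring(opml_text)
    return [node.attrib["xmlUrl"]
            for node in root.iter("outline")
            if "xmlUrl" in node.attrib]

if __name__ == "__main__":
    print(feed_urls(OPML_SAMPLE))
```

Real exports nest outlines inside folder outlines, which is why the sketch walks every outline element rather than just the top level; folder outlines have no xmlUrl and are skipped.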
I am reminded of the del.icio.us fiasco from several years ago. That service seemed to come from nowhere and was wildly popular for a time. Yahoo bought it, screwed it up, and I (and many, many others) stopped using it. When Yahoo said that they were going to shut del.icio.us down, I migrated my data into Evernote and never looked back. Looking at it now, it seems that del.icio.us somehow got out from under Yahoo and is/was trying to rebuild what they had. Of course, they can't. There are elements of the Digg implosion in here, too. They had something good, they felt compelled to change it, their users found the changes were for the worse and they left. Surely, the Googlers are familiar with del.icio.us and Digg.
 
Looking at iGoogle now, I see that it is also marked for death and has already been eviscerated, so I won't be going back to that. I was surprised to find that Bloglines is still around. I've already moved a few feeds into BlogLines, but they don't seem to have any mobile app and I am still looking at this as an experiment. I may wind up with a few different solutions, each picked to follow certain types of feeds or to segregate "personal stuff" from "business stuff". For example, Shorpy's, a site for (large) historical photos, is better viewed on a desktop. CraigsList feeds often require a quick response, and I'd like to keep them on my phone.
 
My first steps towards migration are to cut down on the number of feeds I do have and to start trying other services. This morning, I tossed about 40% of my personal subscriptions. (Looking at CraigsList just makes me want to buy someone's old gear, anyway.) I am down to 31 subscriptions, from about 50. I'm not going to touch my 175+ "business" feeds yet. I'm starting by moving things to my re-invigorated Bloglines account that I don't need to read every day or really need to be viewed on a larger screen, like my The Big Picture and Shorpy's feeds. 

 


Wednesday, February 20, 2013

Why Programming Environments Matter, Gaming Edition


As I do some posting on another site, I've been running the BGR live stream of the Sony Playstation 4 launch on my second monitor and I've got something to say.

I'm not a "gamer". I stopped playing games back in 1999 when I found that I had lost yet another weekend to Quake and Tomb Raider. I spent innumerable hours playing Quake, TR, Doom (and many others -- the late 1990s were kind of slow for me, socially speaking) on my Micron Pentium II. Before that, Falcon 3.0 (and Commander Keen and Wolfenstein 3D) on my Northgate 386/33. In medieval times, Ultima on my Mac Plus and Adventure on my Atari 2600 occupied my time. I was never all that good, but the games absorbed my brain in a way that TV never did. I had to stop playing games because I didn't do much else in 1999, besides go to work.
 
Since then, I have watched developments in game hardware and software from the sidelines. I do have some observations about the situation with the forthcoming PS4 and XBOX 720 consoles and I'm interested in how the competition between Sony and Microsoft plays out.
 
It's interesting to me that the much-hyped (back then) "Cell" and "Xenon" microprocessors (Sony and Microsoft, respectively) have been dumped in favor of x64. The older consoles both used parts derived from PowerPC architecture. When those consoles were released, there was a lot of gushing over "lifelike" animation and how powerful the chips were. (I'm sure that the stuff shown at Sony's launch today will look awesome. A year from now, what you see today will look weird. Ten years from now, it will look crappy. People's ability to detect animation seems to improve just as fast as the technology improves. Much like nuclear fusion and rocket boots, real-time, life-like animation is always a few years off into the future.) Additionally, it was alleged that both companies minimized piracy threats by using an unusual (meaning not x86) architecture. 
 
PowerPC chips were originally designed to be a scalable family of commodity, general-purpose parts. They were used in Macs, in various industrial situations and as on-board computers in vehicles. PowerPC didn't pan out and it isn't a viable option anymore. It hasn't been an option for serious computing since before Apple did away with them in 2006. Several years later, both high-end console companies are moving to x64. I presume that they have something up their wizardly (crypto) sleeves to clamp down on the pirates. 
 
(I do wonder why they haven't gone with ARM. Is it that 64 bit ARM processors won't be viable until later this year? Is ARM still not "quite there" with computing power? I keep hearing that games are GPU-limited and not CPU-limited, so why would it matter if the CPU is a little "light" on power? Is AMD hurting so badly that it needs to sell parts on a long-term contract to help guarantee its survival? Couldn't Intel just open the gates and flood Sony and Microsoft with underclocked parts having the performance of earlier-generation 32nm i5 parts? Maybe some souped-up Atoms? Surely Intel needs to use those fabs for something, and it doesn't look like it will be chips for cell phones.)
 
With the switch in processor architectures, the obvious problem is that you won't be playing any of the games you already own on your PS4 (confirmed by Sony during the product launch) or XBOX 720 (unless Microsoft comes up with something wizardly before the launch). That means that you will need to keep your old consoles around. Since everyone who wants a gaming console seems to have one, that means a lot of old consoles will be around for a while. Games are going to have to get a lot better before the new consoles start moving in impressive volume.
 
 
Enough meandering -- here's the real rub: 
 
One of Microsoft's superpowers is making things easy on developers. (Arguably, this is their best superpower.) Many of these developers will be coming from other areas of programming that Microsoft already dominates. They will be bringing their understanding of how to program the "Microsoft way" with them. Small learning curve. Easy. Fast-moving projects.
 
I have never heard anyone say that writing software for the PS3 is easy. It was hard to write for the PS3, it will be hard to port PS3 software to the PS4 (and take advantage of the PS4's improved feature set) and I expect that it will be hard to write software for the PS4 from scratch. Big learning curve. Hard. Slow-moving projects.
 
In short, Microsoft and Sony have pressed the reset button. The playing field is back to square zero and they are now in a drag race. Microsoft is much better at building development environments than Sony is.
 
The 6.4 billion dollar question is -- who will have more (and better) software, more quickly?

You know who I'm betting on.