Tuesday, September 10, 2013

It's been a long time since I've done anything interesting with the blog, and it seems a little stale.
With some prompting from Blogger's "Buzz" blog, I'm hooking this blog up to a Google+ account. This is a bit of a brave new world for me, since I've confined myself to LinkedIn, Twitter and a few other sites.
At the least, I hope that this will help comments on the blog be more dynamic and that a few more people might find these posts. Perhaps we can get a few more conversations going as well.
Friday, September 6, 2013
A Short Tale of Troubleshooting a PS/2 Keyboard and a "Legacy Free" System
I recently found an IBM Model M keyboard at my local Goodwill store. There is rarely any computer hardware in there worth having, so this was quite a surprise.
As far as I can tell, this Model M has never been used even though it was made in 1993. It came in what appears to be the original box, with a styrofoam insert. The coiled, un-stretched cable still has plastic wrap on it. There is no visible wear on the keys. There are no finger grease stains. It is a virgin, IBM-branded, Lexmark-manufactured beige beauty. Even the box doesn't look 20 years old.
The Model M was built before USB was a thing, so it has a PS/2 plug. My main rig has a "legacy free" motherboard, so it doesn't have PS/2 ports; it only has USB ports.
The keyboard I have been using lately is from a long-gone Micron computer. I bought a computer from Micron in the 1990s, and that keyboard is the only remaining evidence of the purchase. (Micron was a popular computer vendor at the time, though there isn't much evidence of that anymore, either.) Normally, I use a PS/2-to-USB converter to connect my Micron keyboard to my computer. That has worked great for several years.
When I unplugged the old Micron keyboard and plugged in my new Model M, the keyboard was unresponsive. No lights, no nothing. Worse, the mouse that is plugged into the same PS/2-to-USB converter also stopped working.
In the spirit of The IT Crowd, I turned it off and turned it on again. No change in behavior. I plugged my old keyboard back into the computer and everything worked. This wasn't the best way to start the week.
I thought that my siren-like keyboard was dead, but I carried on and kept experimenting. It turns out that the keyboard works if I plug the Model M into the PS/2 port on my ThinkPad dock.
The PS/2-to-USB converter I was using is an Adesso Ez-PU21 "Smart Adaptor". It's got a snazzy logo, a serial number and some other information printed on it. I've been using it for so long that I don't even remember where I got it.
While researching the problem, I found a detailed explanation of how the PS/2 interface works. It has links to a history lesson on keyboards and detailed information on the communications protocols used for keyboards and mice, keyboard scan codes and more. There is a more approachable article describing the PS/2 interface on Wikipedia.
The new Model M has four pins in its connector; my Micron PS/2 keyboard has six pins, and so does my mouse. I have another, older and grungier Model M that also has six pins. The two missing pins are shown as "not implemented" on all of the PS/2 connector diagrams that I can find. Those two "extra" pins were sometimes used to implement non-standard mouse-and-keyboard combination cables. The missing pins shouldn't make a difference, yet they do.
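For reference, the standard 6-pin mini-DIN pinout is usually documented like this (pins 2 and 6 are the "not implemented" ones):

Pin 1: Data
Pin 2: Not implemented (sometimes the second device's data line on combo cables)
Pin 3: Ground
Pin 4: +5V
Pin 5: Clock
Pin 6: Not implemented (sometimes the second device's clock line on combo cables)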
I dug out the other, no-name, beige-grey converter that I own. I had thrown it into the bottom of my parts pile years ago, after last using it with my Pentium III/933 desktop. There is no writing on it other than "Made in China" and the date it passed quality control in June 2006. I've got no idea who made it. It works. No problem.
Once again, persistence wins over genius. I've got a great, "new" keyboard from 1993.
Thursday, March 14, 2013
What to do about Google terminating Reader?
So, at about 1 am this morning, I found out that Google is shutting down its widely-used Reader application. I read four different takes on it (I think I saw it on BGR first) and only then headed off to Reddit. (It's all over the Internet now. Apparently, other writers recovered from the shock more quickly than I did.)
Ironically, I found out that Google was smothering Reader while using Reader.
This is particularly frustrating to me. I use Reader more than any other single thing. Reader channels probably 95% of my Internet interaction. I use it for a few hours every day. I use it at my desk. I use it during commercial breaks on TV. I use it when my wife falls asleep on the TV couch and when I can't sleep. I use it while waiting on line at McDonald's. I use it on commuter trains and on long car trips. It's a constant companion; I've spent more time with it than with any pet and nearly any human. If my phone is charged and I'm not doing something else, I'm reading articles. Between my two Google accounts, I subscribe to about 225 feeds, and I've read more than 315,000 items since the summer of 2010. I had so many feeds that, a couple of years ago, I moved my "business stuff", which I had organically and non-strategically added to my personal account, onto my business account as an organizational tactic.
I've been using RSS feeds for so long that I'm not sure when I started. Eight to ten years ago, before I started using RSS feeds, I kept several folders of bookmarked websites and went through them manually. I'd have to look at each website and remember what I had read and what I had not. Some folders were checked daily, others weekly or monthly. This became unwieldy, as lots of sites do not update regularly, and I spent more time scanning for things to read than actually reading. At some point, I made a strategic decision to stop reading websites that had no RSS feed, even though they often had valuable content. (I'm looking at you, Storage Review.)
To read feeds, I first used Akregator, back when KDE/Linux ran my main rig. After moving away from Linux, I tried Windows fat clients for a short time. That stopped when I wasn't allowed to plug my laptop into my clients' corporate networks. I needed something that I could access from a large number of machines and that did not require an install, so that I could read during lunch and on breaks. Obviously, that called for a web site. I tried iGoogle and Bloglines. Neither ever really clicked with me, and I gave them up when I moved to Google Reader. I never even looked back. When I started using smartphones, the Reader web site was among the first that I bookmarked. When I moved to Android from S60, the Reader app was one of the first apps I installed.
This is a great opportunity for Yahoo, Bing, Bloglines and any other vendor. (Microsoft has kilotons of programming talent and megatons of infrastructure. They should be able to write and put up a replacement for Reader in a few weeks. They'd probably make the mistake of somehow making it Windows 8-only.) Google is making a mistake by abandoning a piece of the chess board. They were the de facto standard. Little "Add this to Google Reader" buttons are all over the Internet. They will be disappearing. Now someone else is going to get those users and those eyeballs. The upside is that we might get some innovation in the RSS space. First, though, we will simply have to replace what we are losing.
I presume that this is somehow related to Google's push to remake themselves as "Google Plus". Reader was just about the only reason I ever posted anything to G+. When I realized it was easier to share things via email than G+ and there was no way to monetize things I sent to G+, I stopped posting to G+. Now I will have absolutely no reason to go anywhere near G+.
Why kill Reader when G+ is a ghost town? Surely, Reader can't consume very much in the way of resources. Why not kill Orkut? If Google is "tightening up" its offerings, why do I see so much hype around driverless cars and those X-Ray glasses? Where is the revenue generation for those services? They are clearly spending tens of millions of dollars (or more) on these ever-forthcoming products. Does Google expect me to stare at ads while I walk down the street wearing their glasses, or drive down the road with their cameras on the top of my car? Does Google expect everyone to act as roaming data collectors for their databases?
In the post announcing the shutdown of Reader, Google made reference to declining usage. Every change they made to Reader in the last three years made the service worse, not better. If usage has dropped, Google has only themselves to blame.
It's as if Microsoft and Google have entered into a contest to see who can make the biggest mistakes.
I'm often asked if anyone can rely on SaaS. Not if it is a free service, clearly. If such a popular service is dropped with little or no warning, less popular services must be even more prone to evaporation. Things are likely to get worse as the behemoths try to put up walls around their gardens.
(As a semi-related side note, when did Reader stop supporting OPML export? They support OPML import, and OPML is the lingua franca of RSS aggregators. Google likes to brag about letting you take your data with you, but any data professional knows that the format of the data is almost as important as the data itself. If you can't import your data into another system, it's useless until you write a conversion tool. The careers of many programmers have been built on writing such tools. The vast majority of Reader users will not be able to write such a tool, and they will have to rely on the kindness of strangers. I suspect that many of them will migrate their feeds manually, with many feeds being left behind and many blogs seeing a drop in pageviews.)
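Such a tool doesn't have to be elaborate, either. Here is a minimal sketch in Python of the feed-extraction half of the job; the file name "subscriptions.xml" is just a placeholder, and it assumes only that the export is standard OPML, where each feed is an outline element carrying an xmlUrl attribute:

import xml.etree.ElementTree as ET

# OPML stores each subscription as an <outline> element,
# with the feed address in its "xmlUrl" attribute.
tree = ET.parse("subscriptions.xml")  # placeholder file name
feeds = [node.attrib["xmlUrl"]
         for node in tree.iter("outline")
         if "xmlUrl" in node.attrib]

for url in feeds:
    print(url)

From there, a migration script just has to hand those URLs to whatever subscription mechanism the replacement service offers.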
I am reminded of the del.icio.us fiasco from several years ago. That service seemed to come from nowhere and was wildly popular for a time. Yahoo bought it, screwed it up, and I (and many, many others) stopped using it. When Yahoo said that they were going to shut del.icio.us down, I migrated my data into Evernote and never looked back. Looking at it now, it seems that del.icio.us somehow got out from under Yahoo and is trying to rebuild what it had. Of course, it can't. There are elements of the Digg implosion in here, too. They had something good, they felt compelled to change it, their users found the changes were for the worse, and they left. Surely, the Googlers are familiar with del.icio.us and Digg.
Looking at iGoogle now, I see that it is also marked for death and has already been eviscerated, so I won't be going back to that. I was surprised to find that Bloglines is still around. I've already moved a few feeds into Bloglines, but they don't seem to have a mobile app, and I am still looking at this as an experiment. I may wind up with a few different solutions, each picked to follow certain types of feeds or to segregate "personal stuff" from "business stuff". For example, Shorpy's, a site for (large) historical photos, is better viewed on a desktop. CraigsList feeds often require a quick response, and I'd like to keep them on my phone.
My first steps towards migration are to cut down on the number of feeds I have and to start trying other services. This morning, I tossed about 40% of my personal subscriptions. (Looking at CraigsList just makes me want to buy someone's old gear, anyway.) I am down to 31 subscriptions, from about 50. I'm not going to touch my 175+ "business" feeds yet. I'm starting by moving the things that I don't need to read every day, or that are better viewed on a larger screen, like my The Big Picture and Shorpy's feeds, to my re-invigorated Bloglines account.
Wednesday, February 20, 2013
Why Programming Environments Matter, Gaming Edition
While doing some posting on another site, I've been running the BGR live stream of the Sony PlayStation 4 launch on my second monitor, and I've got something to say.
I'm not a "gamer". I stopped playing games back in 1999 when I found that I had lost yet another weekend to Quake and Tomb Raider. I spent innumerable hours playing Quake, TR, Doom (and many others -- the late 1990s were kind of slow for me, socially speaking) on my Micron Pentium II. Before that, Falcon 3.0 (and Commander Keen and Wolfenstein 3D) on my Northgate 386/33. In medieval times, Ultima on my Mac Plus and Adventure on my Atari 2600 occupied my time. I was never all that good, but the games absorbed my brain in a way that TV never did. I had to stop playing games because I didn't do much else in 1999, besides go to work.
Since then, I have watched developments in game hardware and software from the sidelines. I do have some observations about the situation with the forthcoming PS4 and XBOX 720 consoles, and I'm interested in how the competition between Sony and Microsoft plays out.
It's interesting to me that the much-hyped (back then) "Cell" and "Xenon" microprocessors (Sony and Microsoft, respectively) have been dumped in favor of x64. The older consoles both used parts derived from PowerPC architecture. When those consoles were released, there was a lot of gushing over "lifelike" animation and how powerful the chips were. (I'm sure that the stuff shown at Sony's launch today will look awesome. A year from now, what you see today will look weird. Ten years from now, it will look crappy. People's ability to detect animation seems to improve just as fast as the technology improves. Much like nuclear fusion and rocket boots, real-time, life-like animation is always a few years off into the future.) Additionally, it was alleged that both companies minimized piracy threats by using an unusual (meaning not x86) architecture.
PowerPC chips were originally designed to be a scalable family of commodity, general-purpose parts. They were used in Macs, in various industrial situations and as on-board computers in vehicles. PowerPC didn't pan out and it isn't a viable option anymore. It hasn't been an option for serious computing since before Apple did away with them in 2006. Several years later, both high-end console companies are moving to x64. I presume that they have something up their wizardly (crypto) sleeves to clamp down on the pirates.
(I do wonder why they haven't gone with ARM. Is it that 64-bit ARM processors won't be viable until later this year? Is ARM still not "quite there" with computing power? I keep hearing that games are GPU-limited and not CPU-limited, so why would it matter if the CPU is a little "light" on power? Is AMD hurting so badly that it needs to sell parts on a long-term contract to help guarantee its survival? Couldn't Intel just open the gates and flood Sony and Microsoft with underclocked parts having the performance of earlier-generation 32nm i5 parts? Maybe some souped-up Atoms? Surely Intel needs to use those fabs for something, and it doesn't look like it will be chips for cell phones.)
With the switch in processor architectures, the obvious problem is that you won't be playing any of the games you already own on your PS4 (confirmed by Sony during the product launch) or XBOX 720 (unless Microsoft comes up with something wizardly before the launch). That means that you will need to keep your old consoles around. Since everyone who wants a gaming console seems to have one already, a lot of old consoles will be around for a while. Games are going to have to get a lot better before the new consoles start moving in impressive volume.
Enough meandering -- here's the real rub:
One of Microsoft's superpowers is making things easy on developers. (Arguably, this is their best superpower.) Many of these developers will be coming from other areas of programming that Microsoft already dominates. They will be bringing their understanding of how to program the "Microsoft way" with them. Small learning curve. Easy. Fast-moving projects.
I have never heard anyone say that writing software for the PS3 is easy. It was hard to write for the PS3, it will be hard to port PS3 software to the PS4 (and take advantage of the PS4's improved feature set) and I expect that it will be hard to write software for the PS4 from scratch. Big learning curve. Hard. Slow-moving projects.
In short, Microsoft and Sony have pressed the reset button. The playing field is level again, and they are now in a drag race. Microsoft is much better at building development environments than Sony is.
The 6.4 billion dollar question is -- who will have more (and better) software, more quickly?
You know who I'm betting on.
Wednesday, February 13, 2013
Consequences of Casual Development
I did a fair amount of work with Lotus 1-2-3 back in the day, writing spreadsheets for other people. (I was paid and everything. Amazing.)
Tracking down errors in those spreadsheets was always a pain in the neck. Time has marched on, mercilessly. 1-2-3, Symphony, Jazz, Improv and even Quattro are just historical footnotes now. I haven't built a serious spreadsheet for anyone since 1991. I do use Excel, though more as a "user" than as a developer. I've never really dug into any of its debugging features. For my own purposes, I use the relatively anemic Google Spreadsheet because it's good enough 90% of the time and I always have a browser window open.
I've often wondered (a nice word for "daydreamed") how people debug complex spreadsheets. Spreadsheet power users, while they can spell SDLC and know their way around a VLOOKUP(), do not strike me as the types who have backgrounds in TDD.
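(A classic example of the kind of bug that hides in a spreadsheet -- my illustration, not anyone else's: =VLOOKUP(A2, Prices!A:B, 2) with the fourth argument omitted does an approximate match, so a missing lookup key silently returns a nearby row's price instead of an error. Nothing turns red; the number is just wrong.)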
Apparently, I am not the only person who wonders, and perhaps more people should.
Enough of the meandering commentary...
If you are interested in software lifecycles, are following the BYOD/empowered-user movement or have simply wondered "Is it as bad everywhere else as it is here?", I suggest reading this (warning: some economy loonies in the comments) and some additional (#longread) commentary by IT people located here.
One of Microsoft's basic strategies has been to get developer tools into the hands of users and to simplify "programming" so that those users can be effective. They even invented a term for this new class of empowered users -- "power users". Excel is a result of this strategy. In addition to a rich language of functions, there is also a macro facility built on VBA. Word, Excel and Outlook are all automate-able tools for people who aren't professional programmers. Even Microsoft's Access database is more aimed at "power users" than full-fledged coders. (Plenty of people who wouldn't know how to start coding an IIS/SQLServer or LAMP web site are perfectly comfortable in Access. I'll even go out on a limb and say that Access works great for some projects. I've certainly said that before.)
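To make that concrete, here is a minimal sketch of the kind of automation that strategy puts within reach of a "power user". It drives Excel's COM object model from Python via the pywin32 package (my choice of language, not Microsoft's prescription; the same calls work nearly verbatim as a VBA macro, and the file path is just a placeholder):

import win32com.client

# Start Excel and create a workbook; this is the same object
# model that VBA macros script from inside Office.
excel = win32com.client.Dispatch("Excel.Application")
wb = excel.Workbooks.Add()
ws = wb.Worksheets(1)

# Build a tiny report: headers, a couple of values and a formula.
ws.Cells(1, 1).Value = "Quarter"
ws.Cells(1, 2).Value = "Sales"
ws.Cells(2, 1).Value = "Q1"
ws.Cells(2, 2).Value = 1000
ws.Cells(3, 1).Value = "Q2"
ws.Cells(3, 2).Value = 1250
ws.Cells(4, 1).Value = "Total"
ws.Cells(4, 2).Formula = "=SUM(B2:B3)"

wb.SaveAs(r"C:\temp\report.xlsx")  # placeholder path
wb.Close()
excel.Quit()

The point is less the specific calls than the low barrier to entry -- and note that nothing here is tested, versioned or reviewed, which is exactly the risk described below.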
Microsoft is continuing this strategy with its latest "BI" initiatives. BI takes the drudgery of coding reports and puts it squarely in the hands of the users, which actual developers tend to like, while exploiting the power of modern computers. More on-target for this blog entry, it allows people to do even more with Excel. Great -- as long as we realize that this strategy is not without risk. Many users (and their managers) do not grasp why IT departments spend vast sums on QA. Perhaps those users need to lose vast sums before realizing that QA is an investment, not just an expense.