I cast my vote late last week, in order to avoid the long lines that were expected (and are indeed materializing) at the polls today. Vote early, vote often, is my motto :-)
I'm not one of the many undecideds, but rather made up my mind several weeks ago. Thusly robed in the extreme pleasure and honor of being able to cast a private vote in this democratic process, I strolled over to our local voting precinct - and waited about an hour to weave my way through the lines. When I finally got to the voting booth, I was surprised and delighted to find that our county had installed electronic voting machines. I wasn't able to read the manufacturer's label, and I expected I'd draw some unwanted attention if I reached around behind or under the machine to look. The use case for voting was really quite straightforward: the polling officer identified my precinct, picked up a block that matched it, and inserted it in the machine, bringing up the appropriate ballot for me. Voting was easy on the touch screen display, and it was even possible to change votes and go back (I know, I intentionally explored the edges of the use case).

I wish a paper copy had been created; it seems like such a simple thing to do and, in this era of hanging chads and such, a prudent safeguard. I was also surprised to see that the machines had no UPS; they were plugged straight into the wall, and one wonders what checkpointing is done in the event power fails. While I'm on this riff of surprises, there were no obvious parity checks either: keeping a manual count of voters per machine and then matching it against the votes actually cast would be another simple and obvious check and balance.
As the day unfolds, I'll be glued to my favorite Internet radio and then hosting an election party where we'll watch the returns.
Software architecture, software engineering, and Renaissance Jazz
From archive: November 2004
I've told this story from time to time in my public lectures and I've decided to retire this tale, but before I do, I'll preserve it for reference in my blog.
My wife and I designed and built a home a few years ago, and being an alpha geek I just had to fill it with all sorts of automated elements. I hired a contractor to pull the wires (he put about 5 miles of Cat 5 wires in the walls) but as CTO/CIO of the home, I installed the rest of the network. Shortly after I booted the house for the first time, we invited some friends over for dinner. They arrived at the appointed time, rang the doorbell - but we never heard it. They knocked on the door - and we didn't hear that either - so they finally called us on their cell phone, while standing at the front door.
My doorbell had crashed.
Now, doorbells have very simple use cases: you push the button, it rings a tone inside the home. However, my implementation of said doorbell was a bit more complex, and I failed my user base by letting the bones of the underlying technology stick through. You see, the doorbell sends a signal to our PBX system, which I hacked to extract events (such as the doorbell being pressed). That event gets routed to an application server - running a non-Macintosh, non-Linux operating system, I might add - which has a daemon that intercepts various events (such as from the PBX, the security system, and so on) and in this case would send an event to the A/V subsystem, where a seasonally-appropriate and pleasant tone would sound through the home. Alas, I failed to use Rational's own tools (Purify in this case) and I had a memory leak in my application server. The solution was to reboot that server, which brought the doorbell back to life.
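At its heart, the routing described above - an event raised by the PBX, intercepted by a daemon, and forwarded to the A/V subsystem - is a publish/subscribe dispatcher. Here is a minimal sketch of that shape; every name in it (the event string, the chime handler) is a hypothetical illustration, not the actual home-automation code:

```python
# A minimal publish/subscribe event router, sketching the shape of the
# daemon described above. All names here are hypothetical.
from queue import Queue


class EventRouter:
    """Routes named events from sources (PBX, security) to handlers (A/V)."""

    def __init__(self):
        self.handlers = {}    # event name -> list of handler callables
        self.queue = Queue()  # incoming events awaiting dispatch

    def subscribe(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)

    def publish(self, event, payload=None):
        self.queue.put((event, payload))

    def dispatch_one(self):
        # Pull one event off the queue and fan it out to its subscribers.
        event, payload = self.queue.get()
        for handler in self.handlers.get(event, []):
            handler(payload)


chimes = []
router = EventRouter()
router.subscribe("pbx.doorbell", lambda _: chimes.append("ding-dong"))
router.publish("pbx.doorbell")  # the button press, as seen by the daemon
router.dispatch_one()           # the A/V subsystem sounds the tone
```

The moral of the story survives the sketch: when the dispatcher process leaks or dies, every "simple" device wired through it dies with it.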
I have a very demanding customer (my wife) who really doesn't like to have my software lying around on the floor, and so she was at first annoyed and then amused at the incident. The good news is that I've ripped out the first implementation (I'm not saddled by legacy software here) and my doorbell now works as any good little doorbell should, with all the complexity hidden below the surface.
Yet another example of why the primary task of the software development team is to engineer the illusion of simplicity.
I'm back from holiday but am now 12 hours from wheels up again to the east coast. I have some thoughts regarding Microsoft's software factory initiative which I've not had time to post, but will do so before the end of the week.
Speaking of holiday, our family took a cruise, and while a Fun Time Was Had By All, I was most taken by the degree of automation onboard. Docking the boat appeared to be an almost hands-free operation, and the stability system - even under moderate seas - was quite amazing in keeping the vessel upright (which I suppose is always an important use case). Doing a dig of a ship's operational system is on my list for the Handbook, although I've not yet started to identify which one to study.
I've been in the midst of planning Rational's projects with IBM Research for the coming year. We invested several million for Rational alone this year and will do the same for next year in the areas of model-driven development, quality, change management, middleware tools, and collaboration. A few of these plays are, quite frankly, long shots, but you need to do a few of them to seed and nurture the future. Some of these efforts will bear fruit in terms of yielding tangible products, while some will not, but even from these we'll learn a great deal. The depth of talent inside IBM Research is really quite amazing, and it's quite invigorating to work with these folks who are scattered in labs across the world.
I'm back from travel again, this time from trips to New York City and Chicago where I've been working with a number of clients on their emerging enterprise architectures. The common theme I've encountered is that large enterprises are beginning to see their way out of the global economic slump and so are turning their attention to what they can do to extract value from their legacy systems by unifying the artifacts and activities that reside across existing silos and by unifying their customer experience.
Speaking of legacy systems, one gentleman introduced me to the new phrase heritage software as a euphemism for old, tired software. A noble concept, but a rose by any other name is still a rose. It reminds me of phrases such as pre-owned vehicle and arbitrary termination of life.
Service-oriented architectures (SOA) are on the mind of all such enterprises - and rightly so - for services do offer a mechanism for transcending the multiplatform, multilingual, multisemantic underpinnings of most enterprises, which typically have grown organically and opportunistically over the years. That being said, I need to voice the dark side of SOA, the same things I've told these and other customers. First, services are just a mechanism, a specific mechanism for allowing communication across standard Web protocols. As such, the best service-oriented architectures seem to come from good component-oriented architectures, meaning that the mere imposition of services does not an architecture make. Second, services are a useful but insufficient mechanism for interconnection among systems of systems. It's a gross simplification, but services are most applicable to large-grained, low-frequency interactions, and one typically needs other mechanisms for fine-grained, high-frequency flows. It's also the case that many legacy - sorry, heritage - systems are not already Web-centric, and thus using a services mechanism that assumes Web-centric transport introduces an impedance mismatch. Third, simply defining services is only one part of establishing a unified architecture: one also needs shared semantics of messages and behavioral patterns for common synchronous and asynchronous messaging across services.
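The large-grained versus fine-grained distinction is easy to see in code. The sketch below contrasts a chatty, fine-grained style (one field per round trip) with a coarse-grained style (one document per round trip); the service, its methods, and its data are all hypothetical, and each method call stands in for one network round trip:

```python
# Hypothetical service contrasting fine-grained (chatty) and
# coarse-grained service interactions. Every name here is illustrative.
class OrderService:
    """A toy service; each method call stands in for one round trip."""

    def __init__(self):
        self.round_trips = 0
        self._order = {"status": "shipped", "total": 42, "carrier": "ACME"}

    def get_order_field(self, order_id, field):
        # Fine-grained: one field per round trip.
        self.round_trips += 1
        return self._order[field]

    def get_order_summary(self, order_id):
        # Coarse-grained: the whole document in one round trip.
        self.round_trips += 1
        return dict(self._order)


svc = OrderService()
fine = {f: svc.get_order_field(1, f) for f in ("status", "total", "carrier")}
trips_fine = svc.round_trips      # three round trips for three fields

svc = OrderService()
coarse = svc.get_order_summary(1)
trips_coarse = svc.round_trips    # one round trip, same information
```

When the "round trip" is an in-process call the difference is noise; when it is a Web-protocol hop, the fine-grained style multiplies latency, which is why high-frequency flows usually want some mechanism other than services.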
In short, SOA is just one part of establishing an enterprise architecture, and those organizations who think that imposing an SOA alone will bring order out of chaos are sadly misguided. As I've said many times before and will say again, solid software engineering practices never go out of style (crisp abstractions, clear separation of concerns, balanced distribution of responsibilities), and while SOA supports such practices, SOA is not a sufficient architectural practice.
One more thing before I go. If I were a betting man, I imagine my ability to predict the future success of many of these organizations would be quite high (and I don't mean their technical success, I mean the very life of the company itself). There are some organizations I encounter in which there's a tight connection between the CEO and CTO/CIO (and development teams) - these are the companies I expect will flourish, for at the highest levels of the company they understand the strategic weapon that lives in software, and the importance of building a development organization that's able to execute predictably and with agility. Sadly, there are too many organizations where the highest level of the company simply does not grok the value of software - and these are the organizations that will be overtaken.