One thing you start to notice as you move from working on strictly local projects (i.e., projects within your home country, in my case the United States) to global projects is that dates and times are very tricky things.
I've noticed recently that whenever I have to write a date and time, it turns into a very long piece of text. Whereas I used to write something like:
4/2, 2 PM
I now find myself writing:
2 Apr, 2 PM Eastern US
Of course, the reason is that different countries use different date formats, and times are relative to your time zone. In Europe, they would write April 2nd not as "4/2/05" but rather as "2/4/05".
The other thing to always remember is to tag any time with a time zone. If you don't include it, someone may think that your "2 PM" is in their local time zone, and be several hours ahead of or behind the intended time.
So in short, if you have to write a date, and there's any chance someone from outside your immediate timezone or country might see it, always write something like:
24 Mar. 2005, 2 PM Eastern US
It's a lot to write, but if you consider the cost of potential time wasted from key people not showing up because you were ambiguous, that extra text is a small price to pay. If you have a modern calendar scheduling program like the one in Lotus Notes, it will do the conversions in each attendee's local calendar.
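For the programmers among us: a time-zone-aware date library will do these conversions for you, which is exactly what those calendar programs do under the covers. Here's a minimal Python sketch (using the standard zoneinfo module; the zone names are IANA identifiers, and the specific meeting time is just an example):

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

# An unambiguous meeting time: date, time, AND time zone all explicit.
meeting = datetime(2005, 3, 24, 14, 0, tzinfo=ZoneInfo("America/New_York"))
print(meeting.strftime("%d %b %Y, %I %p %Z"))   # 24 Mar 2005, 02 PM EST

# The same instant, rendered for an attendee in Beijing.
beijing = meeting.astimezone(ZoneInfo("Asia/Shanghai"))
print(beijing.strftime("%d %b %Y, %I %p %Z"))   # 25 Mar 2005, 03 AM CST
```

Note that the library even knows whether daylight saving time was in effect on a given historical date, which is one more thing you don't want to compute by hand.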
Just remember not to schedule a meeting with people in Beijing at 1 PM Eastern US, which is 2 in the morning there.
Wow. I really got psyched by this article about a potential IBM service that would bounce spam back to the originating computer. I love the thought of spammers triggering a self-inflicted denial-of-service attack on themselves.
I personally think we should create 10 copies of each spam email before spamming the spammer. The doctrine of massive retaliation, applied to the world of spam.
Things on the IBM-Lenovo/PCD segmentation project are finally starting to stabilize, so I'm catching up on some reading this weekend.
First, I've been re-reading sections of Use Case Modeling
by Kurt Bittner and Ian Spence. My motivation for reading this book is twofold: First, it's a very thorough book on many aspects of use case modeling, which I believe is a very good way to document a system's requirements. Second, I mentioned a while back that I'm working part-time with Rational on defining some requirements for future team products. Well, the person I've been working with exclusively so far is Kurt (one of the authors of this book), so reading the book helps me understand his modus operandi vis-à-vis requirements, which makes the working relationship more productive. If you model requirements with use cases (or are interested in doing so), you should give this book a read.
Another good book in this area is Writing Effective Use Cases
by Alistair Cockburn, which is probably the best starter book on use cases.
The other book I've been reading this weekend is the fairly recent Software Endgames
by Robert Galen. This book talks about the critical time between when you're almost code complete, rigorous testing is starting, and you have a looming move-to-production date. I'm reading this book because it's very relevant to where we are on the IBM-Lenovo separation project (3 weeks into end-to-end integration test) and also because I'm in the process of writing an article for The Rational Edge
on "effective change management on large software projects", which will *knock on wood* be ready for the May issue.
So far this book has been interesting and dense with good advice on defect triage and change management. One thing that really impresses me about this book: I assumed (wrongly) that the author would focus on endgames in traditional waterfall projects, but he consistently discusses each best practice's relevance to different software lifecycle styles (waterfall, iterative, eXtreme).
Finally, I've been making my way through this month's issue of The Rational Edge
, and have particularly enjoyed the article "A behind-the-scenes look at software development books
" which will be very interesting to aspiring book authors, like me.
Hey everyone. We're looking for feedback to improve the IBM developerWorks blogs for our readers. If you have the time and the inclination, please participate in this quick survey
. It took me about four minutes to complete.
The March 2005 issue of the Rational Edge
is now online.
I wrote a couple of book reviews for this one: Bruce Schneier's Secrets and Lies
and Suzanne and James Robertson's Requirements-led Project Management
(links to reviews, not books for purchase).
Though I haven't had a chance to read it yet, I'm really
looking forward to Walker Royce's article on "Successful software management style
". Walker wrote a really good book
on project management a couple of years ago that is worth a read if you're a project manager or a technical leader on a software project.
If you're not a Lotus Notes user you should stop reading now.
I'm personally a big fan of Lotus Notes. A lot of people have various gripes about it, but it's still a very useful platform and was revolutionary when first introduced.
One thing that was formerly difficult with Notes was pointing someone at a particular Notes document in real-time. It's quite trivial to embed a native Notes document link within a Notes email, but frequently during a phone call or instant message conversation, you want someone to look at a particular Notes document in a shared Notes database, and instant message programs don't understand native Notes doc links.
Luis Suarez (from IBM Netherlands) recently showed me how to do this and I'd like to pass it on to fellow Notes users out there.
Prerequisites
- I'm using Lotus Notes version 6; I'm not sure if this works in earlier versions of Notes
- The Notes address bar must be visible. If it's not visible, click some other toolbar (the stuff at the top of Notes under the menus) and check "Address". (See the screenshot later in this entry if you don't know what the Notes address bar is.)
Sending the link
- In Notes, go to the document you want to point someone to
- Click the address bar drop-down control (at the right of the address bar) to refresh the document's address (it doesn't automatically refresh when you open a new document)
- Copy the address from the address bar
- Paste the address into an instant message chat window and send (it should be something like Notes://SERVER_ID/DOC_ID)
- Tell the recipient to either click the link, or if the link isn't clickable, to copy & paste it into their web browser of choice (or the Notes address bar for that matter!)
- Notes will open the document from the link
A screenshot of the Notes address bar (circled in red) is below, if you're confused by what I'm talking about:
For those of you Star Wars fans out there (like me) who are eager for Episode III: the full trailer. (QuickTime required)
What the heck's a "trailer"? See this post
Alan Brown posted some comments
on the "Software Factories" book, written by Microsofties Jack Greenfield
, Keith Short
, Steve Cook
, and Stuart Kent
I agree with Alan's comments. It's a really informative book that provides a thorough summary both of the evolution of software design styles over time and of the process of design. But the authors' tone when speaking about UML and MDA is way too negative for my taste. Still, in my opinion, this book is worth your money for the first couple of chapters on software design alone.
It's going to be really interesting to see the responses by the authors on their blogs over the ensuing days. This ties back to the whole "UML vs. Domain-specific Languages debate" which I mentioned a while ago
Lee Nackman (Rational's CTO) at EclipseCon
last week talked about Eclipse's history and future. I'd never heard the full story, so this is some nice background on a great tools platform.
I've pinged EclipseCon to see if they'll post the audio or text of Lee's speech. Slides are good for guiding a speaker but bad for those who weren't in the room.
CNET has an article on the transition of IBM's PC Division from IBM to Lenovo
. The "separation" part is what I'm working on; after PC Division is separated, I won't be working with Lenovo or PC Division.
I originally thought the following story was too esoteric for my dW blog, but a friend I won't name
encouraged me to share it, so why not. It's going to take a while to get there, but I assure you this blog is
related to software engineering and may even make you feel better.
Here it goes ...
One of the most entertaining and weirdest books I've ever read is The Illuminatus! Trilogy
by Robert Shea and Robert Anton Wilson. In the book, there are two secret societies who have been fighting each other throughout the history of human civilization. One crowd is bent on world domination and believes that it's their destiny to bring order to the world's peoples. The other group believes that the universe is naturally chaotic and that it's unnatural to try to impose order. This second group is essentially against things like hierarchies, power structures, and systems in general (as you may have guessed, this book was written in the early 1970s). They follow a philosophy they call Discordianism
and call themselves Discordians.
The Discordians named themselves after Discordia, the Roman goddess of strife and chaos. In Greek mythology, Discordia's counterpart is Eris
. Eris, of course, is most famous for (mythologically) initiating the Trojan war
So in The Illuminatus! Trilogy
, the Discordians have a mantra they say to celebrate instances of disorder and chaos. This mantra is "Hail Eris!"
How does this relate to software engineering? Well, if you've worked on software or systems engineering projects long enough, you've probably seen a fair amount of chaos. On most projects, there are usually several people who seem to do and say things that cause increasing levels of disorder. You know the type. The person who sends an inflammatory or uninformed email and copies everyone. Or the person who keeps the lid on a problem until it is too late to fix it.
For a long time, these types of people really stressed me out. But these days, I try to avoid the stress by finding humor in chaos-inducing actions. I do this by imagining that these people are actually just practicing Discordians, whose philosophy teaches them that disorder and chaos are natural and therefore good. So when I read the uninformed email that will cause untold confusion to the project, or when an executive declares a solution-breaking requirement in scope, I quietly repeat the Discordian mantra of "Hail Eris!" to myself.
Try it - it will make you feel better.
Goran Begic of Rational and I recently co-wrote an article on Code Review
, a new set of static analysis functionality in RAD
that validates your Java code against industry best practices. Note that the best practices are implemented as rules, so RAD/RSA ship with several hundred best practices/rules, but you can also define your own through the extensibility mechanism.
Anyhow, check it out if you're interested. Feedback, via blog comments or email, is very welcome. I'll write more on my involvement with Code Review, which actually goes back several years to earlier technology out of IBM Research, in a future blog entry.
1:40 PM Eastern update:
I forgot to mention that new IBM blogger Wayne Beaton recently posted a blog entry
on Code Review that may be interesting to folks who are interested in the article above.
PS - I have a few more book reviews coming in the March
issue of the Rational Edge
which should be online within the next couple of days. This month I review Secrets and Lies
(recommended book for February) and Requirements-led Project Management
CNET has a glowing article
that more or less declares the Java tools platform "battle" won by Eclipse
I don't see how Sun can keep away from Eclipse for much longer. It looks like Don Box might end up being right with his prediction #2 for 2005
that Sun would embrace Eclipse. If Sun *does* end up joining Eclipse, I'll be closely monitoring (Sun President) Jonathan Schwartz's blog
to see if he mentions it and if so how he spins it.
Anyone want to venture a guess as to if/when Sun will start supporting Eclipse?
As I mentioned in my last entry, I'm now working on the IT portion of the mega-project to segment PC Division function and data from the rest of IBM's IT as part of the sale of that division to Lenovo.
This is a large project.
Before this I had a nice little mental model of three sizes of "enterprise" projects:
- Application development, where you work on one "application" (an ambiguous term indeed), focusing on its functionality, quality, and interactions with external applications.
- Application integration, where you're responsible for making a couple of applications work together to support a piece of a business process. The focus here is on effective architecture for application connectivity: performance, data consistency, maintainability ... all that good stuff.
- Business-process integration, where you're responsible for making a whole bunch of applications work together to support an end-to-end business process or set of related business processes, like going from a sales lead, through an e-commerce transaction, to order fulfillment. Here you're really more of a business analyst, and your large-scale design decisions revolve around "which application should do what". The "whats" you define get designed at the next level of detail by a bunch of application architects.
I think that my work on PCD segmentation has led me to realize two new related categories of projects: "acquisition" and "divestiture".
These categories are of course two perspectives on the same business event: Company A buys all or some of Company B. From Company A's perspective it's an acquisition; from Company B's perspective it's a divestiture (assuming that it only sold some of its business, as is the case with IBM's sale of PC Division to Lenovo).
The reason that this is a new scale of project is that instead of focusing on several
large scale business processes (as is the case in my "business process integration" type), you're focused on all
business processes. This means that all
applications are in scope, and in the case of IBM (a Fortune 10 company), we have about 5,000 of them. As you can imagine, it's impossible for mere humans to reason about 5,000 applications.
How do we manage this kind of scale? That old management tool: hierarchy. Someone (I don't know who) made the good decision to logically divide the landscape by "towers". A tower in this context is simply the name we've given to describe a grouping of applications by business process area like "Market and Sell" (e-commerce apps) or "HR". We have twelve towers, each with lots of applications under them. In addition to the towers are "the geos": North America, Latin America, EMEA (Europe, Middle East and Africa), and AP (Asia Pacific) who are responsible for geo-specific or country-specific applications, whereas "the towers" are generally responsible for applications that are used in many countries around the world.
I'm on the program team, responsible for change management (this task deserves a whole other blog entry), so I work with the towers and geos, not applications. Whenever a large-scale change request comes in, I work with the lead architect to determine which towers and geos are probably impacted based on the nature of the change, then work with those towers/geos to do change assessments to figure out if and how the change should be accepted. They go off and work with the project managers of the applications in their geos/towers and then roll the results up to me. I roll them up to the project and deal executives, who make the yes/no decision on the change.
So that's the nature of my current project. It's very time-consuming as you might imagine, which is why I haven't posted lately. On the bright side, I just had someone assigned to help me administer change management yesterday, so perhaps this will lead to a decrease in workload and more time to blog. But not likely; this seems like one of those projects where there's an almost infinite amount of stuff to do, so if my workload goes down around change management, new tasks will emerge to suck up the freed-up time.
Luckily, this is a project with a well-defined end point (whenever IBM and Lenovo declare the IT "separated") so this pace shan't last forever.
I hope it doesn't sound like I'm complaining about this job. I actually enjoy it in a weird way; the pace and complexity really make you focus. The only thing that's a tad frustrating is that the change management piece I'm working on is really more project management-oriented as opposed to technically-oriented, especially at the level of abstraction at which I'm working. But to be perfectly blunt, I have a good reputation for leading people and teams through complex problem-solving activities, and change management needed some help, so there you go. The glass-half-full side of the equation is that it's always good to see "the world" from different perspectives and this experience has really improved my knowledge and skill around estimation, which is a key skill both for project managers and technical leaders.
Speaking of which, I picked up Kent Beck's recently released Extreme Programming Explained, 2nd Edition
last night. As you probably know, one of the tenets of the Extreme Programming methodology is "embrace change". I want to read this book and assess the "embrace change" philosophy vis-à-vis a massive project like PCD segmentation, to figure out how far this philosophy scales. If I come to any sort of conclusion, I'll write it up.
After almost a full month of blogging every day, I've only blogged twice in the past week. Why? A couple of things. One, my wife, my son, and I have all had colds, which take away a lot of energy, but two, I've actually started work on two different new jobs within IBM within the past two weeks.
The first job is working on the technical leadership team that's executing the separation of PC Division IT from the rest of IBM IT as part of the work being done to complete the sale of PC Division to Lenovo. The second job involves some product work with Software Group that I can't reveal under pain of death. All I can tell you is this: if completed successfully, it will have a more profound impact on the general human condition than New Coke
and the movie Waterworld
So back to job #1.
You might guess that the application portfolio and infrastructure for a $94 billion e-business might be a tad complex. You'd be right. To complete the divestiture we have to either physically or logically separate data that is specific to PC Division from the rest of the super-system. This isn't easy. The best analogy I can think of is a surgeon operating on someone's body. But instead of performing major surgery on one organ (say, the lungs), you have to perform moderate surgery on every organ in the body. And the patient has to keep breathing. And you have to complete it all by a date stated in the divestiture agreement, which drives a lot of legal requirements.
So this is why my blogging has decreased recently. On the bright side, I have some really meaty technical stuff to blog about now. Over the previous six months, I haven't been very technical. I've been spending most of my time trying to get funding for some ideas that I wanted to implement to help large IT organizations (like IBM) be more productive. I may write about "the hunt for funding" another time - it's not technical, but it's interesting in its own way. The good news is that the hunt ended happily, although not in the way I originally expected.
So anyhow, if you read this blog regularly, look for more stuff in the coming months on my IT-related work related to the closing of the PC Division sale. Some of the stuff I won't be able to write about, because of legal reasons or just general sensitivity, but there should be a lot of general "problems of big IT projects" that I can talk about freely.
Hopefully you'll enjoy this kind of discussion about work on a real large project. Not to sound derisive, but I think this is a view that's under-addressed in the technical blogosphere. Many of the people who are drawn to blogging work more on strategic issues which are more conceptual and less applied. I'm not knocking conceptualists at all - far from it - I'm just saying that there seems to be an uneven balance between "thinkers" and "doers" in the technical blogosphere. I try to keep a foot in each world. Working on real projects keeps you firmly grounded in reality, which is valuable, because things are always much simpler in theory and it's very easy to unfairly trivialize some of the underlying complexities that must be grappled with. Being conceptual, on the other hand, allows you to look past some of the non-essential complexities to understand the true nature of a problem, which frequently helps you to find a breakthrough solution.
Alas! This entry is starting to sound oh so very serious indeed, so I shall stop now. :-)
I found a Slashdot write-up
that explains how Google Suggest
works, which I'd been wondering about
The embarrassing thing is that in my original blog, I linked to a Joel Spolsky write-up on Google Suggest
, where he linked to the Slashdot explanation. I think I must have eagerly clicked through Joel's link to the Google Suggest webapp and never saw the Slashdot link at the bottom of his blog entry.
I've lately started on a project that's considering using an open source software framework to provide base functionality (no, it's not Eclipse) and as such have been belatedly studying up on the economic and legal implications of using open source software in commercial products.
One book that caught my eye was Understanding Open Source and Free Software Licensing
by Andrew St. Laurent. Luckily before I plunked down $20 for the dead tree version of the book, I discovered a free (and legal!) version available from O'Reilly's Open Book Project
There are a number of current books as well as a number of out-of-print books. Of the available books, three of them are related to open source and I will probably read them online here in the coming couple of months. They are:
I also plan to read a paper called "Reusing Open-Source Software and Practices"
by fellow IBM bloggers Alan Brown
and Grady Booch
I've been reading a bunch of articles about HP over the past couple of weeks, and especially today with Carly Fiorina's departure
. One recurring theme in many of these articles is the "conventional wisdom" that HP should be broken up to realize true shareholder value.
Somehow this all sounds strangely familiar.
In the early '90s, when IBM was hemorrhaging cash and struggling to survive, the conventional wisdom was that IBM should be broken up to realize true shareholder value. Of course, everyone knows what happened. The IBM board hired Lou Gerstner
to come in to save the company
. After not very much time on the job, Lou bucked conventional wisdom by saying that IBM's strength was in the comprehensiveness of its products and solutions, and that it could serve as a one-stop shop to address customers' true IT need ... integration
In hindsight, everyone recognizes that Lou made the right decision to keep the company together. These days, we have the capability to do everything to support an enterprise's transformation: consult
with C-level execs, sell mainframes
, distributed servers
, POS systems
, and best of all, hook it all up for you and make it run
, and keep it running
at 99.999% uptime.
No, I'm not trying to sell you on IBM's value proposition; we have lots
of marketing and sales folks to do that. I'm just saying that there are a lot of positive things to be said for a full-service provider. In fact, many people feel that what Fiorina was trying to do all these years was to copy IBM's strategy by turning HP into a full-service provider. Don't forget that the year before we bought PwC Consulting, HP was trying to buy them so that they (HP) could do C-level consulting.
But alas, here we are. Fiorina's out and many analysts are calling for HP to be broken up, with much emphasis on the highly profitable printing business. Yet they rarely mention the lesson that IBM learned in the early '90s.
But one thing to consider is that IBM was very lucky to get a leader like Lou, who not only had the courage and vision to buck conventional wisdom and make unpopular (but correct) strategic choices, but was also able to put the management system and the right people in place to realize the strategy. Because after all, goals without the ability to execute are simply dreams.
Carly Fiorina resigned this morning
from her job as CEO of Hewlett-Packard. HP of course is another huge IT company and is probably IBM's closest competitor in terms of size and breadth of offerings.
Here's a cached version of Fiorina's HP biography
, since HP already took the official one down
A quasi-frequent topic of discussion in IT land is "What is software architecture?" There are many opinions and strong feelings on this topic.
Simon Johnson eruditely quotes a 2000 year-old book
on building architecture that basically says that it is practically impossible to define architecture because of the inconsistent usage of the vocabulary on which the discipline is built.
Some SEI gurus define it formally:
The software architecture of a program or computing system is the structure or structures of the system, which comprise the software elements, the externally visible properties of those elements, and the relationships among them.
In a similar formal manner, IEEE and RUP define architecture as:
The highest level concept of a system in its environment. The architecture of a software system (at a given point in time) is its organization or structure of significant components interacting through interfaces, those components being composed of successively smaller components and interfaces.
Ralph Johnson dislikes this definition as too focused on the development team's viewpoint. He prefers something simpler, along the lines of "architecture is whatever the expert developers say the important stuff of a system is".
Martin Fowler takes Johnson's view as a starting point and describes two varieties of architects
: One is Architectus Reloadus
, a snooty fascist who likes to make all of the decisions, uses fancy words (like "eruditely"), and quotes specifications, all in an attempt to intellectually dominate (Reloadus
, of course, was inspired by the snooty fascist who babbled incoherently
at the end of "The Matrix Reloaded
"). The other type of architect that Fowler identifies is Architectus Oryzus
who is more a guide for the development team, rather than a dictator.
What do I say an architect is? Well, honestly, I try not to. When non-techie people ask what I do for a living, I say "I design software systems for IBM". Usually their eyes glaze over in a manner that leads me to believe that it wouldn't make much difference if I replaced "I design software" with "I architect software".
When a techie person asks me what I do, I still say "I design software systems" because many people in-the-know react badly to others who proudly describe themselves as architects, both because of the negative connotations associated with the Architectus Reloadus
stereotype and because to call oneself an architect, when you possess only ten years of experience in the software field (as I do), is more than a tad pretentious when you consider the accomplishments of people like Rob High, whose title is "Chief Architect of the WebSphere platform" or Dave Cutler, who led the designs of both VMS and Windows NT.
So in summary, I don't like to call myself an architect, even though my responsibilities on projects frequently meander into the territory described by the definitions above, just because of the ambiguities and negative connotations associated with the term. I just call myself a software designer and occasional programmer.
Still, I use "architect" often in everyday conversations because it's part of the standard vocabulary, ambiguous as it is. It's a formalized role in Global Services and Software Group, and it's even part of a key product's name: Rational Software Architect
, which, unlike the term "architect", is unambiguous.
As I type this, it's February 8th, 11 AM on the US East Coast, which means it's midnight on February 9th in Beijing, China. It's a very special day over there because it's Chinese New Year.
I believe that there are a few readers of this blog who live in China so I just wanted to say a quick "Xin1 Nian2 Kuai4 Le4!" ("Happy New Year!") to you.
For non-Chinese readers who don't understand the meaning of the numbers next to the Chinese words, they represent the tones of the Chinese language. The same syllable in Chinese can mean very different things depending on its tone and its context in the sentence.
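As an aside for the programmers out there: turning numbered pinyin like "Nian2" into the accented form "Nián" makes a nice little string exercise. Here's a rough Python sketch; note that the vowel-placement rule is simplified (real pinyin has a few more edge cases, such as ü written as "v"), so treat it as an illustration rather than a complete converter:

```python
import re
import unicodedata

# Combining diacritics for tones 1-4; tone 5 (neutral) gets no mark.
TONE_MARKS = {1: "\u0304", 2: "\u0301", 3: "\u030C", 4: "\u0300"}

def mark_tone(syllable):
    """Convert numbered pinyin like 'Nian2' into accented pinyin like 'Nián'."""
    m = re.fullmatch(r"([A-Za-z]+)([1-5])", syllable)
    if not m:
        return syllable  # not numbered pinyin; leave untouched
    body, tone = m.group(1), int(m.group(2))
    if tone == 5:
        return body
    low = body.lower()
    # Simplified placement rule: 'a' or 'e' takes the mark; in 'ou' it goes
    # on the 'o'; otherwise the last vowel gets it.
    if "a" in low:
        i = low.index("a")
    elif "e" in low:
        i = low.index("e")
    elif "ou" in low:
        i = low.index("o")
    else:
        i = max(j for j, c in enumerate(low) if c in "aeiou")
    marked = body[: i + 1] + TONE_MARKS[tone] + body[i + 1 :]
    return unicodedata.normalize("NFC", marked)

print(" ".join(mark_tone(s) for s in "Xin1 Nian2 Kuai4 Le4".split()))
# Xīn Nián Kuài Lè
```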
For more on Chinese tones, check out this article
PS - Thanks to my Tai4Tai4 (wife) for help with the new year's greeting.
CNET has a fascinating article
on an emerging market where real money is used to purchase virtual resources in virtual environments like video games.
So it's now possible for someone to support their physical needs (food, water, clothing, shelter) by working in a purely virtual environment. At what point will people be able to have their virtual environments take care of their physical needs as well?
It's an interesting hypothetical question that many works of science fiction have dealt with and which The Matrix
popularized: If you had the choice to either live in a virtual environment where you could lead a spectacular lifestyle (e.g. a king, a president, a rock star, Grady Booch
, or some other type of supernatural being) or to live in the real world where you're just an "ordinary" person, which would you choose?
I recently updated the "Blogs I read" portlet on the right-nav. I especially recommend Ivar Jacobson's "postcards"
, which is more or less a blog. Jacobson, who was a key player in the creation of both UML
and the Rational Unified Process
(RUP), always has interesting and insightful things to say. I especially enjoy his comments on the RUP, which he both defends and criticizes, depending on the topic.
Now if he'd only add an RSS feed ...
bhiggins at us.ibm.com
Conan O'Brien seems very uncomfortable during this presentation with Bill Gates
at CES where the technology to be demonstrated kept failing.
Somehow when I see Gates in that big chair he reminds me of Dr. Evil
. I wonder how close he came to pushing the button that would have dropped Conan into the hot fiery pit beneath the stage.
"Let this be a reminder to you all that this organization will not tolerate failure."
would have impressed (and frightened) the CES crowd!
Like many "knowledge workers" these days, I primarily work from home, dialing into conference calls and connecting to IBM's intranet via a virtual private network (VPN) client.
This has some advantages, e.g. you save time driving to and from work and you save money on lunches and coffee.
But sometimes it's challenging.
The other day I got a box in the mail containing the Xbox video game "Star Wars: Knights of the Old Republic II: The Sith Lords
" (which I naturally bought on the web in used condition for 35% off retail).
One of the advantages of working on site is that you don't
have to walk by your HD TV and your Xbox and see the case of the new video game you just bought staring at you saying, "Play me Bill. Just for a few minutes. No one will notice. Just leave a message on Sametime that you're on a call".
But this path leads to the dark side. For even though I'd only intend to play for 20 minutes, soon it would be an hour, then two, and pretty soon I'd be writing my farewell blog entry.
bhiggins at us.ibm.com
I want to take a moment and explain one thing about this blog that might seem odd to regular readers.
I write about books frequently and link to two different web sites where you can buy them. The two book vendors I link to are:
Here's the deal. IBM developerWorks has a business partnership with the Knowledge Resource Center (KRC) to sell books, so if they carry the book that I'm discussing, I will link to them. This is IBM's preference, so since this is also IBM's web site, it becomes my preference.
Sometimes, especially in the case of business books, they don't have the book, and in this case I will link to Amazon.com, because I think they have excellent information on their web site and carry a huge catalog of titles.
Now, a couple of things. IBM has a business relationship with the KRC, so if you click through to them and buy the book, (I assume) IBM makes some money. Also, as I mentioned in an earlier post
, I own some Amazon stock, so if you click through and buy from Amazon, you are helping my stock portfolio, albeit in a very small way.
If either of these two facts bothers you, feel free to click through, make sure you have the right book (the ISBN is the best means), go to your favorite bookseller (virtually or physically), and buy the book there.
I'm not trying to dissuade you from buying from KRC or Amazon; on the contrary I'm perfectly happy if you make IBM money or help one of my stocks. But first and foremost, I want to make sure that readers trust this blog to be open and honest.
Now that I've got that off my chest, let me point you to the right-nav, where I have updated the "Recommended book" link for February 2005. This month I'm recommending Bruce Schneier's "Secrets and Lies", which I will discuss in a later post.
bhiggins at us.ibm.com
Joel Spolsky has a fun and insightful write-up
on companies that claim to hire "only the top 1% (or less!) of developers".
If you missed my earlier post
, I highly recommend Joel's recent book.
bhiggins at us.ibm.com
With much marketing bravado and fanfare
, Microsoft has officially launched its rejiggered MSN Search.
First reaction: "Wow, it looks a lot like Google."
Question: have any of Microsoft's major products ever resulted from an innovation that Microsoft either invented (as IBM invented the relational model
) or at least popularized (as Apple popularized the graphical user interface with the original Mac
)? As far as I can tell, Microsoft has never really invented a new product. Rather it takes other companies' good ideas as a starting point and uses its world-class user-centered design methodologies to improve them
. Or in the case of truly entrenched competitors, they bundle them with Windows for free.
But the ultimate question is, does this matter to customers and stockholders? I don't know. I am a Microsoft customer since I use several Microsoft products like Windows, Word, and Excel, primarily because those are what came with my ThinkPad, and I am too lazy to replace Windows with Linux and too cheap to buy a Mac. From a bottom-line perspective, Microsoft's stockholders must surely prefer that Microsoft "embrace and extend" their way to massive profits rather than inventing and failing to capitalize on a new idea, as the massively innovative Xerox PARC think tank continuously did in the 1970s.
I actually have a lot of respect for the engineers at Microsoft. Some of my professional heroes like Dave Cutler
and Clemens Szyperski
work there, and I've learned more about distributed computing through the writings of Don Box
than through any other source. But sometimes the attitude and tactics of the Redmond marketing machine get under my skin.
Speaking of which, according to this blog
, "[MSN Search is] the biggest global campaign since the introduction of the MSN butterfly". Serenity now!
bhiggins at us.ibm.com
recently turned me on to The Cluetrain Manifesto
. One of the many theses of the book is that when it comes to business transactions, the web shifts the power relationship from the supplier to the consumer.
I saw this in action the other day while shopping in Target. I was in Target to see if they had any new Star Wars toys (yes, I am an incorrigible dork in this regard) and noticed an advertisement on an overhead monitor for The Lion King
. The gist of the commercial was:
- purchase the breathtaking majesty that is The Lion King for your family to enjoy for years to come
- we will soon put The Lion King "back in the Disney vault", so buy it right away, while you still can
At first I felt the properly conditioned response: "Oh geez, The Lion King is supposed to be like the ultimate in animated kiddy-fare, and I have a one-year-old ... I better bee-line to the video section and buy my copy of The Lion King so that my child won't be deprived and resent me later".
But then my common sense came back to me. Just two nights earlier, I had bought Pinocchio
, which is already "back in the vault", via Amazon.com's marketplace feature [full disclosure: I own some Amazon stock]
. I.e. some other Joe Schmoe parent bought Pinocchio from a store and decided later to sell it via Amazon. Since Amazon is essentially a global marketplace, there were about 30 other sellers hawking the same exact DVD. This drove the price down to a very reasonable $24, plus $2.50 shipping. And I got the DVD three days later, not several years later when Disney marketing decided it was time to re-release it from "the vault".
So when I saw the Disney advertisement recommending an immediate purchase of The Lion King, although my first thought was to BUY NOW, my second thought was ... maybe later ... via Amazon (or half.com
, or eBay
, or wherever the price is right and the shopping experience is convenient). The big thing is that the power in determining when we may buy a certain movie has shifted from the supplier (Disney) to the customer (me, and thousands of other parents). And thanks to the simple distribution of the web over the Internet and the power of the postal service, anyone from San Diego to St. John's, Newfoundland has access to the same marketplace.
One thing that's really becoming more and more apparent, through conversations with Alan and others, reading the Cluetrain Manifesto, and just living with the web, is that many existing marketing mechanisms become less and less effective as customers gain greater access to better information about products and simple access to secondary markets.
This is one reason why I wish this blog got more comments. I see the hit counts ... there are people out there, but mostly in passive mode. I'd really like this to be a place that people feel they can come to for quality information and honest insight on software technology and the software industry. But I'd prefer that it be a conversation, not a broadcast. Once again I commit that if you make a comment or ask a question, I'll respond honestly, even if my response indicates that IBM isn't 100% perfect, 100% of the time (the de facto marketing position). This isn't purely an altruistic gesture - I want to help IBM improve its reputation as a company that "gets the web" and treats its partners and customers as intelligent humans, not as passive consumers.
If you would rather communicate via an alternate channel, I'll put my email address at the end of every blog entry from here on out (using some funky syntax to avoid spambots):
- Bill (bhiggins at us.ibm.com)
2005.01.28 Update: Here's a good write-up on getting started with Bloglines. Thanks to Bob Sutor and James Governor for the pointer.
I switched RSS readers from Feedreader to Bloglines, as per Bob Sutor's recommendation. It's got some really nice features that I like, the key one being that you can see which blogs other folks read, which is a great way to find quality content in the large sea of information.
I get emails occasionally from readers of this blog who for whatever reason don't leave comments (hi Clemens). Based on this anecdotal evidence, the readership of this blog consists of two general demographics:
- technical professionals with an interest in software engineering
- friends and family, many of whom are not technical professionals
You should stop reading now if you know what an RSS reader is.
For the non-technical friends and family out there (like my mom) who are not familiar with RSS readers, here's what I was talking about at the beginning of this post.
As you can see from the right side of this screen, I regularly read quite a few blogs (fifteen at the time of this writing). It's time consuming to jump from web site to web site to read all of the blogs, especially when many people update their blogs only occasionally. Someone, somewhere, sometime in the past decided that this was a big waste of time, and invented a technology to simplify the aggregation of frequently changing content that one is interested in. This technology is called RSS (really simple syndication).
RSS is quite simple: any content provider can create an "RSS feed" which is just the regular content (whether a blog or set of news stories) in a format that a computer program can understand. Individuals, like myself, can then download and install an "RSS reader" and point it at all of the content providers in which they are interested. So my RSS reader points to all of the blogs on the right side of the screen as well as a number of news sources (like CNET News
and The New York Times
). So instead of going out to all of the different web pages, I just go to my RSS reader and read all of the new content that has been created since the last time I checked.
It's incredibly handy. If you're feeling adventurous, I recommend checking out Bloglines
. Then as you go to different web sites, keep your eye out for little buttons that indicate an RSS feed is available. You can then add that web site to your list of content providers, and the information will be brought to you, rather than you having to go to it.
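For the more technical readers, here's what an RSS reader actually works with under the hood. An RSS feed is just an XML file listing items with titles and links. This minimal sketch in Python (the feed content is a made-up example, not any real blog's feed; a real reader would fetch it over HTTP from the feed's URL) shows how a reader pulls item titles out of a feed:

```python
import xml.etree.ElementTree as ET

# A tiny, made-up RSS 2.0 feed, like what a blog might publish at a
# URL such as http://example.com/blog/rss.xml.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>First post</title><link>http://example.com/1</link></item>
    <item><title>Second post</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def item_titles(feed_xml):
    """Return the title of each item in an RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title") for item in root.iter("item")]

print(item_titles(FEED))  # ['First post', 'Second post']
```

An RSS reader does essentially this for every feed you subscribe to, then shows you only the items that are new since your last visit.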
Here's a longer introduction to RSS
if you're interested. Beware, it's somewhat technical.
Vinay posted a response to my last entry
, listing the following as key criteria for the success of always-on global connectivity. My response got lengthy, so I decided to turn it into a new blog entry.
First Vinay's original list:
- Security (Wireless)
- Application performance (Wireless)
- Various forms of UI (WAP et al)
- And yes, will we have to "come" to work in future (Will there be a physical office)?
Here are my comments (in italics).
- Security - Agree - it is my understanding that popular current wireless security is pretty weak (though I am not an expert in this area).
- Application performance - Not sure I agree that this will be a big issue - currently the only scenario I find wireless performance lacking is when I need to transfer massive amounts of data (like the Rational Software Architect installation files) from another computer on a local area network. In this case wireless is the bottleneck and I plug in to a land-line for 100 Mb - 1 Gb/second performance. But if I'm transferring large files over a wide area network (i.e. the Internet), then the broader network is the bottleneck, not my wireless connection. If we're talking about a truly global wireless network, I guess I would be curious to find out what current transfer rates are like (e.g. for people who connect via a cellular connection vs. a wireless router located 20 meters away). I think that if the global wireless network of 2010 could reach current local area wireless network speeds of 11 Mb/second, most would consider this adequate performance. But this might be too presumptuous of me - perhaps in the future we will routinely send much richer and larger content (such as video) over the network, making greater performance necessary.
- Various forms of UI - Agree, but I'm an idiot when it comes to devices. I need to study up on this topic. I know that IBM sees this area (which we call pervasive computing) as a huge growth area over the next couple of years.
- Coming to work - I currently work from home, so to me this would be nothing new. That being said, there is a lot to be said for face-to-face collaboration. This isn't a technology issue - it's a human sociology issue. This is actually what I'm currently working on - "How can software development tools make geographically distributed teams (whether on the other side of the building or the other side of the Earth), work better together?" (If there are any computer science students out there reading this blog, I seriously encourage you to take classes in sociology!)
For an excellent read on some of the issues related to software development and distributed teams, I recommend you check out the paper "Collaborative Development Environments", by fellow bloggers Alan Brown
and Grady Booch
. It was written several years ago and was way ahead of its time and full of interesting ideas. It's available from Alan's personal website
Also, the IBM Cambridge Research lab is currently doing a large amount of "inventing the future" in this area. You can read about some of their efforts here
(one of my favorites is "Jazz").
Don, Grand Pooh-Bah of IBM software architecture, writes about wifi access on his transatlantic flight.
This makes me wonder (as I commented in Don's post): how will software development and software usage change when we can (more or less) guarantee network connectivity, everywhere, all of the time?
I have seen some solutions and proposed architectures to make web applications available for use in a disconnected mode, but will these still be needed if user connectivity is guaranteed?
The more interesting and general point is how technologies change in unexpected ways when major underlying assumptions disappear. For instance, in the mid-1990's, Microsoft and others invested large amounts of money in CD-ROM (and later DVD-ROM) based multimedia products such as Cinemania
, Complete Baseball, and Encarta
. The business case for these products was that people interested in some domain would pay good money to have access to large amounts of structured information. The underlying assumption was that CD/DVD-ROM was the best delivery mechanism because of the vast amounts of storage required. But when the Internet/Web became ubiquitous and made sites like IMDB
possible, it killed the underlying assumption that you had to package and distribute that information as $50-$150 disks. Sayonara, multimedia reference business.
How will guaranteed connectivity change the way software is created, and how will it affect current products? I don't know! Does anyone out there have any ideas?
Well this is turning into a real Windows month!
Something like 95% of desktop computer users use Microsoft Windows
, and anyone who uses it long enough knows it has some quirks
. If you're interested in "the rest of the story" behind those quirks, I recommend checking out Raymond Chen's blog subset on the history of Windows
. He's been a programmer on Windows for quite a long time, so he knows the rationale behind some of the weirdness in Windows. The interesting thing I've learned from Raymond's blog is that for every bizarre behavior of Windows, there is usually a very pragmatic design decision behind it - usually a trade-off required to maintain backwards compatibility from an earlier version of Windows or DOS.
So if you're interested in trivia straight from the source, check out Raymond's blog on Windows history. He also has info on other topics, like Win32 programming, but I'm not that interested in those things, so I've only linked to the items on Windows history.
IBM announced its earnings and revenue
for the 4th quarter and for all of 2004, beating expectations.
A few key financial stats for 2004:
- Revenue: $96.5 billion (that's billion with a 'b')
- Earnings: $8.4 billion
- EPS: $4.94
- EPS growth vs. 2003: 14%
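For the numerically curious, these figures hang together under a little arithmetic (a rough sketch only: it ignores the basic-vs-diluted share distinction and the fact that share counts change over the year):

```python
# Figures from IBM's reported 2004 results, as quoted above.
earnings = 8.4e9    # full-year earnings, dollars
eps = 4.94          # earnings per share, dollars
eps_growth = 0.14   # EPS growth vs. 2003

# Earnings divided by EPS implies the rough number of shares outstanding.
implied_shares = earnings / eps

# Backing out 2003 EPS from the stated 14% growth.
eps_2003 = eps / (1 + eps_growth)

print(round(implied_shares / 1e9, 2))  # roughly 1.7 billion shares
print(round(eps_2003, 2))              # roughly $4.33
```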
Note that IBM's numbers lose a little luster when you account for currency fluctuations, but a great year and quarter nonetheless.
And in the spirit of full disclosure, it should surprise no one that I am an IBM stockholder :-)
While I'm still a Windows
user, I thought I'd pass along a feature I find useful that not many people seem to know about.
There are a number of programs on any user's desktop that they use more frequently than others. There are also a number of mechanisms for launching programs from the Windows desktop. This tip describes a way to launch frequently-used programs as quickly as possible, to save time and aggravation.
The quickest way I've found to launch programs in Microsoft Windows is by mapping shortcut keys to frequently-used programs. This is accomplished by assigning a particular alphanumeric key (0-9 and A-Z, case insensitive) to a particular program shortcut. You then launch the program by hitting Ctrl-Alt-[KEY].
I'll demonstrate this with Lotus Notes, a program I use frequently.
- From the desktop, click Start -> Programs -> Notes, and right-click Lotus Notes, and select "Properties" from the context menu
- Go to the "Shortcut" tab
- In the "Shortcut key" field, type any alphanumeric character (preferably that you haven't used for another program)
- Click "OK"
- Hit Ctrl-Alt-[character from step 3] to launch the program.
This may seem trivial, but over the course of a year, I bet that this saves me hours.
Here are my keyboard mappings, for your profound amusement:
- V: IBM's VPN Client
Note that if you move a shortcut in your menu, you'll have to recreate the mapping (I'm going to ping Raymond Chen
to find out why this is). Also, I'm not sure what will happen if you have the same key mapped to multiple programs.
About a week ago, I sent an email to a product manager for an important IBM software
product, encouraging him to blog. I pointed to Microsoft as an example of people who use blogs effectively to inform their user communities about upcoming products.
He pointed out that it is his group's explicit policy not
to talk about future products until a release is several months away from release to manufacturing.
Software projects are volatile things - there are many potential events that may force you to change scope or schedule of the project. A few of these events:
- an engineering problem
- budget reallocation away from your project
- a shift in the marketplace
- a competitor's announcement
So if you talk about your future product to the press or via a blog, it's easy to make implicit commitments - features or a schedule that may be in flux internally but, once published, become concrete in the customer's mind. And those commitments can lead to customer dissatisfaction and press angst when something about the project changes.
I have mixed feelings on this. On the one hand, I think any feedback from customers is valuable, and blogs open you up to the whole Internet community (as opposed to a focus group selected by marketing). On the other hand, I can understand the desire to avoid the reputation of someone who makes and breaks commitments on a regular basis. Not to mention that if you start talking about features a year ahead of RTM, that becomes very valuable (and incredibly cheap!) competitive intelligence for your rivals.
What made me think of this was this article
on current happenings in the next major release of Windows (internally called "Longhorn"), where I read the following quote:
What's not up in the air, however, is Longhorn's ship date. The company is now committed internally to shipping Longhorn in May 2006.
Note that this quote says "committed internally
" (italics mine).
Microsoft has really been burned in the past
for committing to schedule and/or features way too early (the opposite of what I describe above from the IBM product manager). Sometimes this tactic is used to FUD competitors (e.g. pre-empting a competitor's actual product release by announcing competitive vaporware). But other times it's just plain enthusiasum for a future product.
I'm really not sure what the right answer is to this question, or even if there is a right answer. My gut feeling tells me that, the more input and feedback from the customer community, the better. Maybe its just a matter of putting up appopriate disclaimers.
In the meantime, it will be interesting to see if Microsoft can hit its internal May 2006 commitment on Longhorn. My bet is probably not, unless there's a heck of a lot of buffer time in the next sixteen months to deal with unexpected stuff. But since it's not a published commitment, they should have the freedom to move that date at will.
By that time I may be a Mac
user anyway.
This blog isn't very techy, so feel free to read no further if you aren't interested in business stuff.
In 1997, Jim Collins and Jerry Porras published an influential book called "Built to Last
" which described characteristics of 18 "visionary" companies whose stock had dramatically outperformed the market over many decades. The key characteristic, they determined, was that each visionary company had a set of core values which management and employees bought into in a big way.
IBM was one of the 18 visionary companies. But at the time of the writing of the book (between 1995 and 1997), IBM was only starting to break out of the doldrums
of the late 80's early 90's which almost killed the company.
In an appendix of the book, Collins and Porras wrote almost an "open letter to IBM" talking about how it must rediscover its values if it wanted to regain its spot as one of the most respected companies in the world.
I don't know if our CEO, Sam Palmisano
, read this book or not, but it seems like he must have. Last year, IBM held a unique "Values Jam" where we used web message board technologies and text analytics to update IBM's values for the new century.
That's enough of a preamble. You can read "the rest of the story" from a really excellent article/interview from December's Harvard Business Review
, which is available for free viewing on ibm.com.
After my earlier tome
, I thought I'd post something short.
The Rational Edge folks have published their January issue
which is not new content but rather "the best of 2004". Lots of good stuff.
Happy weekend.
I've been following the evolving story
about how the F.B.I. may have to scrap a $170 million computer system overhaul because users are rejecting it.
Compared to something like biology or civil engineering, software engineering is a young field - roughly 50 years old - yet in that time we have discovered a number of best practices that tend to result in the right system, on time and within budget. So why are stories about spectacular software development project failures so common? George Santayana
said it best and most famously: "Those who cannot remember the past are condemned to repeat it."
Well-known lessons and best practices are often either not known or ignored. The result is that every year or two there is some spectacular software failure, and everyone in the media throws up their hands and says "Oh dear me, whatever is the matter with the software industry?" But when you examine what exactly went wrong on said disaster project, it's the same exact stuff you would have found on a disaster project from the 1960s!
I want to look at some specific problems mentioned in the NY Times article, but before doing so I want to preface my analysis with some humility. Even though the problems listed below are well-known and tactics to deal with the problems are well-known, I have no doubt that the people working on the F.B.I. project (both from the F.B.I. side and the service-provider side) were working to the best of their ability, and have probably worked many hours and made many sacrifices in an attempt to successfully deliver that system. To quote Grady
, "Software development has been, is, and will remain hard". It's much easier to play armchair quarterback, as I am about to do, than it is to successfully deliver software projects. I completely empathize with those involved, because I've been a part of failed projects, and it's not fun. Still, I think it's important to look at these well-known problems, and talk about tactics to deal with them.
Some quotes from the NY Times article:
One idea under strong consideration is for the bureau to use off-the-shelf software instead of the expensive customized features it has unsuccessfully sought to develop.
It is a natural tendency to want to create software from scratch. There are two main factors: 1) exactness of capabilities and 2) ability to make modifications. However, quality software is hard to create, and many well-known problems have been solved by component vendors to a high enough degree of quality to make the costs and risks of custom development hard to justify. In specialized domains, you will still need to write some
custom software. But for any part of your system that is not of unique value to your users, you should look to off-the-shelf components first. For instance, you do not want to spend many person months creating and debugging the world's 5,000th object-relational (O-R) persistence layer - you want to find an off-the-shelf persistence technology that works for you (like Hibernate), learn how to use it effectively, then drive on.
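The advice above can be made concrete with a small sketch. Hibernate is a Java-world example; purely for brevity this sketch uses Python and its built-in SQLite support, and the "cases" table is a hypothetical stand-in for whatever records the system tracks. The point is the same: storage, querying, transactions, and crash recovery come from a proven off-the-shelf component, leaving you to write only the code unique to your problem.

```python
import sqlite3

# Instead of writing the world's 5,001st custom persistence layer,
# reuse a proven off-the-shelf component: SQLite, via Python's stdlib.
conn = sqlite3.connect(":memory:")  # a real system would use a file or server DB
conn.execute("CREATE TABLE cases (id INTEGER PRIMARY KEY, title TEXT)")
conn.executemany(
    "INSERT INTO cases (title) VALUES (?)",
    [("Background check",), ("Field report",)],
)

# Indexing, SQL querying, and durability are all capabilities we did
# not have to design, write, or debug ourselves.
titles = [row[0] for row in conn.execute("SELECT title FROM cases ORDER BY id")]
print(titles)  # ['Background check', 'Field report']
```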
For more information on component-based development, Clemens Szyperski's excellent book: "Component Software: Beyond Object-Oriented Programming, 2nd Edition
" is strongly
recommended. Also one of RUP
's best practices is to use component architectures - so there is a lot of good information in RUP on this topic.
"I did not get what I envisioned" from the project, the senior official acknowledged. But he said the F.B.I. today had a better understanding of its computer needs and limitations as a result of the effort. "The lesson we have learned from this $170 million is invaluable," he said.
WHOA NELLY! Whenever you see something along the lines of "I didn't get what I envisioned" or even worse "I got what I asked for, but now that I see what that is, I realize I really wanted something different" you should immediately ask, "Did you do prototyping?" and "Were the actual system users involved in reviewing the prototypes?" It is very difficult for a system's future users to specify the system they truly need. As Suzanne and James Robertson point out in their excellent recent book, "Requirements-led Project Management
", this is generally because users don't think in abstract, essential terms, but rather their minds are "muddied" by prejudices about implicit technical constraints imposed by technology they understand - usually the current system. It is the analysts' job to find the essence of the problem at hand, rather than jumping to a solution without truly understanding the problem.
Once you have a fairly good idea of the true problem to be solved, the next step is to create some form of the actual system that can be seen and preferably touched by the actual users. This is simply because a visible working system is a much better point-of-truth than a bunch of words in a written specification. The key thing is that the first version of the system that the users touch should not be the final system
. This is for a very simple reason. The first attempt at the system will be wrong!
So instead of spending $170 million to determine "well, that's not exactly what we want", it is much better to create light-weight prototypes (using use cases, executable code, HTML, or pen and paper) to show the user what the envisioned solution to their problem will look and feel like. They will immediately spot deficiencies.
The difference here is that you've only spent a few thousand dollars on the "wrong" prototype rather than many millions of dollars on the wrong system. From the user's feedback you can quickly modify the prototype. It will still not be perfect, but it will be less wrong. And by the time you're getting to real code intended for production, the system will start to resemble the users' underlying intentions, rather than their original misconstrued interpretation of those intentions. If some ill-informed executive tells you that you don't have the time or money to do prototyping, write up a risk assessment using some historical data, and ask them to formally take ownership of the risk. It is likely that he or she will discover time and money to allocate to prototyping.
"As recently as last May, the F.B.I. was still claiming that V.C.F. would be completed by the end of 2004, and that it would at last give the F.B.I. the 'cutting-edge technology' it needs," the senator said.
Along similar lines to the previous item on prototyping, another well-known best practice is to incrementally deliver working versions of the system in relatively short, time-boxed durations, rather than trying to deliver the final system in one "big-bang delivery". The rationale for this is two-fold.
The first item is similar to prototyping - when users get a working version of the system early, they're going to immediately spot problems with it - I'm talking problems with how the system "feels" - not bugs that can be caught in test. Hopefully you got 90% of these problems out during prototyping, but the first delivery of the system (unless you did a very sophisticated prototype) will be a new level of detail and immersion for the user, and they will inevitably dislike things about how the system works. It is much simpler to make adjustments mid-course if you have only implemented 10% of the total number of planned features, than if you have delivered the entire system (especially if the change requires you to tweak major system plumbing).
The second, and probably more crucial benefit of incremental delivery, is that it forces you to integrate the total system before you have written massive amounts of code. Integration is very hard, and in the big-bang approach it is frequently put off until near the end of the entire project. This can lead to a situation where you're "80% done for 200% of the allotted project schedule". It is very hard to speculate if this was the cause for the quote above, but if I were a betting man, I would bet that when system development was being advertised as on target, true integration hadn't happened yet. The RUP best practice "Develop iteratively" describes techniques around this area very well.
Well, enough for this post.
In closing, immerse yourself in software literature, and try to avoid the mistakes of your predecessors. To quote Benjamin Franklin: "Experience is the best teacher, but a fool will learn from no other."
In a later post, I will list some books I consider essential reading on software development. If you are into the literature, you will not be surprised by the list.
PS - I mention RUP several times as a place that defines and describes these best practices. I realize that RUP does not have a monopoly on these ideas - it is simply the process with which I am most familiar.
As I mentioned previously, I've been tracking the debate on IBM Rational vs. Microsoft vis a vis UML vs. Domain Specific Languages (DSLs)
. One of the combatants from the Microsoft side is Jack Greenfield, who now works for Microsoft, but used to work at Rational where he was in charge of desktop development tools back when XDE was new.
I found a circa 2002 video interview
with him from theserverside.com, where he talks about "The Future of J2EE Design and Development". Just for fun, I wanted to see if he said anything really quotable in favor of J2EE, in light of his new home in Redmond.
But it wasn't to be.
I started the video, and saw the IE embedded Windows Media Player doing something or other, but when the stream started to play, there was audio but no video. I figured I might have a back-leveled version of Windows Media Player, so I went to microsoft.com to download the latest version of Media Player.
A curious thing happened during the install. After going through some initial setup ("yes, I agree Microsoft is not liable for any bad things this program does to my system or data" and "no, I don't want MSN to be my default music download purchase provider"), the install program popped up a funky dialog box stating "System Restore is currently disabled on your computer. It is strongly recommended that you cancel Media Player setup, enable System Restore, and re-start setup".
For those of you not familiar with System Restore, it was a new utility in Windows ME (carried forward to Windows XP) which allows you to "roll back" your system to a previous state. The idea is that if you install a program or new hardware device and your system totally gets hosed, you can go back to a working version of the system without reinstalling Windows and all of your programs. It has actually saved me several times, so I appreciate the tool.
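The snapshot-and-rollback idea behind System Restore can be sketched in a few lines. This is a toy illustration only, not how Windows implements it (Windows tracks system files and registry state, not a Python dict); it just shows the concept of capturing known-good state before a risky change:

```python
import copy

class RestorePoint:
    """Toy snapshot/rollback, illustrating the System Restore concept."""

    def __init__(self):
        self._snapshots = []

    def snapshot(self, state):
        # Deep-copy so later mutations can't alter the saved snapshot.
        self._snapshots.append(copy.deepcopy(state))

    def rollback(self):
        # Return the most recent known-good state.
        return self._snapshots.pop()

# A dict stands in for "system state" in this sketch.
system = {"installed": ["notepad"], "registry": {"player": "v9"}}
rp = RestorePoint()
rp.snapshot(system)                    # take a restore point before the install

system["registry"]["player"] = "v10"   # risky upgrade...
system["installed"].append("wmp10")    # ...hoses something, so we roll back

system = rp.rollback()
print(system["registry"]["player"])    # v9
```

The key design point is the deep copy: a shallow reference to the old state would be silently corrupted by the very changes you're trying to protect against.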
But back to the Media Player 10 install.
The install program asked me to enable System Restore so that it could take a "snapshot" of my current (mostly) stable system state. My first reaction was "Wow, that's really responsible and forward-thinking of Microsoft to remind me to turn on System Restore before upgrading one of their programs". But then I started to worry ... no other program, Microsoft or otherwise, has ever explicitly warned me that I should enable System Restore. So what the heck is this program going to do to my system anyway?
I went ahead and enabled System Restore and installed Media Player v10. One nice surprise was that it didn't require a reboot, which (I believe) is a first for Media Player upgrades. My system is still mostly stable, but I have a bad feeling I may be experiencing some hard-core Windows entropy in the near future (I really need to add that term to the WikiWikiWeb).
Oh, but to come back to the original point of the upgrade - I am able to view Jack Greenfield's lovely visage, in a little window that takes up about 1/4 of the area that it's supposed to. Something's still broken, but at least there's some video (see pic below).
Go Windows Media Player!
(Pic of my sort of working Windows Media Player. Jack Greenfield's face is actually in the black box - the lack thereof in the pic below is a print screen issue).
Sorry for the recent lack of posts. The trip back from China was really tiring. A 12 hour flight from Beijing to Chicago is quite enough, but a 12 hour flight from Beijing to Chicago with an energetic 11 month old boy is too much!
The 11 hour time change was also a shock to the body. Try it yourself - try to stay up all night, then go to sleep at 11 AM and wake up at 7 PM. It just doesn't work!
But enough complaining. My mind and body are now back on US Eastern Standard Time. Look for a new blog entry on technology and IBM later today!
A Joel Spolsky blog entry alerted me to a really cool new capability from Google, called "Google Suggest". Basically, as you type, it lists possible (probable?) completion terms. Check it out.
I looked at the HTML source and read the Google Suggest FAQ.
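The basic prefix-matching trick behind a feature like this is easy to sketch. Here's a toy version in Python; the real Google Suggest also ranks completions by popularity, which this ignores:

```python
import bisect

def suggest(terms, prefix, limit=5):
    """Return up to `limit` terms that start with `prefix`.

    Toy prefix completion: sort the vocabulary, binary-search to the
    first candidate, then walk forward while the prefix still matches.
    """
    terms = sorted(terms)
    start = bisect.bisect_left(terms, prefix)  # first term >= prefix
    out = []
    for term in terms[start:]:
        if not term.startswith(prefix):
            break                              # past the matching range
        out.append(term)
        if len(out) == limit:
            break
    return out

print(suggest(["java", "javascript", "j2ee", "jazz", "uml"], "jav"))
# ['java', 'javascript']
```

In the browser, the page's JavaScript fires a request like this on every keystroke and renders the returned list under the search box; the completion logic itself is just a sorted-prefix scan.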
Google is really doing some cool stuff. Is this why all of these software industry studs like Adam Bosworth and Josh Bloch are leaving their companies and joining Google? That's probably part of it. Stock options and a skyrocketing stock price probably don't hurt either :-)
Grady Booch posted tonight about a very serious personal matter.
It's amazing how easy it is to start viewing work as all-important and all-consuming; then you hear about truly important things like the southern Asia disaster, and now Grady's news, and it's like a slap in the face from real life.
Don Box, an architect on Microsoft's next-gen communications subsystem, has posted an interesting list of 10 predictions for 2005.
The one that really caught my eye was:
Sun Microsystems will embrace Eclipse
I don't think this will happen for a long time, if ever, but it's a nice thought!
I tried to use Java Studio Creator (built on NetBeans, Sun's open-source competitor to Eclipse) and couldn't really get into it, but to be fair I struggled mightily with WebSphere Studio v4 until I got the hang of the Eclipse look and feel.
You'll see on the right-nav that I've updated the "recommended book" link for January.
This month I'd like to recommend that you check out the recently published book "Joel on Software", by Joel Spolsky.
Joel runs a small software company and used to work for Microsoft. This book is a compendium of essays from his blog, also called "Joel on Software", from the previous couple of years.
I enjoyed this book because of Joel's deep understanding of both the software engineering disciplines and the software industry. I also really enjoyed Joel's conversational tone and sense of humor. The topics are diverse and range from the very technical (e.g. the essay on Unicode) to the very business-oriented (e.g. "How Microsoft Lost the API War"). "The Law of Leaky Abstractions" was also very interesting, and I've heard people in IBM quote it.
Anyhow, if you work in the software industry and want to understand it at a deeper level, you should definitely check out this book.
Here is the table of contents, from Joel's site.
Hope everyone had a fun New Year. This year was more subdued than most due to the tragic situation in south Asia. FYI, if you have the financial means and the personal inclination, it is very simple to donate a few dollars to the relief effort via the Amazon.com American Red Cross Disaster Relief web page.
I've noticed a common theme in technology-related TV commercials: some middle-aged manager is shocked by the implications of some new technology, while some young hipster takes it for granted and smugly explains its capabilities to the old-timer.
I suppose it's a sign of the Internet's ubiquity that I, and I'm sure many other people, now take it for granted. Well, I think everyone, techno-hipsters included, should step back for a minute and marvel at how much the Internet has changed your everyday life. I realize that there are billions of people who are not yet Internet-enabled, but it's a safe bet that people reading these words are online :-)
I find it mind-blowing that I can sit in an apartment in China and instantly send a long letter to my family in the United States for free (yes, there are monthly ISP costs, but they are negligible).
Even this blog is an example of the changes brought to businesses by the Internet. The ability for a company to connect with customers and other stakeholders is now a competitive necessity. Thus, my fellow IBM bloggers and I have an unfiltered channel to the wired world, via an IBM-branded web page. Can you imagine the rigidly command-and-control IBM of 1985 allowing this? I can't.
For those of you interested in history, here's a brief history of the Internet. A few quick facts from this history:
- The seminal concept for what would become the Internet sprang from the mind of an MIT professor in 1962
- The Internet was not originally created to provide robust communications in the face of a nuclear attack (although this explanation is a popular urban legend)
I wonder ... 500 years from now, how will the invention of the Internet in the 1960s be compared to Gutenberg's invention of the printing press in the 1430s? My guess is that the Internet will be considered more important. Of course I can't back that up, and I won't be around to be called right or wrong.
But it's fun to speculate.
One week until I leave China and head back to the US. Today I got sort of stranded at my brother-in-law's house. It's too cold to go outside, and I somehow forgot all of my books and my journal at my parents-in-law's house. No one else is around, so I find myself with the following options:
- Try to teach myself Chinese by reading my brother-in-law's Chinese version of "Effective Java"
- Watch Chinese soap operas on TV
- Cruise the net!
Well, if you're reading this you've figured out which of the above options I chose. Because reading news articles and technical articles gets tiring after a few hours, I decided to look for something a little more meaty. I found it in The Art of Unix Programming by Eric Raymond, which is available to read online.
The book is really interesting and has many insights that I wasn't familiar with, which is unusual, since most books on computers tend to hit 80% of the same well-trodden topics and stories.
The book is pretty hostile to Microsoft (in a very one-sided manner) which may be a turn-off to some readers, but I guess I just look at it as the artistic license of a true believer.
One part that I've found particularly interesting is the write-up on the Unix maxim that "silence is golden"; i.e. if your program doesn't have anything interesting to say, then don't say anything. This was one of the "features" of Unix that really caused me problems when I first started programming at Penn State University. I never knew what the hell was going on, because the command prompt would say nothing whether I simply moved a file or accidentally overwrote a programming assignment due the next day (which is funny ... in hindsight). Reading Raymond's arguments, I think that the "silence is golden" rule holds up somewhat better in a command-line world than in a GUI world. In a GUI world there are many subtle mechanisms to provide feedback without being obtrusive about it.
On a related personal note, my technically-proficient wife sometimes calls me "Mr. Unix" because I occasionally forget to provide re-assuring "uh huh"s to statements she makes that I don't disagree with.
PS - I would personally pay $100 to watch Raymond and Donald Norman debate the merits of software providing constant feedback to the user. But perhaps "celebrity deathmatch" would provide a more appropriate forum.
On Christmas Eve I was in Shanghai with my wife, and we met two of her old high school classmates for dinner. It turned out that one of them works as what I'd consider a business / systems analyst for a Chinese firm that does custom software development for the telecommunications industry in China.
During dinner we started talking a little about software development. It turned out that his group gets quite a few jobs subcontracted to them from IBM China, so he's quite familiar with WebSphere Application Server and Rational Application Developer.
Another thing he mentioned was that he found requirements gathering difficult. He mentioned that it's challenging because there are always different stakeholders across the customer's company which leads to problems such as:
- competing and conflicting requirements
- stakeholders suggesting technical solutions rather than business requirements
- different language to express the same ideas
- political turf wars
Sound familiar? It's really interesting to me that although we come from two very different cultures, exactly the same problems pop up during requirements gathering.
I wrote a glowingly positive review of Ted Neward's book, "Effective Enterprise Java", for The Rational Edge online magazine. If you haven't read this book yet, please read the review and see if it's something that might interest you. I really think that anyone working on enterprise Java systems (J2EE or otherwise) would benefit from reading this book.
I also recommend following Ted's weblog as a source of insightful and (thank goodness) irreverent discussion on enterprise computing.
I'll probably do several book reviews for The Edge in 2005. I read a lot of books, and I hope that by reviewing them, I can help folks decide how to spend their money and, more importantly, their precious time.
Yes, I am writing this email from China. It's a bit challenging because the keyboard layout is different and I can't use right-click context menus, because the text is in Chinese. There is probably an option in Windows to display in English, but I don't want to muck around with my brother-in-law's configuration.