Actually, this kind of resonance is not uncommon with complex systems. Hit a series of waves in just the right way and you'll break the back of even the most carefully engineered ship; encounter a particular combination of winds and you'll topple the tallest building. You can design for and test against most such conditions, but the state space of any complex system is such that you can't afford to design and test against all possible conditions. So, common practice is to over-engineer the system, make certain that you operate it within its intended design envelope, and, when things you could not design or test for happen, patch the system so that they won't happen again. Hopefully, no one dies in the process.
Now, I'm not a financial expert, but my take is that last week's stock market jitters were a manifestation of this kind of complex system resonance. The new SEC rules that have been put into place are thus a (reasonable) systemic patch, for they attempt to break the dynamic coupling among computer trading on different exchanges, a coupling that could not have been anticipated nor tested for. I say "could not have been anticipated" because you can't a priori predict where and when these kinds of resonances will take place; "nor tested" because the highest fidelity simulation of the market is the market itself, and any lesser simulation would likely never exhibit such behavior at all. At worst, a systemic resonance such as this will tear a system apart. Having a human in the loop or having explicit governors vastly changes the dynamics because, from a control systems perspective, you have a very different frequency dynamic going on, one that will either dampen or neutralize the resonance.
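That control-systems intuition can be made concrete with a toy model. The sketch below (my own illustration, not a model of any market) integrates a driven harmonic oscillator forced exactly at its natural frequency. With no damping, each push arrives in phase and the amplitude grows without bound; adding a damping term, the stand-in for a governor or a human in the loop, bleeds energy off and the response settles to a modest steady state. The function name and parameters are hypothetical, chosen only for this example.

```python
import math

def peak_response(damping, omega=1.0, dt=0.001, steps=20000):
    """Integrate x'' + 2*damping*omega*x' + omega^2 * x = sin(omega*t)
    with semi-implicit Euler, forcing at the natural frequency omega.
    Returns the largest |x| seen, a crude measure of resonance."""
    x, v, peak = 0.0, 0.0, 0.0
    for i in range(steps):
        t = i * dt
        a = math.sin(omega * t) - 2 * damping * omega * v - omega ** 2 * x
        v += a * dt          # update velocity first (semi-implicit Euler)
        x += v * dt          # then position, using the new velocity
        peak = max(peak, abs(x))
    return peak

# No governor: in-phase forcing pumps the amplitude up steadily.
print("undamped peak:", peak_response(damping=0.0))
# A governor in the loop: the same forcing now yields a bounded response.
print("damped peak:  ", peak_response(damping=0.5))
```

The point of the toy is only qualitative: the driving signal is identical in both runs; what differs is the system's own dynamics, which is exactly why breaking the coupling (or inserting a damping element) is a systemic fix rather than a fix to any one participant.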
What's the lesson here? Software-intensive systems are not immune to such systemic problems. Indeed, as we build more systems of systems, it's increasingly likely that we will see more of these kinds of strange, intermittent behaviors. The problem is that each individual system may be operating perfectly; it's the combination of such systems under particular operating conditions that produces resonance. When that resonance happens, there will be much finger pointing, but that won't solve anything, because it is a systemic problem. Furthermore, we really lack the analytic tools to test for, much less reason about, such problems in software-intensive systems. So, the lessons from systems engineering apply to software-intensive systems as well: over-engineer them, operate them within particular design envelopes, and patch the system when it resonates. I should add that having loosely coupled systems helps a great deal too, because you give the system room to move, so to speak.
Read more at Grady’s Handbook blog