One of the genres of systems I'm studying for the Handbook of Software Architecture is that of supercomputing. There are several classes of problems for which deep computing is essential: modeling the weather, protein folding, materials analysis, and so on. I've mentioned the Top 500 list before, but there are two recent additions to the list worth mentioning.
Now listed as the world's third fastest computer is the relatively inexpensive System X built at Virginia Tech, consisting of a supercluster of over 1,000 Apple Macintosh G5 machines. (It's also worth noting that the G5 chip is produced by IBM.) Number 73 on the list is IBM's Blue Gene prototype, the operative word here being prototype. The first production machine, being built for the Lawrence Livermore National Laboratory, is expected to hit 360 teraflops, almost an order of magnitude faster than the current leader, Japan's Earth Simulator.
Being a software geek, I'm naturally curious as to the software architecture of such systems; I have knowledge of command and control systems, enterprise systems, robotics, games, and embedded systems, but admittedly have a blind spot in software for deep computing. What I have learned thus far is that the ability to build large-scale software for this domain is becoming a growing bottleneck to exploiting the power of these amazing machines. FORTRAN and C still appear to be the dominant languages here, with Java starting to gain some traction.