[Remember that even though I work for IBM I am an individual with my own thoughts and ideas. Anything I write here may not necessarily represent the views of the IBM Corporation or its partners... though I'm hoping that's only a matter of time before they catch up.]
It is hard to imagine a world without C. It is such a fundamental part of computing today, the foundation of many things that you use. Every deeply technical developer I've dealt with has some C chops, and many prefer it. Linux, most of the GNU tools, and many other software components that you use are written in C. It was a groundbreaking departure from the sort of low-level programming that computing demanded in the early 70s, and it changed everything.
The TIOBE Programming Community Index tracks the popularity of programming languages based on search-engine query data. Java has fallen recently and C has risen, bringing them almost neck and neck. Not bad for something that many would consider "old school" programming!
What makes C so relevant? Part of it is its legacy. Many of the foundations of computing, such as operating systems, are developed in C, so it's vital if you want to work at a low level. But the thinking behind C also makes room for elegant, portable, fast, maintainable code. That's pretty good stuff! (Developers can squander all those benefits and write clunky, non-portable code; but you wouldn't do that, would you?)
Ironically, many of the things that developers choose over C are themselves built on C, including Java, Python, Perl, and others. Their virtual machines, compilers, and interpreters are generally written in C.