Indeterministic systems - the scourge of modern IT

Let me indulge in a brief history of IT over the past couple of decades.

Take your minds back twenty years, to when object-oriented programming was starting to emerge. Lots of new words: abstraction, objects, methods, polymorphism, overloading, inheritance. Many of them were names invented for mechanisms we could already build in languages such as C, just with a bit more work. Function pointers in C, for example, stored in data structures, give you run-time method binding. Let’s face it, in the early days of C++, source code was translated to C before being compiled with a standard C compiler.

RAD (rapid application development) was all the rage fifteen years ago. It turned out to be nothing more than a marketing TLA.

TCO and ROI have been the IT management buzz TLAs for the past ten years. But someone please tell me: once the management consultant has convinced management that this great new system will deliver a projected ROI within eighteen months and a 20% reduction in TCO, where exactly is that management consultant eighteen months later? That’s right, he’s buggered off to peddle the latest management consultant mantra somewhere else. And does anyone go back and check whether the ROI and TCO were on target? You bet they don’t.

Change. That’s the last five years’ MBA word. Apparently change is good. Good for management consultants, that’s for sure. Even branding consultants have jumped on the bandwagon: if you don’t have a company mission statement or motto with “Change” in it, you’re just not trendy. And for IT, that means Change Control, a way to procedurise common sense. Now Change Control is a necessity, but all too often it’s the whipping boy to blame when things go wrong, and the next thing you know yet another procedure has been instituted because something else went wrong. Procedure is all very well, but it dumbs people down. No-one takes responsibility any more because it’s the procedure’s fault. Or because there isn’t a procedure. Look, we all make mistakes, and we learn from mistakes. That’s what gives us “intelligence”. So don’t patronise us all by spoon-feeding more and more procedures. Some of us have grey matter, you know.

Back to IT. First it was Java, then it was C#. Extending the object-oriented mantra of layering technologies and obfuscating what’s going on under the hood, we now have runtimes that seem totally at odds with two things I grew up with: manage your limited resources with care, and keep response times consistently quick.

Two features of the Java and C# technologies destroy deterministic behaviour in response times and resource usage: garbage collection and JIT compilation. No longer can you check your basic system counters for memory usage and know whether or not you have a resource leak of some kind. Oh, I forgot, the programming runtime environment deals with all that…
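To illustrate the memory-counter point, here’s a minimal Java sketch of my own (the class name, sizes and loop counts are arbitrary): nothing in it is leaked, yet the heap figure reported by the runtime climbs and then drops as garbage piles up and gets collected, so the raw number on its own tells you very little about whether you have a leak.

```java
// Minimal sketch: heap usage in a GC'd runtime rises and falls even though
// nothing is retained, so a raw memory counter can't tell you if you leak.
public class HeapNoise {
    public static void main(String[] args) throws InterruptedException {
        Runtime rt = Runtime.getRuntime();
        long sink = 0; // keep a result live so the allocations aren't optimised away
        for (int i = 0; i < 20; i++) {
            for (int j = 0; j < 10; j++) {
                byte[] junk = new byte[1024 * 1024]; // ~1 MB of short-lived garbage
                sink += junk.length;
            }
            long usedMb = (rt.totalMemory() - rt.freeMemory()) / (1024 * 1024);
            System.out.println("apparent heap in use: " + usedMb + " MB");
            Thread.sleep(100);
        }
        System.out.println(sink);
    }
}
```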

So, although garbage collection (a technology that goes back to the Lisp interpreters of the 1960s) saves lazy programmers, to some degree, from memory management, it adds an extra level of indirection to addressing, makes apparent application memory usage go crazy, and, when a collection occurs seemingly at random, slows your system down.
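To make the random slow-down concrete, another small sketch of my own (again, the sizes and the 2 ms threshold are arbitrary choices): every iteration does identical work, yet some take far longer because a collection happens to run at that moment. What you actually see depends on the JVM, heap size and collector in use.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch: identical iterations, very different latencies whenever
// the collector decides to run.
public class GcPauses {
    public static void main(String[] args) {
        List<byte[]> retained = new ArrayList<>();
        long worst = 0;
        for (int i = 0; i < 500; i++) {
            long start = System.nanoTime();
            retained.add(new byte[256 * 1024]);   // keep some data alive
            if (retained.size() > 200) {
                retained.subList(0, 100).clear(); // drop the older half
            }
            long elapsedMicros = (System.nanoTime() - start) / 1_000;
            worst = Math.max(worst, elapsedMicros);
            if (elapsedMicros > 2_000) {          // anything over 2 ms stands out
                System.out.println("iteration " + i + " took " + elapsedMicros + " µs");
            }
        }
        System.out.println("worst iteration: " + worst + " µs");
    }
}
```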

JIT compilation is another area I just don’t get. Sure, CPU is relatively cheap, but if you happen to run some non-trivial application for the first time since rebooting, you’ll have to suffer a recompile, and that can take some time. I don’t see why the code can’t be JITted when it’s installed. And as almost all C# code is destined for WINTEL platforms, why is JIT needed at all? Oh, hang on, there might be some weird SSE instruction that’s needed. OK, but once in a blue moon, I’d suggest. So make JITting happen at install time, not at runtime.
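For the warm-up cost, one more sketch of my own, in Java since both runtimes JIT in much the same way (the method name and iteration counts are arbitrary): the first few calls are visibly slower than the rest because the method is still being interpreted or compiled, and the exact numbers depend entirely on the JVM and its flags.

```java
// Minimal sketch (illustrative, not a benchmark): the same method timed from
// a cold start; early calls pay the interpretation/compilation cost.
public class JitWarmup {
    static long work() {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += (i * 31L) ^ (i >> 3);
        }
        return sum;
    }

    public static void main(String[] args) {
        long sink = 0;
        for (int call = 0; call < 10; call++) {
            long start = System.nanoTime();
            sink += work();
            long micros = (System.nanoTime() - start) / 1_000;
            System.out.println("call " + call + ": " + micros + " µs");
        }
        System.out.println(sink); // keep the result live
    }
}
```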

Indeterminism in resource usage and response times is difficult to fix in a production environment, and it leads to extensive end-user angst. The application programmers are so high up the food chain that they have no clue how an app will perform in the real world. And what are the upsides of using Java and C#, exactly? Do we see improved development times? Reduced costs? You bet we don’t.
