What is certainly true is that the road to complexity hiding provided the developer with abstract ways to compose ever larger "blocks" of functionality in a short time and in easy, intuitive ways. First modules and layered organization; then object orientation; and more recently visual languages, services, components, aspects, and models have given the developer tools to compose and orchestrate highly powerful and sophisticated software systems in a relatively short amount of time.
But... regrettably, there is a but. First of all, though hidden, such complexity is still part of the overall system being developed. And secondly, as it has become so easy to deal with complexity, more and more functionality is being put in place. In other words, software (and, in general, computer systems) has become a sort of black hole of complexity: it attracts more and more complexity that simply disappears from our sight, although it makes the system ever more "heavy" — hard to predict and control.
Across the system layers, a complex and at times obscure "web" of software machines is being executed concurrently by our computers. Their mutual dependencies determine the quality of the match between our software and its deployment platform(s) and run-time environment(s) and, consequently, their performance, cost, and, in general, their quality of service and experience. At our behest or otherwise, a huge variety of design assumptions is continuously matched against the truth of the current conditions. A hardware component assumed to be available; an expected feature in an OSGi bundle or in a web browser platform; a memory management policy supported by a mobile platform; or ranges of operational conditions taken for granted at all times — all are but assumptions, and all have a dynamically varying truth value. Depending on this value, our systems will or will not experience failures. Our societies, our very lives, are often entrusted to machines driven by software; weird as it may sound, in some cases this is done without question — as an act of faith, as it were. This is clearly unacceptable. The more we rely on computer systems — the more we depend on their correct functioning for our welfare, health, and economy — the more important it becomes to design those systems with architectural and structuring techniques that allow software complexity to be decomposed, but without hiding in the process the hypotheses and assumptions pertaining, e.g., to the target execution environment and the expected fault and system models.
How to deal with this problem is still a matter of discussion. My idea is that it should be made possible to express, manage, and execute "probes" on the dynamic truth values of our design assumptions. While the "black hole" would remain largely hidden, those probes would shed light on the likelihood that our hypotheses are actually met by the current conditions of the system and its deployment environment. A possible way to organize those probes might be a distributed organization of cooperating autonomic digital entities, each of them representing a different unit of information encapsulation — a layer, an object, a service, etc. — mimicking the structure and organization of the corresponding entities. A fractal social organization of said probes could provide autonomic ways to deal with ever-growing amounts of complexity without reaching the event horizon of unmanageability.
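To make the idea a little more concrete, here is a minimal sketch, in Python, of what such assumption probes might look like. All names here (`Probe`, `ProbeRegistry`, the two example assumptions) are hypothetical illustrations, not an implementation of any existing system: each probe pairs a stated design assumption with a predicate that re-evaluates its truth value against the current conditions.

```python
import os
import sys
import tempfile
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Probe:
    """One design assumption together with a check of its current truth value."""
    assumption: str              # human-readable statement of the assumption
    check: Callable[[], bool]    # predicate evaluated against current conditions


class ProbeRegistry:
    """Collects probes and reports which assumptions currently hold."""

    def __init__(self) -> None:
        self._probes: List[Probe] = []

    def register(self, assumption: str, check: Callable[[], bool]) -> None:
        self._probes.append(Probe(assumption, check))

    def evaluate(self) -> Dict[str, bool]:
        # Map each assumption to its dynamically varying truth value.
        return {p.assumption: bool(p.check()) for p in self._probes}


# Two illustrative probes, mirroring the kind of platform assumptions
# discussed above (chosen here only so the sketch is runnable):
registry = ProbeRegistry()
registry.register("running on a 64-bit platform",
                  lambda: sys.maxsize > 2**32)
registry.register("temp directory is writable",
                  lambda: os.access(tempfile.gettempdir(), os.W_OK))

report = registry.evaluate()
for assumption, holds in report.items():
    print(f"{assumption}: {'holds' if holds else 'VIOLATED'}")
```

In a fuller realization, such registries would be distributed and hierarchical — one per layer, object, or service — so that the organization of the probes mirrors the structure of the entities they watch over.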
The Black Holes of complexity by Vincenzo De Florio is licensed under a Creative Commons Attribution-NonCommercial-NoDerivs 3.0 Unported License.
Permissions beyond the scope of this license may be available at http://win.uantwerpen.be/~vincenz/.