One of the ever-present challenges in talking about complexity, even in a technical setting, is overloaded terminology. Discussions become bogged down as we talk past one another, using the same words but meaning different things.[1]
Systems are often characterized colloquially as being “bottom-up” or “top-down” in their structure and dynamics. Roughly, bottom-up implies that there is no pre-stated “plan” or “design” to which the macroscopic state of the system conforms. It is closely related to the notion of “self-organization”, in which the organization of a system is not imposed on it by some external agent; rather, the system determines its own organization via local interactions among its components.
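To make “self-organization” concrete, here is a minimal sketch in Python, using a toy majority-vote rule chosen purely for illustration (it is not from any particular model discussed here): each cell repeatedly adopts the majority state of its local neighborhood, and large-scale domains emerge from a disordered start with no global plan anywhere in the code.

```python
import random

def step(cells):
    """One synchronous update: each cell adopts the majority state of
    its three-cell neighborhood (itself plus both neighbors, wrapping
    at the edges). Every cell uses purely local information."""
    n = len(cells)
    return [
        1 if cells[(i - 1) % n] + cells[i] + cells[(i + 1) % n] >= 2 else 0
        for i in range(n)
    ]

random.seed(0)
cells = [random.randint(0, 1) for _ in range(60)]  # disordered initial state
for _ in range(8):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```

No cell ever consults a global blueprint; whatever large-scale structure appears is determined entirely by neighbor-to-neighbor interactions, which is the sense of “self-organization” meant above.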
But two kinds of “top-down” are constantly conflated, and I want to distinguish them here, because in practice they are worlds apart.
The two kinds of processes that both get referred to as “top-down” might be called:
Blueprint processes
Coarse-to-fine processes
The punchline: Blueprint processes produce rigid and oversimplified systems that are unable to adapt to internal or external forces, whereas coarse-to-fine processes preserve at least the possibility of producing functional complex systems that are adaptive and sensitive to context at various scales of relevance, while fulfilling some desired purposes.
Blueprint Processes
This is the kind of “top-down” process that inevitably fails in the face of complexity. It relies on a central controller and/or representation (e.g. a ‘blueprint’) to which the world, in all of its many details, is meant to conform.
The problems arise from two related issues: (1) any controlling agent or abstract representation has finite bandwidth (it can only cover so many details), and (2) this limits the fine-grained activity and sensitivity of the system, which is exactly what is needed to satisfy small-scale dependencies and to adapt complex functional dynamics in response to internal and external stressors.
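A back-of-the-envelope sketch of the bandwidth point (the budget B and the component counts below are invented for illustration, not taken from the text): if a central representation holds B bits and the system has n components, the blueprint can dedicate at most B/n bits of detail to each one, and that figure collapses as the system grows.

```python
# Illustrative arithmetic only; all numbers are made up for the sketch.
# A central representation holding B bits, spread across n components,
# can dedicate at most B / n bits of detail to each component.
B = 1_000_000  # hypothetical total bandwidth of the blueprint, in bits

for n in [1_000, 100_000, 10_000_000]:
    bits_each = B / n
    print(f"n={n:>10,} components -> {bits_each:>8,.2f} bits of detail each")
```

Past a certain scale the representation cannot spare even a single bit per component, let alone capture each component’s context-dependent role; the missing detail has to be supplied by the components themselves.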
This type of process leads to artifacts that:
have unresolved internal tensions and conflicts
are contextually insensitive
fail due to hidden/unknown but necessary functions being compromised
lack timely adaptiveness
In essence, they over-constrain the system’s complexity to the bandwidth of the controlling agent. As it turns out, there is a reason things in nature are so complex. It’s because they have to be in order to survive.
We can observe this kind of process in, for instance:
requirements-driven engineering
modern architecture and construction
central economic and social planning
micro-managing bosses
James C. Scott’s book Seeing Like a State argues in great detail, through historical examples, why these schemes fail in the context of societal and ecological systems. He opens the book describing what was called Scientific Forestry. In short, in an effort to make forestry more efficient and orderly, such that the forest could be managed centrally by the state via abstract representations, the state’s foresters ended up killing the forest. It turned out that all the little details they were ignorant of were crucial to the well-being of the forest. The bugs, and the fungus, and the birds, and the shrubs, and every little thing served a role in bringing renewal to the forest and in keeping it alive. Not to mention all of the ways these “extra bits” were essential to the people who depended on these forests.
And, importantly, it took a century to figure that out. Long enough to tout the success of the practice, achieve massive buy-in, and spread it far and wide, effectively destroying huge swaths of ecosystems.
Early success can blind us to self-inflicted devastation that takes time to manifest. It is essentially by definition that unforeseen costs reveal themselves more slowly than known benefits. This is why we do this to ourselves over and over.
This is the “top-down” that is antithetical to complexity. Everything must be “directed”, down to each and every detail. The system must conform to the blueprint, each little thing in its “proper” place. This is the impulse of the micro-manager, the dictator, and, too often, the all-too-helpful do-gooder. The road to hell, and all of that.
This approach loves a blank slate, a smooth surface, over which it can readily impose its will. If there is not one available, it will create one, destroying whatever texture and pattern existed before, however compelling. Smoothing over the inherent roughness of the world. Simplifying it to death.
But there is another kind of “top-down”, one that, when wielded appropriately, is in alignment with generating complexity. In Part II we will discuss coarse-to-fine processes.
[1] Even the word complexity itself is overloaded: sometimes it implies systems with a lot of interdependence, and hence with a good deal of constraint and informational redundancy; other times, as in Kolmogorov complexity, it refers to instances in which features are independent, and hence don’t have informational redundancies. More on that in another article.
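Kolmogorov complexity itself is uncomputable, but compressed length is a standard practical proxy for it, so the footnote’s contrast can be shown in a small Python sketch (standard library only; the strings are toy examples): a string of independent random characters barely compresses, while a heavily redundant one shrinks dramatically.

```python
import random
import string
import zlib

random.seed(0)
n = 10_000

# Features independent of one another: high complexity in the
# Kolmogorov sense, so the string barely compresses.
random_s = "".join(random.choice(string.ascii_letters) for _ in range(n))

# Heavily interdependent/redundant features: low complexity in the
# Kolmogorov sense, so the string compresses dramatically.
redundant_s = "ab" * (n // 2)

for label, s in [("random", random_s), ("redundant", redundant_s)]:
    print(f"{label:9s} original={len(s):,} "
          f"compressed={len(zlib.compress(s.encode())):,}")
```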