Seeking Purpose
There’s a phrase that’s now become ubiquitous on various parts of the internet, and it goes like this:
The purpose of a system is what it does.
At first glance it’s a meaningless non sequitur, a sort of self-referencing zen koan seeking to make something simple seem profound. Obviously systems work as they’re designed, right? It’s tempting to discount it out of hand and move on. And yet, there’s a lot of linguistic trickery going on here that I want to unpack, and it carries far deeper implications than mere language.
Working as Intended
One easy way to read this is by conveying intent:
The purpose of a system is what it does, by design.
Given the context where this phrase has been appearing, I believe this is the primary interpretation. The concept is constantly being applied to society and the world at large at a conspiratorial level.
Feminists might say, “The world is this way because of Patriarchy!”
Anti-Semites might say, “The world is this way because of Jews!”
Maybe it’s authoritarian global elites who want us to live in pods and eat bugs. Perhaps it’s fat-cat Capitalists yearning to buy their tenth super-yacht private jet carrier, cartoonishly twirling their mustache while colluding with their peers to corner the market on badger futures to some mysteriously lucrative end. Or maybe it’s just rich misanthropists who despise humanity and will do anything to extinguish all the useless eaters. It doesn’t matter.
It’s an ultimately midwit take, replete with unearned moral or intellectual superiority. It’s also incredibly pessimistic, and does nothing but demoralize those who just want to live their lives. It’s a call to action to dismantle purportedly detrimental societal influences, absent falsifiable proof of the underlying claims.
It’s a cause for thoughtless zealotry.
Or Not
Rather than dwell on various pending catastrophes threatening to upset the social order, how about this?
Despite intentions, the purpose of a system is what it does.
That’s better, right? Like Hanlon’s Razor, it’s not malice or design that has brought us here, but incompetence, or maybe something mundane like unforeseen scope creep. Perhaps there’s some underlying design or implementation flaw which has led to an unexpected or adverse outcome.
Is that really better, though? Sure it’s less intrinsically evil, but the end result is the same. The problem here is that Complex Systems are inherently unpredictable, and we inhabit a multitude of competing complex systems that maintain a precarious global balance.
For example, during the Great Leap Forward in China, Mao launched the Four Pests Campaign to reduce disease and increase crop yields. One target of this campaign was sparrows, which had been keeping locust populations in check. With the sparrows gone, locusts devoured the harvests, and the resulting famine starved 20-30 million people to death. That definitely wasn’t the intended effect, but it happened anyway.
Or what about a more recent occurrence in the United States? In an effort to end homelessness, California has spent $24 billion since 2019 on programs, yet the homeless population in California has increased by 20% over that time. A naive observer might be forgiven if they assumed that was the intended effect. After all, if you spend money on something, it’s because you want more of it.
One defense is that these programs act as a magnet to draw homeless people to California. Couldn’t the same be said for illegal immigration and the United States at large? The more money the US spends on unchecked social programs and sanctuary statutes, the more migrants it attracts? That may not be the intended effect, but again, that’s the end result.
The problem with complex systems is that they’re impossible to effectively model. There’s always an unexpected variable that derails everything, despite our best efforts and most gifted practitioners. Any attempt to control systems like this in a top-down manner usually just leads to worse outcomes, sometimes exactly opposed to the desired result.
It Defies Belief
Why? Because there’s also an aspect of willful ignorance. Consider:
Regardless of what you might believe, the purpose of a system is what it does.
I’ve also seen the sentiment stated more colloquially like this:
They don’t think it be like it is, but it do.
There’s a joke that most acts passed by the US Congress should be interpreted as the opposite of their name. A popular example is the PATRIOT Act, which sought to grant sweeping surveillance and enforcement powers to various government agencies. Exactly what a patriot would do, I’m sure. Or the Affordable Care Act, which increased medical premium costs by 69% in the first two years alone. I’m sure there’s a pending Happy Puppies Act coming soon which lovingly consigns legions of puppies directly into nearby wood chippers.
The point is that people rarely peek behind the curtain, read between the lines, or investigate anything beyond the title. Further complicating matters is that this can inspire blind allegiance to the surface cause. The Happy Puppies Act could outline and justify all manner of puppy obliteration under the auspices of helping disadvantaged puppies overall, and adherents would solemnly continue running the Puppy Shred 9000 to that end. PETA, the infamous advocates for ethical treatment of animals, kills far more animals than it adopts out: 97.5% of the dogs and 99.6% of the cats in its care.
It’s frighteningly easy to point to the ostensibly stated goal of something and believe that will be the end result. That is, after all, what it says on the tin. What if it isn’t? What then?
But wait! There’s more!
Identical Cousins
I believe that there are two missing words which would fully disambiguate the phrase and spill all of its esoteric viscera:
The purpose of a system is indistinguishable from what it does.
The purpose of a system—what it’s supposed, intended, or designed to accomplish—is completely decoupled from the end result. Indeed, any particular system may never have been designed at all and could simply represent an organic, emergent framework, one that is thus fulfilling its own purpose. Cause and effect no longer matter, because it’s the association we’re interested in.
I also think this is precisely why those (or similar) words are never included.
The phrase purportedly originated with British theorist Stafford Beer, who followed up with this:
There is after all, no point in claiming that the purpose of a system is to do what it constantly fails to do.
Why did I wait to bring this up so far into the article? Did I bury the lede? Beer made a name for himself by designing systems in computers, government, and theoretical modeling. He clearly wasn’t blind to the downstream effect of his work and understood the impact of simplicity. Even if he didn’t invent the concept, he made it popular by coining the term; it even has a short acronym: POSIWID. All the rest can be derived from there, as I did.
The simple and slightly jarring phrase demands further inspection, and in fact, is a victim of itself. Beer likely didn’t intend for this simplified concept to attain notoriety and be used for justification of hate or dismantling society, but that’s what happened.
In the Wild
As a Postgres database engine advocate and engineer at heart, I’ve seen this in action quite frequently even in this context. The most common example is when a new user installs Postgres and then almost invariably finds themselves unable to connect to the service. The installer does what it’s built to do: install Postgres and maybe PgAdmin, and create a database if one didn’t already exist. But there’s usually no follow-up or subsequent instruction. It’s assumed the user will read the manual, figure out how to connect, and inherently know how authentication works, on top of navigating any number of cumulative systemic design flaws.
The mistake here is not in the software, but in the process. From the perspective of new users, the purpose of Postgres is to be unnecessarily confounding. This scenario is possible because Postgres has no official GUI, so PgAdmin became the popular substitute. But it’s not produced by the same team, isn’t held to the same standards, and is the frequent target of valid criticism as a result of various misalignments. Many Postgres installation packages (rightly) don’t set a superuser password, disclose which username is the superuser, or even which database names exist in the new cluster. There are fully justified purposes for all of these things, but what is the entire cohesive result? Less adoption due to initial confusion.
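To make the confusion concrete, here’s a sketch of what that first connection typically looks like on a Debian-style package install (an assumption; paths, defaults, and authentication settings vary by platform and packager):

```shell
# Fresh Debian/Ubuntu-style install: the superuser role is "postgres",
# it has no password, and pg_hba.conf defaults to "peer" auth for local
# connections. So the obvious first attempt fails:
#
#   psql -U postgres
#   => FATAL: Peer authentication failed for user "postgres"
#
# Peer auth matches the OS user to the database role, so this works instead:
sudo -u postgres psql

# From there, a password can be set so TCP clients like PgAdmin can connect,
# and \l lists which databases actually exist in the new cluster:
#
#   ALTER USER postgres PASSWORD 'replace-me';
#   \l
```

None of this is discoverable from the installer itself, which is exactly the point: each individual default is defensible, but the combined system greets new users with a wall.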
Whatever is happening, intended or not, is the perceived purpose of the system.
Consider the declining birth rate of nearly every country on Earth. There have been many proposed reasons for this. Maybe it’s overuse of pesticides or some other environmental toxin. Perhaps it’s related to frequent pessimistic headlines instilling too much fear of the future to bring children into it. It could be that education, housing, medical, food, childcare, and other costs have risen too high to feasibly raise a larger family. Or is it related to birth control? Or perhaps the breakdown of marriage? Should we be telling women that children are an untold burden and they’d be much happier chasing a career? If people are waiting longer than ever to get married, is the resulting reduction in fertile years significant? It could even be a shadowy cabal dedicated to preventing humanity from overwhelming the globe. Maybe it’s all of these things simultaneously, or none of them in favor of some other unexamined possibility.
It doesn’t matter. The purpose of this system is apparently to drive down the human population because that’s what it’s currently doing. China’s One Child policy wasn’t supposed to lead to the discarding or death of millions of female babies, but that’s what happened. The only way to correct something is to identify the underlying problem. Postgres should probably solve its new user conundrum, and we as a species really need to figure out why nobody is having babies before it’s too late.
Post Scriptum
In the end, “The purpose of a system is what it does” resonates as a warning, not a prescription. Why is a particular system behaving pathologically? Is it enough to simply recognize that it is? Not really, as what looks like a fundamental flaw from the outside may be a critical operating factor.
Mao thought sparrows were such a loathsome pest that they should be exterminated. It’s one thing to understand his mistake, such as failing to consult farmers or to closely observe the ecosystem before making that determination. It’s quite another to try to influence the combined economic, sociological, and cultural forces of the entire world in any meaningful manner.
But the system is always there, doing what it does. Should we intervene, or is our constant interference inflicting this endless litany of current woes? I often hear phrases such as, “If only people would just…” and then I immediately stop listening. People are not robots. They will never “just” do anything. There is no prescribed society on Earth that will survive human variability. Trying to force people to “just do something” is the surest way to guarantee it never succeeds.
No prescribed society, that is. People need to be inspired, optimistic, have a meaningful future they can look forward to, and in most cases, enjoy the company of close family and friends. We’ve torn down our close-knit social culture and replaced it with something cold and artificial, rife with automated exploitation of our instinctual drives. Then we try to fix the inevitable consequences by applying corrective models driven by corrupt or insufficient input variables. We become the awful system, with a purpose to produce more of itself. Because that’s what it’s doing.
I don’t know how to successfully escape that self-perpetuating cycle, and I’m afraid that in an effort to do something, we’ll end up undoing everything. Desperate Hail Mary passes have an incredibly high risk-to-reward ratio, and a correspondingly low success rate. Whatever we do, we must stop fooling ourselves: the system is working precisely as it should, we’re just atrociously wrong about what we thought it was doing.
Until Tomorrow,