Orrin Judd cites an article about the value of habit and custom in human affairs. This is something that Hayek noted as well. The acquired structure of society is a repository of a vast amount of information. It contains the results of lessons learned at great expense over thousands of years.
But I’m not going to write about that directly. Instead, I’ll ask: if custom and habit represent stores of information, the results of lessons learned, then what, exactly, learned those lessons? Not any particular person. Few of our social institutions were designed in any meaningful way. Instead they were selected by trial and error as better than the alternatives. Moreover, there was never a single person in whom all this information resided; instead it was distributed across all members of the society in varying amounts.
One could think of this set of social structures as something in itself by shifting the frame of reference from individual humans to groups of humans with a common culture. Such entities compete against each other, both directly (conflict between groups) and indirectly (“subverting” members of another culture to one’s own).
There are many direct, observable effects of these information structures. Does that make them real? It’s an interesting question to what extent such things can be said to exist. I ascribe them at least some level of reality, since there are observable effects1.
The summing-up point is that these social structures are a good example of “emergent” structures: ones that can’t be expressed directly in terms of their constituent elements, yet arise when there is sufficient complexity in those elements. Put a bunch of humans together and you’ll get some sort of social structure. It’s similar to how a capitalist economy emerges from individual acts of trade. It’s also the basis for some theories of artificial intelligence, which hold that intelligence can emerge from a collection of complex yet unintelligent components. I’m not going to argue that my discussion above proves that. I offer it only as an illustration of what AI researchers mean by “emergent structures”.
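A toy illustration of emergence, of my own choosing rather than anything from the AI literature, is Conway’s Game of Life. Each cell obeys one trivial local rule (live with 2–3 live neighbors, be born with exactly 3), yet at the level of the grid a “glider” appears: a coherent pattern that moves diagonally, something no individual rule says anything about.

```python
from collections import Counter

def step(live):
    """Advance one generation. `live` is a set of (x, y) live cells."""
    # Count how many live neighbors each cell (live or dead) has.
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The standard glider pattern.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):          # one full glider period
    state = step(state)

# After 4 generations the same shape reappears, shifted by (1, 1):
# a persistent, moving structure that nobody designed in.
shifted = {(x + 1, y + 1) for (x, y) in glider}
print(state == shifted)     # True
```

The analogy to the essay’s point is loose but real: the glider exists only at the level of the whole grid, just as a custom or institution exists only at the level of the whole society.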
1 This is one of the key insights, and pitfalls, of post-modernism. It recognizes these structures but then confuses them with physical reality, losing track of the fact that all of these layers exist simultaneously, just as chemistry is a higher-level reality on top of particle physics, and biology in turn is a higher-level reality on top of chemistry. The higher layers don’t invalidate or make irrelevant the lower layers, just as the existence of higher layers of social reality doesn’t change the fact that such layers depend on the lower ones. It’s very similar to the relationship between software and hardware in a computer. The physical world and the humans that inhabit it are the hardware on which our social “software” runs. And while software is very flexible, the hardware platform still shapes it and limits what programs are possible. The nature of physical and human reality limits the range of workable social structures in the same way. This is the fact that the post-modernists have forgotten.