What is the entropy of a system organized for a purpose 11-7-13

2015-01-13

++ What is the entropy of a system organized for a purpose 11-7-13

How should one determine whether a system is highly organized/ordered/complex/sophisticated?

Researchers and mathematicians often use the notion of entropy to measure order. However, entropy has its limitations and does not always match one's intuition. Suppose we are evaluating the entropy of strings that describe some system. A simple, uninteresting string of all 0s like "00000000000000000000" has a very low entropy, and some might consider it "ordered". However, most observers would consider a system adapted to accomplish some purpose (such as a living cell) to be highly ordered, and no such system could ever be described by such a simple string. On the other hand, take a string generated by a random number generator, like "11101011011111100000". This string has a very high entropy, but it is completely random, and a completely random system also does not match our intuition about what a system adapted to accomplish a purpose should look like. Therefore, neither the minimum-entropy nor the maximum-entropy state of a system matches a description of life. Another possible string is "01101101101101101101". This string clearly has some pattern or meaning (though still a very simple one), and its entropy would fall somewhere in between the minimum and maximum. Is there some measure that would be maximal for organized, interesting patterns?
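
Here is a minimal Python sketch of this comparison, assuming we measure entropy from single-character frequencies. (Note that single-character counts barely separate the patterned string from the random one; block entropies over pairs or triples of symbols would separate them more sharply.)

    import math
    from collections import Counter

    def shannon_entropy(s):
        """Shannon entropy in bits per symbol, from single-character frequencies."""
        counts = Counter(s)
        n = len(s)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    print(shannon_entropy("00000000000000000000"))  # 0.0   -- all zeros: minimal entropy
    print(shannon_entropy("11101011011111100000"))  # ~0.97 -- random-looking: near the 1-bit maximum
    print(shannon_entropy("01101101101101101101"))  # ~0.93 -- patterned: in between, though close to the random case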

Are there other measures of complexity? One is Kolmogorov complexity, defined as the length in bits of the smallest program needed to regenerate the original string. For "00000000000000000000" the program could be something like "print 20 zeros", so the Kolmogorov complexity is very low. For the random sequence "11101011011111100000" the Kolmogorov complexity is very high, since no program describing it is shorter than simply printing the message itself: "11101011011111100000". For the patterned string ("01101101101101101101") the Kolmogorov complexity is again very low, since the string can be described by a short program such as "repeat 011". Therefore, systems can have a very low Kolmogorov complexity but a high entropy. For example, many fractals and cellular automata can be described by a short algorithm yet produce a complicated pattern with high entropy. These examples show that Kolmogorov complexity can be low for a simple, uninteresting example ("00000000000000000000"), low for a slightly less uninteresting example ("01101101101101101101"), and low for complex patterns such as some fractals.
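
Kolmogorov complexity itself is uncomputable in general, but the size of a compressed string gives a rough upper bound. A small sketch using zlib (the specific strings and lengths here are just illustrative):

    import random
    import zlib

    def compressed_size(s):
        """Length in bytes after zlib compression: a crude, computable
        stand-in for Kolmogorov complexity."""
        return len(zlib.compress(s.encode("ascii"), 9))

    random.seed(0)
    random_bits = "".join(random.choice("01") for _ in range(9999))

    print(compressed_size("0" * 9999))    # very small: "all 0" compresses to almost nothing
    print(compressed_size("011" * 3333))  # very small: a short repeating pattern
    print(compressed_size(random_bits))   # much larger: no pattern for the compressor to exploit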

Is there a measure that would indicate whether a system is organized? That is, a measure that would be low for simple/uninteresting systems, low for random systems, and high for organized systems? Let us refer to this hypothetical measure as "sophistication". Here is a table summarizing these values.

Measure                  simple/uninteresting   organized pattern                        random
                         (flat rock)            (life, designed systems,                 (static)
                                                some cellular automata, some fractals)
entropy                  low                    high                                     high
Kolmogorov complexity    low                    low                                      high
sophistication           low                    high                                     low

How could one define a simple metric for sophistication? We should avoid basing the measure on Kolmogorov complexity, since Kolmogorov complexity is often not easy to quantify for a given system or string. For example, one could be given the digits of pi; if one did not know the source was pi, one might not recognize any clear pattern in the digits, even though pi can be defined by a simple program (the circumference of a circle divided by its diameter). Therefore, perhaps there is a way to base the sophistication measure on entropy, which can be easily quantified for any given string. Perhaps we can use the entropy of common systems adapted for a purpose to define the point of maximum sophistication. How could we determine such a value?
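
The pi example can be made concrete: the decimal digits of pi are produced by a short program, yet their digit frequencies look maximally random. A sketch, assuming the third-party mpmath library for generating the digits:

    import math
    from collections import Counter
    from mpmath import mp  # third-party; pip install mpmath

    def shannon_entropy(symbols):
        counts = Counter(symbols)
        n = len(symbols)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    # Roughly the first 1000 decimal digits of pi, generated by a short program
    # even though the digits themselves show no obvious pattern.
    mp.dps = 1001
    pi_digits = mp.nstr(mp.pi, 1001)[2:]  # drop the leading "3."

    print(shannon_entropy(pi_digits))  # close to log2(10) ~ 3.32 bits, near the maximum for decimal digits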

One mathematical structure that has been found in many systems adapted for a purpose is the small-world scale-free network. Social networks, internet cables, transportation systems, gene networks, and protein networks are all examples. Small-world means that one can travel along the network's links from any point to any other point in very few steps; not all networks are arranged this way, since some require traversing many links before reaching another point. Scale-free means that the distribution of connections per node follows a power law, so the network looks self-similar at many different scales: the pattern of connections among 100 nodes resembles the pattern among the larger hubs when one zooms out to 1,000 nodes. Such a network is efficient in the sense that it connects all of the points with a small number of links while still allowing travel from one point to another in a small number of steps. One can calculate the entropy of any network from the frequencies of the nodes' connection counts. With our previous examples of 0s and 1s, we used the counts of 0s and 1s to determine the entropy; with a network, we count the number of nodes with 0 connections, 1 connection, 2 connections, and so on.
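
A rough sketch of this degree-distribution entropy, assuming the networkx library and using its standard Barabasi-Albert generator as a stand-in for a scale-free network (the node counts and other parameters are illustrative choices, not values from the post):

    import math
    from collections import Counter
    import networkx as nx  # third-party; pip install networkx

    def degree_entropy(G):
        """Shannon entropy (bits) of the degree distribution: count how many
        nodes have 0, 1, 2, ... connections, then take the entropy of those
        frequencies."""
        degrees = [d for _, d in G.degree()]
        counts = Counter(degrees)
        n = len(degrees)
        return -sum((c / n) * math.log2(c / n) for c in counts.values())

    lattice = nx.watts_strogatz_graph(1000, 4, 0.0, seed=1)  # ring lattice: every node has 4 links
    scale_free = nx.barabasi_albert_graph(1000, 2, seed=1)   # preferential attachment: scale-free
    random_graph = nx.gnp_random_graph(1000, 0.004, seed=1)  # Erdos-Renyi random graph

    for name, G in [("lattice", lattice), ("scale-free", scale_free), ("random", random_graph)]:
        print(name, round(degree_entropy(G), 2))

The lattice, where every node has the same number of connections, has a degree entropy of zero, while the scale-free and random graphs have nonzero degree entropies whose exact values depend on the chosen parameters.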

The entropy of an optimal small-world scale-free network may approximate the entropy of systems adapted for a purpose (such as life and the systems that life designs) in general; such systems would likely fall within a certain range of entropy values. Therefore, a measure of sophistication could be defined so that it is maximal when a system's entropy equals the entropy of a small-world scale-free network. Perhaps one could assess how optimal/organized/healthy/interesting a system is by determining its sophistication, with a high value being optimal.
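
One way to turn this idea into a formula, purely as a sketch, is to score a system by how close its entropy is to a reference entropy. The Gaussian shape, the width parameter, and the reference value below are illustrative choices, not something the post specifies:

    import math

    def sophistication(observed_entropy, reference_entropy, width=0.5):
        """Hypothetical score between 0 and 1: maximal when the observed
        entropy matches the reference entropy of an 'optimal' small-world
        scale-free network, falling toward 0 as the system becomes too
        simple or too random."""
        return math.exp(-((observed_entropy - reference_entropy) / width) ** 2)

    H_REF = 2.3  # placeholder: e.g. the degree entropy of a reference scale-free network

    print(sophistication(0.0, H_REF))    # near 0: too simple (a lattice, the all-zeros string)
    print(sophistication(H_REF, H_REF))  # 1.0: matches the reference entropy
    print(sophistication(4.0, H_REF))    # near 0: too disordered (a random graph or string)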


