Thermodynamic Entropy

The second law of thermodynamics states that, in a closed system, you cannot finish any real physical process with as much useful energy as you had to start with; some is always wasted. Entropy is the measure of that wasted energy. In other words, things run down: your cup of tea goes cold, but the room warms ever so slightly. Your car burns petrol to drive the engine, but the heat produced has to be channelled away by the radiator. All terrestrial energy eventually ends up as heat.

This running down is what is called entropy, in this case thermodynamic entropy.
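
To put a rough number on the cooling cup of tea (a back-of-the-envelope illustration, not part of the original argument): when a small amount of heat Q flows from the tea at absolute temperature T_tea into the cooler room at T_room, the tea's entropy falls by Q/T_tea while the room's rises by Q/T_room, so the total change is

```latex
\Delta S_{\text{total}} \approx \frac{Q}{T_{\text{room}}} - \frac{Q}{T_{\text{tea}}} > 0
\quad \text{since } T_{\text{room}} < T_{\text{tea}} .
```

The tea itself becomes slightly more ordered as it cools, but the room more than makes up for it, so the entropy of the whole closed system still rises.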



Logical Entropy

We are more interested in entropy as it is used in Shannon's information theory, since entropy is also used to mean disorganisation or disorder (Shannon, 1948). J. Willard Gibbs, the nineteenth-century American theoretical physicist, called it "mixedupness".

This sort of entropy is clearly different. Physical units do not pertain to it, and (except in the case of digital information) an arbitrary convention must be imposed before it can be quantified. To distinguish this kind of entropy from thermodynamic entropy, let's call it logical entropy.
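
As a small, hedged sketch of what that quantification looks like once a convention is chosen (here Shannon's bits, with made-up example strings), logical entropy can be computed from symbol frequencies:

```python
from collections import Counter
from math import log2

def shannon_entropy(message: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A perfectly uniform message carries no surprise; a more 'mixed up' one carries more.
print(shannon_entropy("aaaaaaaa"))       # 0.0 bits per symbol
print(shannon_entropy("abababab"))       # 1.0 bit per symbol
print(shannon_entropy("the tea cools"))  # roughly 3 bits per symbol
```

The numbers themselves depend on the chosen convention (bits per symbol), which is exactly the point made above.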

In spite of the important distinction between the two meanings of entropy, the rule stated above for thermodynamic entropy seems to apply to the logical kind as well: information in a closed system becomes more disorganised over time. There is really nothing mysterious about this law either; it is similar to saying ‘stuff never organises itself’. There are instances in chaotic systems where organisation appears spontaneously, but generally there needs to be an outside influence for order (negentropy) to occur. So what? Well, Claude Shannon stated in his seminal 1948 paper that “an effective communication decreases entropy”.
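
One standard way to cash out that claim in Shannon's own quantities (a textbook identity rather than a quotation from the paper) is that conditioning on a received signal never increases entropy:

```latex
H(X \mid Y) \;\le\; H(X),
\qquad
I(X;Y) = H(X) - H(X \mid Y) \;\ge\; 0 ,
```

where H(X) is the uncertainty about the source before the message and H(X | Y) the uncertainty left after receiving it; the gap I(X;Y) is the information the communication actually conveyed.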


In open systems (and all living systems are open systems), entropy can be decreased and order increased by importing information from the environment. This is the first key feature of our model.


By importing information we mean, in fact, sensing relevant changes in the ‘higher system’ or environment. When we identify an open system, it is implicit in this model, and in systems theory as a whole, that the system must have some means of sensing its environment and taking in part of that environment in order to counteract the natural consequences of entropy.


We are familiar with how this works with the physical body. When we eat food we temporarily stop the body from running down by effectively importing energy from it. This, however, is only a parallel to what happens at the other levels of awareness, since maintaining the physical body is largely a thermodynamic process, whereas at the other levels we will be talking about information processes. In this model it is contended that systems such as the emotions and the mind import information from higher-order systems, and that this decreases the logical entropy in the lower systems.


Pretty obscure, yes? Importing information from higher-order systems: what is that all about? Imagine a situation where you feel obliged to buy a new pair of shoes because your friends are buying them. You need to make a decision but are torn between fitting in with your peers and saving some money. At a physical level you might realise that you cannot afford these shoes; this probably isn't going to stop you if you feel at an emotional level that you must fit in. But maybe at an intellectual level you simply realise that you don't particularly like these shoes. ‘It isn't you’. Importing this information reduces entropy and solves the dilemma.

Clausius coined the term entropy in 1865 to designate the ratio of available energy to the total amount of energy in a system over a finite period of time. The total amount of energy in a closed system is constant, but the amount that is available decreases.
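
For reference, Clausius's quantitative definition in modern textbook notation (not spelled out in the paragraph above) is

```latex
\mathrm{d}S = \frac{\delta Q_{\text{rev}}}{T} ,
```

where δQ_rev is a small amount of heat exchanged reversibly at absolute temperature T; totted up over any real process in a closed system, S can only stay constant or grow.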