Stability Of Naturally Selected Genes

It is by avoiding the rapid decay into the inert state of ’equilibrium’ that an organism appears so enigmatic; so much so, that from the earliest times of human thought some special non-physical or supernatural force (vis viva, entelechy) was claimed to be operative in the organism, and in some quarters is still claimed.
How does the living organism avoid decay? The obvious answer is: By eating, drinking, breathing and (in the case of plants) assimilating.
The technical term is metabolism. The Greek word (μεταβάλλειν) means change or exchange. Exchange of what? Originally the underlying idea is, no doubt, exchange of material. (E.g. the German for metabolism is Stoffwechsel.) But that the exchange of material should be the essential thing is absurd.
Any atom of nitrogen, oxygen, sulphur, etc., is as good as any other of its kind; what could be gained by exchanging them? For a while in the past our curiosity was silenced by being told that we feed upon energy.
In some very advanced country (I don’t remember whether it was Germany or the U.S.A. or both) you could find menu cards in restaurants indicating, in addition to the price, the energy content of every dish. Needless to say, taken literally, this is just as absurd. For an adult organism the energy content is as stationary as the material content. Since, surely, any calorie is worth as much as any other calorie, one cannot see how a mere exchange could help.
What then is that precious something contained in our food which keeps us from death? That is easily answered. Every process, event, happening - call it what you will; in a word, everything that is going on in Nature means an increase of the entropy of the part of the world where it is going on.
Thus a living organism continually increases its entropy - or, as you may say, produces positive entropy - and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy - which is something very positive as we shall immediately see. What an organism feeds upon is negative entropy.
Or, to put it less paradoxically, the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive.
What Is Entropy?
Entropy is not a hazy concept or idea, but a measurable physical quantity just like:
- the temperature at any point of a body
- specific heat of any substance.
At the absolute zero point of temperature (roughly -273°C) the entropy of any substance is zero. When you bring the substance into any other state by slow, reversible little steps (even if thereby the substance changes its physical or chemical nature or splits up into two or more parts of different physical or chemical nature) the entropy increases by an amount which is computed by dividing every little portion of heat you had to supply in that procedure by the absolute temperature at which it was supplied and by summing up all these small contributions.
To give an example, when you melt a solid, its entropy increases by the amount of the heat of fusion divided by the temperature at the melting-point.
You see from this that the unit in which entropy is measured is cal./°C (just as the calorie is the unit of heat or the centimetre the unit of length).
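The rule just stated (divide each little portion of heat by the absolute temperature at which it was supplied, and sum the contributions) can be tried on the melting example. The sketch below uses standard textbook figures for ice, which are assumptions not given in the text:

```python
# Entropy gained on melting one gram of ice at its melting point, applying
# the rule above: divide each portion of heat by the absolute temperature
# at which it was supplied. Standard textbook figures, assumed here.

heat_of_fusion = 79.7   # cal per gram of ice (latent heat of fusion)
melting_point = 273.15  # absolute temperature of melting ice, in K

# All the heat is supplied at one fixed temperature, so the sum of small
# contributions dQ/T collapses to a single quotient Q/T.
delta_entropy = heat_of_fusion / melting_point

print(f"entropy increase: {delta_entropy:.3f} cal/°C per gram")
# roughly 0.292 cal/°C per gram
```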
The Statistical Meaning Of Entropy
I have mentioned this technical definition simply in order to remove entropy from the atmosphere of hazy mystery that frequently veils it. Much more important for us here is the bearing on the statistical concept of order and disorder, a connection that was revealed by the investigations of Boltzmann and Gibbs in statistical physics.
This too is an exact quantitative connection, and is expressed by
entropy = k log D,
where k is the so-called Boltzmann constant (= 3.2983 × 10⁻²⁴ cal./°C), and D a quantitative measure of the atomistic disorder of the body in question.
To give an exact explanation of this quantity D in brief non-technical terms is well-nigh impossible. The disorder it indicates is partly that of heat motion, partly that which consists in different kinds of atoms or molecules being mixed at random, instead of being neatly separated, e.g. the sugar and water molecules in the example quoted above.
Boltzmann’s equation is well illustrated by that example. The gradual ‘spreading out’ of the sugar over all the water available increases the disorder D, and hence (since the logarithm of D increases with D) the entropy. It is also pretty clear that any supply of heat increases the turmoil of heat motion, that is to say, increases D and thus increases the entropy; it is particularly clear that this should be so when you melt a crystal, since you thereby destroy the neat and permanent arrangement of the atoms or molecules and turn the crystal lattice into a continually changing random distribution.
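Boltzmann's equation can also be put to work numerically on that example. In the sketch below, D is taken as a count of arrangements, and the sugar's spreading into twice the volume is treated as doubling the arrangements available to each molecule; the value of k is the one quoted above, while the mole count is an assumption for illustration:

```python
import math

# Boltzmann's relation, entropy = k log D, with k in cal/°C as quoted above.
# D is taken here as the number of ways of arranging the system, an
# illustrative stand-in for the "measure of atomistic disorder".
k = 3.2983e-24  # Boltzmann constant, cal/°C

# When the sugar spreads from half of the water into all of it, each
# molecule has twice the volume available, so for N molecules the number
# of arrangements D grows by a factor of 2**N, and the entropy by
# k * log(2**N) = N * k * log(2).
N = 6.022e23  # about one mole of sugar molecules (illustrative)
delta_S = N * k * math.log(2)

print(f"entropy gain from spreading: {delta_S:.3f} cal/°C per mole")
# about 1.38 cal/°C per mole
```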
An isolated system or a system in a uniform environment (which for the present consideration we do best to include as a part of the system we contemplate) increases its entropy and more or less rapidly approaches the inert state of maximum entropy. We now recognize this fundamental law of physics to be just the natural tendency of things to approach the chaotic state (the same tendency that the books of a library or the piles of papers and manuscripts on a writing desk display) unless we obviate it. (The analogue of irregular heat motion, in this case, is our handling those objects now and again without troubling to put them back in their proper places.)
Organization Maintained By Extracting ‘Order’ From The Environment
How would we express in terms of the statistical theory the marvellous faculty of a living organism, by which it delays the decay into thermodynamical equilibrium (death)? We said before: ‘It feeds upon negative entropy’, attracting, as it were, a stream of negative entropy upon itself, to compensate the entropy increase it produces by living and thus to maintain itself on a stationary and fairly low entropy level.
If D is a measure of disorder, its reciprocal, 1/D, can be regarded as a direct measure of order. Since the logarithm of 1/D is just minus the logarithm of D, we can write Boltzmann's equation thus:
-(entropy) = k log (1/D).
Hence the awkward expression ’negative entropy’ can be replaced by a better one: entropy, taken with the negative sign, is itself a measure of order.
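That the two forms are the same number is easy to verify; the values of k and D below are arbitrary illustrative choices:

```python
import math

# Check that "entropy taken with the negative sign" equals k log(1/D):
# -(k log D) and k log(1/D) are the same number. k and D below are
# arbitrary illustrative values, not drawn from any measurement.
k = 3.2983e-24  # Boltzmann constant, cal/°C
D = 1.0e6       # some measure of disorder (illustrative)

neg_entropy_sign = -(k * math.log(D))   # entropy with its sign reversed
order_form = k * math.log(1.0 / D)      # Boltzmann's formula applied to 1/D

assert math.isclose(neg_entropy_sign, order_form)
print("the two forms agree")
```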
Thus the device by which an organism maintains itself stationary at a fairly high level of orderliness (= fairly low level of entropy) really consists in continually sucking orderliness from its environment. This conclusion is less paradoxical than it appears at first sight.
Rather could it be blamed for triviality.
In the case of higher animals we know the kind of orderliness they feed upon well enough, viz. the extremely well-ordered state of matter in more or less complicated organic compounds, which serve them as foodstuffs.
After utilizing it they return it in a very much degraded form - not entirely degraded, however, for plants can still make use of it. (These, of course, have their most powerful supply of ’negative entropy’ in the sunlight.)
NOTE TO CHAPTER 6
My remarks on negative entropy have met with doubt and opposition from my physicist colleagues.
Let me say first that negative entropy is not the same thing as free energy.
The relation of free energy to Boltzmann's order-disorder principle is less easy to trace than that of entropy and of 'entropy taken with a negative sign', which, by the way, is not my invention.
It happens to be precisely the thing on which Boltzmann's original argument turned.
But F. Simon has pointed out that my simple thermodynamical considerations cannot account for our having to feed on matter ‘in the extremely well ordered state of more or less complicated organic compounds’ rather than on charcoal or diamond pulp.
He is right.
A piece of un-burnt coal or diamond, together with the amount of oxygen needed for its combustion, is also in an extremely well ordered state, as the physicist understands it.
If you allow the reaction, the burning of the coal, to take place, a great amount of heat is produced.
By giving it off to the surroundings, the system disposes of the very considerable entropy increase entailed by the reaction, and reaches a state in which it has, in point of fact, roughly the same entropy as before.
Yet we could not feed on the carbon dioxide that results from the reaction.
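The entropy bookkeeping of the coal example can be sketched in a few lines; the heat of combustion and room temperature below are rough textbook figures assumed for illustration, not values from the text:

```python
# Entropy bookkeeping for burning a mole of carbon, sketching the argument
# above: the heat Q given off to the surroundings at temperature T carries
# entropy Q/T away with it. Rough textbook figures, assumed for illustration.

heat_released = 94_000.0  # cal per mole of carbon burnt (approximate)
room_temp = 298.0         # absolute temperature of the surroundings, in K

# Entropy handed off to the surroundings along with the heat:
entropy_exported = heat_released / room_temp

print(f"entropy exported: {entropy_exported:.0f} cal/°C per mole")
# about 315 cal/°C per mole
```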
And so Simon is right in pointing out to me that actually the energy content of our food does matter.
So my mockery of the menu cards that indicate it was out of place. Energy is needed to replace not only the mechanical energy of our bodily exertions, but also the heat we continually give off to the environment.
That we give off heat is not accidental, but essential.
This is precisely the manner in which we dispose of the surplus entropy we continually produce in our physical life process.
This suggests that the higher temperature of the warm-blooded animal enables it to get rid of its entropy faster, so that it can afford a more intense life process.
On the other hand, many warm-blooders are protected against the rapid loss of heat by coats of fur or feathers.
So the parallelism between body temperature and ‘intensity of life’ may have to be accounted for more directly by van’t Hoff’s law, mentioned on p. 65.
The higher temperature itself speeds up the chemical reactions involved in living.
(That it actually does, has been confirmed experimentally in species which take the temperature of the surroundings.)
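Van't Hoff's rule is often stated as a rough doubling of reaction rate for every 10°C rise; the sketch below takes that factor of two as an illustrative assumption:

```python
# Van't Hoff's rule of thumb: reaction rates roughly double for every
# 10 °C rise in temperature. The factor of 2 per 10 °C (the "Q10") is an
# illustrative assumption, not a value from the text.

def rate_multiplier(delta_t, q10=2.0):
    """Factor by which reaction rates grow over a temperature rise
    of delta_t degrees, for a given Q10 coefficient."""
    return q10 ** (delta_t / 10.0)

# A warm-blooded animal at 37 °C versus surroundings at 17 °C:
print(f"speed-up at body temperature: {rate_multiplier(37 - 17):.1f}x")
# → 4.0x
```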