Dear Richard Budd, Matthew, Roger and all:
RICHARD B:
<<<
As I have always understood it, the "chaos" that sits below the inorganic
level is a sort of pure or "unpatterned DQ"--- the sea of DQ in which the
island of SQ resides and from which it arises. But I have recently been
wondering if this doesn't create a problem.
For if that chaotic "level" is equal to DQ itself, then there is a clash
in the moral codes. For one code establishes "the supremacy of DQ over SQ"
and another establishes "the supremacy of the inorganic over the chaotic".
>>>
MATTHEW
<<<
[Richard] makes the assumption that what preceded the inorganic level was
chaos, and that seems to be fairly intuitive. But I would contend that this
is actually flawed intuition. I think it would be more accurate to say that
what preceded the inorganic level was not chaos, but order, absolute order.
Indeed, this seems to be supported by science. The entropy before the big
bang was actually zero. There was no chaos at all. All matter and energy
was contained within an infinitesimal space.
>>>
Let me be a bit mischievous here and comment that "in the beginning" we may
have had a state of perfect order, but infinite potential for disorder!!!
Some of the contradiction lies in the underlying assumption of time as some
absolute primary element of existence, rather than as some illusion
empirically derived from experience. However, according to Ilya Prigogine
and other eminent physicists, the progress from order to disorder may be the
only real definition of time we have. Thus, in answering Richard's EXCELLENT
questions, we may end up arguing ourselves round in circles.
Roger has continued the discussion on randomness and I think I see progress.
For now, I will continue the dialogue, but I think that we should think
about distilling it all down to a few easily understood principles.
ROGER:
>
> ..Physicist Paul Davies defines nonrandom as a
> number or pattern that can be generated or defined in fewer bits than the
> number or pattern itself. Random is that which can't. He also says almost
> all numbers are random, but most cannot be proven random.
>
> Do you agree with this definition?
It is a pragmatic definition. It means that the data is non-compressible,
with nothing to be gained by representing it as a function. However, that is
not necessarily my definition of random. If the sequence is regarded as
random, that implies that any other sequence would serve equally well. On
the other hand, if that sequence is regarded as information, then Davies'
non-compressibility criterion means that there is no "redundancy" and the
string cannot be changed without losing the semantic content.
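For what it's worth, Davies' compressibility criterion is easy to play with
using a general-purpose compressor. Below is a minimal Python sketch (my own
illustration, not Davies'); zlib only approximates the "fewer bits" test,
but it makes the point:

  import os, zlib

  patterned = b"AB" * 5000        # a highly patterned 10,000-byte string
  random_ish = os.urandom(10000)  # 10,000 bytes from the OS entropy source

  for name, data in [("patterned", patterned), ("random", random_ish)]:
      compressed = zlib.compress(data, level=9)
      print(name, len(data), "->", len(compressed), "bytes")

  # The patterned string shrinks to a few dozen bytes (it can be "defined
  # in fewer bits than itself"), while the random bytes barely compress.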
> ROGER
> My question is on your statement that "systems tend towards maximum
> entropy, i.e. towards the state which offers the greatest number of
> degrees of FREEDOM." Is an unpatterned state a higher degree of freedom?
> I agree it is a more likely arrangement, but a freer arrangement? Could
> you explain more?
>
"Degrees of Freedom" is the terminology that any student of statistics
should recognise. It means that you have "x" number of ways to switch the
order of the individual items of "data" without changing the "meaning".
Thus throwing an eleven with two dice has one degrees of freedom, since I
can get it with dice A=5, dice B=6 or I can swithc the order to A=6, B=5. In
contrast, throwing a twelve has no degrees of freedom, since I can only get
it one way.
If you do an experiment with a pair of dice, you will find that the most
common result is 7, which is the result with the most degrees of freedom
(5), since there are 6 ways to get it.
One can give many similar examples using Roger's pack of cards.
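If anyone wants to check that arithmetic, a few lines of Python will
enumerate it. This is only a sketch; counting "degrees of freedom" as
ways-minus-one is my own mapping of the informal usage above onto code:

  from itertools import product
  from collections import Counter

  # Enumerate all 36 equally likely ordered outcomes of two dice.
  counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))

  for total in sorted(counts):
      ways = counts[total]
      print(f"total {total:2d}: {ways} way(s), {ways - 1} degree(s) of freedom")

  # Total 7 has 6 ways (5 degrees of freedom), 12 has 1 way (0 degrees),
  # 11 has 2 ways (1 degree) - the most likely total is also the "freest".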
> JONATHAN (from your website):
>
> The laws of diffusion and the gas laws provide a useful and definitive way
> of describing the behaviour of populations of molecules. Yet, the
> "obedience" of the population to those rules is no more than an expression
> of the totally random movements and collisions of individual molecules
> [snip]
>
> ROGER:
>
> Slow down..... Pressure, volume and temperature (and entropy) are all
> emergent statistical averages of the population based upon the specific
> context of the experiment. They are patterns that emerge from statistical
> random interaction.
YES. That's my point. The "statistical average" is a summary of the data,
i.e. a pattern.
Populations considered to vary randomly are often described by averages and
standard deviations, and the actual individual scores subsequently ignored.
This is absolutely necessary since, to paraphrase Pirsig, if you can't
summarize the data, there's not much else you can do with it!
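As a trivial illustration of that kind of summary (a sketch only, with
made-up numbers): the individual values below vary "randomly", yet two
summary statistics are all we keep, and they are the pattern we predict
with:

  import random
  import statistics

  # 10,000 "individual scores" drawn at random
  scores = [random.gauss(100.0, 15.0) for _ in range(10_000)]

  mean = statistics.mean(scores)
  sd = statistics.stdev(scores)
  print(f"mean = {mean:.1f}, standard deviation = {sd:.1f}")

  # The individual scores are then ignored; the (mean, sd) pair is the
  # summary - the pattern - that gets carried forward and used.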
> But the context of the experiment is totally contrived.
> We define the experiment. We get the gas, build the container, put the
> gas in the container and set up the context in which volume or temperature
> or disorder have meaning.
All experiments are contrived! But they are drawn from practical experience.
Thermodynamics came to prominence while engineers were building steam
engines. Furthermore, as I said in my essay, the formulation of the gas law
equations was entirely empirical - the thermodynamic explanations came much
later.
>
> We are creating a pattern from our experiment that emerges statistically
> out of the unpatterned interactions reacting with our test conditions. Do
> you agree?
Yes, and by inference we use the patterns for prediction, in designing steam
engines, airbags for cars etc.
>
> JONATHAN:
>
> The above contradiction may in fact stem from two inherently contradictory
> world views, which are both incorporated in scientific theory.
> Newton's mechanics regards matter as inherently stable, following constant
> trajectories which only change in response to changes in external forces.
> Thermodynamics says that all matter has an inherent tendency to dissolve
> into disorder unless it is somehow held back.
>
> ROGER:
>
> Hmmm...... I would agree with your take on Newton, but would add that
> complexity theory adds that our knowledge of the interactions and
> trajectories is imperfect, and hence we lose the ability to pattern or
> predict anything about individual particles. Complexity overwhelms our
> ability to pattern. However, statistics comes to our rescue and new
> emergent qualities (volume, pressure, etc.) can be patterned.
The philosophical issue is whether the limitation on our knowledge is
practical (we are overwhelmed) or theoretical (we can NEVER know enough). As
far as science is concerned, it makes no practical difference. We are always
doomed to work with a finite set of "facts" and extrapolate from there.
> JONATHAN:
>... randomness cannot be considered an objective property ...
> ROGER:
>
> Agree.
That is a key point. I hope that it is clear to everyone else.
>
> JONATHAN:
>
> Einstein firmly held that there must be causal mechanisms that determine
> the statistical distributions. [snip] Quantum mechanics has no need for
> the classical causal mechanism Einstein sought, and thermodynamics has no
> need for a classical analysis of individual molecular movements. In both
> cases, statistical considerations provide a perfectly adequate starting
> point.
>
> ROGER:
>
> Adequate to explain reality. But it didn't meet Einstein's definitions of
> the reducibility of a good quality theory. The highest quality
> interpretations of reality are not ultimately reductionist or
> deterministic. Albert didn't want to accept this though, did he? He
> somehow could accept it as long as he could cling to a theoretical belief
> that the individual Brownian motion could be tracked. Quantum reality
> forced him to discard that (probably incorrect) view. (Or have I misstated
> something here?)
>
I'm way too modest to declare Albert wrong. Perhaps he was right, and we
will find a deeper level of understanding. Science tends to oscillate
between statistical/empirical type views (like QM) and mechanistic views.
The discovery of "atoms" the 19th century basic building blocks of matter
was underpinned with new foundations - subatomic particles (electrons,
protons and neutrons), and that now is undermined by the discovery of
Quarks. At each stage, the pendulum swings.
I personally tend to the statistical/empirical view. However, I am against
the "lazy approach" of taking things as they come (i.e. simply "collecting"
statistics); instead I believe that we are obliged to try and explain
observations by looking for deeper patterns.
> JONATHAN:
>
> ....ultimately, when it comes to deciding on which definition of cause
> best suits the science of thermodynamics, the tautological description of
> chemical reactions given above makes no distinction. Instead, the cause of
> change becomes an all-encompassing concept which transcends any division
> of meaning.
>
> ROGER:
>
> What does the last sentence mean?
>
I'm probably going to have to spell that one out when I revise my essay. I
was a bit more explicit in one of my posts.
JONATHAN 9 (21/1/2000):
<<<<There is an important thermodynamics equation that every chemistry
student learns:
deltaG = deltaG_standard - RT ln([substrates]/[products])
"deltaG" is the Gibbs free energy change; when negative, the reaction
of substrates to products is expected to go forward spontaneously with
no need for any external driving force. If deltaG is positive, the
reaction would tend to go backwards. The "deltaG_standard" in the
equation is a constant related to the equilibrium ratio of substrates
and products under "standard" conditions. At equilibrium, the reaction
proceeds neither forwards nor backwards. One never needs to know if
and when a substrate molecule will transform to product, or the
reverse. However, the aggregate tendencies of the whole population of
molecules may be predicted. Thus the equation means that the reaction
moves to change the ratio of substrate vs. product towards equilibrium.
All this can be summed up in non-technical language to give a rather
amusing tautology:
"Things tend to go towards the expected state". That's what the
equation says!
That statement also nicely reconciles the apparent "deterministic" laws
of thermodynamics with the non-deterministic principle of empiricism.
>>>>
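To put rough numbers on that tautology, here is a short Python sketch of
the same equation. The concentrations and the standard free energy value
are invented purely for illustration:

  import math

  R = 8.314               # gas constant, J/(mol K)
  T = 298.0               # temperature, K
  dG_standard = -3000.0   # hypothetical standard free energy change, J/mol

  def delta_g(substrates, products):
      """deltaG = deltaG_standard - RT ln([substrates]/[products])"""
      return dG_standard - R * T * math.log(substrates / products)

  print(delta_g(substrates=1.0, products=0.01))  # negative: goes forward
  print(delta_g(substrates=0.01, products=1.0))  # positive: goes backward

In both cases the population of molecules simply shifts the ratio towards
the same equilibrium - towards "the expected state".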
I have to stop now. I hope that the content justifies this unusually long
post.
Otherwise, I apologise.
Jonathan
MOQ.ORG - http://www.moq.org
Mail Archive - http://alt.venus.co.uk/hypermail/moq_discuss/
MD Queries - horse@wasted.demon.nl
To unsubscribe from moq_discuss follow the instructions at:
http://www.moq.org/md/subscribe.html