From: Scott Roberts (jse885@earthlink.net)
Date: Sun Sep 19 2004 - 22:12:30 BST
Mel,
> Scott said:
> > My guess is that the individual animal does not model the physical,
> > rather it is done for them by what we call instinct. So the bigger
> > picture, for example as conjectured by Rupert Sheldrake's
> > morphogenetic forms, is intellect: the animal perceives a physical
> > picture, and instinct, using its model, tells the animal what to do,
> > but how the whole pattern plays out will be fed back to instinct to
> > improve its model, for the individual animal (hence it can learn) and
> > for the species (hence it can evolve).
>
> mel:
> Animals model beyond instinct - predators must learn skills
> beyond what is innate, which is one reason why many species
> of predators have such high mortality.
[Scott:] Didn't I mention the individual animal learning? It is the same
with us: once we have learned to ride a bicycle, we say our ability is
instinctual -- we do it without conscious thinking, but that doesn't mean
there isn't subconscious information processing going on.
> Scott said:
> > One has language when one has Peirce's thirdness: a semiotic event
> > involves three nodes: the manifested sign, that which the sign refers
> > to, and an interpretant, as he calls it: that which connects the sign
> > to its referent. You cannot make a thirdness out of seconds (two
> > things colliding, for example). Therefore, since there is thirdness,
> > there must have always been thirdness (unless one invokes God to
> > create it out of nothing).
>
> mel:
> Don't mistake the formulation of semiotics as a field
> of study, specifically of symbols, which is just one type of
> highly attributional meaning with the far more common and
> older function of "mapping meaning".
As Peirce used the term, one has a semiotic situation whenever one has
meaning, also known as value, so he is not just talking about human
language. It was only later, in the 1930s, that the term came to name a
field of study.
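Schematically, and just as a toy rendering of my own (Python, not anything
in Peirce), a semiotic event is an irreducibly three-place relation:

from dataclasses import dataclass

@dataclass
class SemioticEvent:
    sign: str          # the manifested sign, e.g. the word "fire"
    referent: str      # that which the sign refers to, the fire itself
    interpretant: str  # that which connects the sign to its referent

event = SemioticEvent(
    sign='the word "fire"',
    referent="the fire itself",
    interpretant="the habit of taking the word to mean the fire")

The point of calling this a thirdness is that none of the three slots can
be dropped, or rebuilt out of two-place collisions.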
>
> For most of what passes in consciousness, semiotics
> is irrelevant, but the perception of meaning and the
> processing of data into information IS a measure of
> the continuum of stored effect...
See above: any situation involving meaning, which is any situation
according to the MOQ, is a third, is semiotic. Otherwise, it would have no
value.
> > > mel previous:
> > > It's just that runaway intellection is a bit like
> > > runaway cellphone use or conversational
> > > chatter, it gets in the way of the movie.
> > > Philosophy is one brand of cellphone.
> >
> Scott said:
> > On the contrary, philosophy, like science and meditation, is
> > attempting to control intellect, so that it does not run away. As
> > intellectuals, we are all beginners. The intellectual level is new,
> > only two and a half millennia old. Occasionally we get glimpses of
> > how it is supposed to be, and we call those glimpses genius, but for
> > most of us, there is a lot of disciplining ahead.
>
> mel:
> Not "on the contrary", you've simply pivoted into another
> direction of discussion.
You said that philosophy is "one brand of cellphone", and cellphone use
gets in the way of the movie. I say that philosophy is the movie, or one of
them. Even if one disagrees with my view that Intellect is all-pervasive,
in the MOQ, intellect is the fourth and currently highest level of SQ. So
Quality evolution should be directed at improving and refining intellect;
that is the moral thing to do.
> Scott said:
> > No. No matter how complex a set of interconnections, one cannot, from
> > that alone, get awareness. It also requires violating the rules of
> > space and time. Unless that happens, there is nothing that can be
> > aware of anything larger than itself, or one blip from something else
> > (in fact, one cannot even get that, since the single thing (e.g., an
> > electron) needs a way to combine the state of not receiving the blip
> > with the state of receiving the blip). So if the brain is considered
> > to be all and only its neurons and synapses, and if one limits
> > oneself to their spatio-temporal activity, one cannot get
> > associations, or mappings, or anything. You can get all sorts of
> > patterns, but you cannot get awareness of those patterns. Now the
> > brain is biological, so there could well be some goings-on that allow
> > the violation of spacetime rules, but once one admits that, there is
> > no reason to assume any emergence doctrine.
> mel:
> 1) You've let the definition of awareness slip through your
> fingers from earlier. Elvis has left the building...
Umm, what definition of awareness? I don't have one. And I don't get the
Elvis bit.
> 2) The complex of neuronal connection does not cause
> awareness, but does support an ever more complex capacity
That's what my metronome analogy tried to say. So what does cause
awareness, or are we in agreement that nothing causes it, that it is
fundamental? If so, why all this other appeal to emergence?
> 3) no rules of "space or time" are violated -- odd comment,
> why did that pop up?
The naive rules, as used in physics, according to which time is a
continuous succession of point-instants. Awareness of a succession could
not occur if that were true.
> 4) Reductionism as above fails in the analysis of the complex.
So say the nonreductive physicalists. I'm not sure if you are one or not,
but you are certainly sounding like one. My view is: why be a physicalist
at all? And if you're not, there is no need to appeal to complexity at all.
> 5) As you allude, all systems operate within a meta-system;
> to perceive any number of dimensions in an array, for example,
> you must have one more degree or dimension than is to be
> manipulated.
> 6) Don't confuse the mind with the brain.
I certainly am not. In fact, my original comment here was precisely to show
that the mind cannot be the brain (considered as a spatio-temporal
thing/process).
> 7) emergence is simply unanticipated complexity of behavior
> arising from a comparatively simple rule set. It's not a doctrine
> of some magical sort. The games of chess and GO are
> emergent games, checkers and tic-tac-toe are not.
> (brains are also...big time.)
Spoken like a nonreductive physicalist, unless your "comparatively simple
rule set" is non-spatio-temporal, is semiotic, has value. Do you think a
digital computer can be conscious if, say, the spatio-temporal activity of
each neuron in a human were duplicated in silicon? I argue that that is
impossible, since every event in such a machine (every on/off switch) is
separated by space and/or time from every other one. There can be no
conscious phenomenon, since nothing bigger than an on/off signal can be
grasped at once.
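Just to be concrete about "a comparatively simple rule set", here is a toy
sketch of my own (Python; it has nothing to do with Pirsig or Peirce, and
it is obviously not a model of a brain): Rule 110, an elementary cellular
automaton. The rule table is about as simple as a rule set can get, yet
the long-run behaviour is rich enough that Rule 110 is known to be
Turing-complete -- emergence in your sense. Notice, though, that every
cell update is a discrete event, separated in space (its index) and time
(its generation) from every other, which is exactly the sort of system I
am claiming cannot, by itself, add up to awareness of the whole pattern.

def rule110_step(cells):
    """Apply Rule 110 once to a list of 0/1 cells (the edges wrap around)."""
    rule = {(1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
            (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0}
    n = len(cells)
    return [rule[(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])]
            for i in range(n)]

width = 64
cells = [0] * width
cells[width // 2] = 1          # start from a single "on" cell

for generation in range(32):
    print(''.join('#' if c else '.' for c in cells))
    cells = rule110_step(cells)

Run it and you get the familiar interlocking triangles; but the pattern is
a pattern only to whoever reads the printout, not to any of the cells.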
- Scott