

What is “meaning” in questions such as: what is the meaning of life? It is the same as asking what is the truly real significance of life. Any answer is only theoretical.  Intuitively, any answer must be universal.  The truly real significance must, by definition, be significant for everyone.

That makes the notion appear to be either exaggerated or rather improbable.  The universality of such a theory of meaning would rest on the multitude of “real” things that are perceived by the theory as salient, pertinent properties and relations in “real life” and to humanity in general.  It would have to include everything we can imagine in experience.  How could it be possible?

This would also make it necessary to correspond with every “real” experience, in just enough dimensions (and no more) to make such experience “really” meaningful.  Intuitively, it must capture or cover any continuous or discrete distributions or extensions of “real” natural structure, elements or processes, in three dimensions of space and one dimension of real presence or immediate existence, x.

It is very complex but not impossible.  On the one hand, one cannot help but wonder how to deal with such complexity.  On the other hand, we notice that very young children do it. Four-year-old children seemingly adapt to complexity with very little problem.  It is the sophistication and obfuscation that come later in life with which they have problems.  At four, children are already able to tell the difference between sensible and nonsensical distributions and extensions of reality, irrespective of whether they are of the continuous or discrete variety.

These continuous or discrete distributions and extensions bear some additional explanation, mainly due to their overarching significance in this context. First, they establish a direct correspondence with our most immediate reality. Every time we open our eyes, we see a real distribution of colored shapes.  Such a real distribution is nature’s way of communicating its messages to consciousness, via real patterns.

Second, perceived distribution patterns directly suggest the most fundamental ontological concept in theoretical physics: a field configuration, which in the simplest example of a scalar field can be likened to a field of variable light intensity.  That life is intense and that meaning is intense is not something one ought to have to prove to anyone. I will come back to intensity in another post, as I want to continue commenting on presence or real and immediate existence x. We must, in practice and in effect, solve for the real meaning of x, as you can see.

Meaning in this case, so defined, is literally the significance of truth, or more appropriately, what one interprets as significant or true within the dimensions of intense messages or information pertaining to real life as specified above. So we must begin, undoubtedly, by defining what true is; then, proceeding to the next step, we ought to define the elements and structure of one’s interpretation of this truly significant nature of life. I did it a little backwards in this respect, and this has always created a bit of confusion that I did not see until recently.

One begins any such analysis by examining a subject’s real elements and structures. For the subject of truth, one also searches the literature, where it is well represented. Such a search conducted on the subject of truth brings up a broad range of ideas. To try to make a taxonomy of ideas from the varied opinion found there would turn out to be an exercise in incoherence, but it ought to be acceptable to reference some theories and practices that have been adopted.

Ibn Al-Haytham, who is credited with the introduction of the Scientific Method in the 10th century A.D., believed, “Finding the truth is difficult and the road to it is rough. For the truths are plunged in obscurity” (Pines, 1986, “Ibn al-Haytham’s critique of Ptolemy,” in Studies in Arabic Versions of Greek Texts and in Medieval Science, Vol. II, Leiden, The Netherlands: Brill, p. 436). While truths are obscured and obfuscated, there can be no doubt that the truth does exist and that it is there to be found by seekers. I do not accept views or opinions that the average layman is too stupid, or otherwise not equipped, to figure it out on his or her own.

The Modern Correspondence Theory of Truth.

While looking for the truth, it helps to know what shape it takes, or what it may look like when one happens upon it or finds it lying around and exposed to the light. According to some, truth looks like correspondence between one thing or element and another. Scientists have long held a correspondence theory of truth. This theory of truth is, at its core, an ontological thesis.

It means that a belief (a sort of wispy, ephemeral, mostly psychological notion) is called true if, and only if, there exists an appropriate entity, a fact, to which it corresponds. If there is no such entity, the belief is false. So you see, as we fixate on the “truth of a belief” (a psychological notion such as a thought of something, to be sure, but some concrete thing nonetheless), we see that one thing, a belief, corresponds to another thing, another entity called a fact. The point here is that both facts and beliefs are existing, real entities; even though they may also be considered psychological or mental notions, beliefs and ideas are reality.

While beliefs are wholly or entirely psychological notions, facts are taken to be much stronger entities. Facts, as far as neoclassical correspondence theory is concerned, are concrete entities in their own right. Facts are taken to be composed of particulars and properties and relations or universals, at least. But universality has turned out to be elusive and the notion is problematic for those who hold personal or human beliefs to be at the bottom of truth.

Modern theories speak of “propositions,” which may not be any more real, after all. As Russell later says, propositions seem to be at best “curious shadowy things” in addition to facts (Russell, Bertrand, 1956, “The Philosophy of Logical Atomism,” in Logic and Knowledge, R. C. Marsh, ed., London: George Allen and Unwin, pp. 177-281, at p. 223; originally published in The Monist in 1918). If only he were around now, one can only wonder how he might feel or rephrase.

In my view, the key features of the “realism” of correspondence theory are:

  1. The world presents itself as “objective fact” or as “a collection of objective facts” independently of the (subjective) ways we think about the world or describe or propose the world to be.
  2. Our (subjective) thoughts are about the objective fact of that world as represented by our claims (facts) which, presumably, ought to be objective.

(Wright (1992), quoted at the SEP, offers a nice statement of this way of thinking about realism.) This sort of realism, together with representationalism, is rampant in the high-tech industry.  Nonetheless, these theses are seen to imply that our claims (facts) are objectively true or false, depending on the state of affairs actually expressing or unfolding in the world.

Regardless of one’s perspective, metaphysics or ideals, the world that we represent in our thoughts or language is a socially objective world. (This form of realism may be restricted to some social or human subject-matter, or range of discourse, but for simplicity, we will talk only about its global form as related to realism above.)

The coherence theory of truth is not much different from the correspondence theory in this context. Put simply, in the coherence theory of truth, a belief is true when we are able to incorporate it in an orderly and logical manner into a larger and presumably more complex web or system of beliefs.

In the spirit of American pragmatism, almost every political administration since Reagan has used the coherence theory of truth to guide national strategy, foreign policy and international affairs. The selling of the War in Iraq to the American people is a study in the application of the coherence theory of truth to America’s state of affairs as a hegemonic leader in the world.

Many of the philosophers who argue in defense of the coherence theory of truth have understood “Ultimate Truth” as the whole of reality. To Spinoza, ultimate truth is the ultimate reality of a rationally ordered system that is God. To Hegel, truth is a rationally integrated system in which everything is contained. To the American Bush dynasty, in particular to W., truth is what the leaders of their new world order say it is.  To Adi, containment is only one of the elementary processes at work creating, enacting (causing) and (re)enacting reality.

Modern scientists break the first rule of their own skepticism by being absolutely certain of information theory.

Let me be more specific.  Modern researchers have settled on a logical definition of truth as a semantic correspondence by adopting Shannon’s communications theory as “information” theory. Object-oriented computer programmers who use logic and mathematics understand truth as a Boolean table, and correspondence as per Alfred Tarski’s theory of semantics.
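
By way of illustration only (the function and the sample proposition below are my own, and are not drawn from Tarski or from Adi’s work), a programmer’s “truth as a Boolean table” might look something like this in Python:

    from itertools import product

    def implies(p: bool, q: bool) -> bool:
        """Material implication: false only when p is true and q is false."""
        return (not p) or q

    # Enumerate the Boolean table for the proposition (p and q) -> p.
    print(" p      q      (p and q) -> p")
    for p, q in product([True, False], repeat=2):
        print(f" {p!s:<7}{q!s:<7}{implies(p and q, p)}")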

Modern computer engineers have adopted Shannon’s probabilities as “information theory” even though, on the face of it, the probabilities that form such an important part of Shannon’s theory are very different from messages, which stand for the kinds of things we most normally associate with objects. However, to his credit, the probabilities on which Shannon based his theory were all based on objective counting of relative frequencies of definite outcomes.
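
As a small sketch of what such objective counting of relative frequencies amounts to in practice (the sample sentence and function name below are mine, purely illustrative), one can estimate Shannon’s entropy in bits directly from letter counts:

    from collections import Counter
    from math import log2

    def entropy_bits_per_letter(text: str) -> float:
        """Estimate Shannon entropy from the relative frequencies of letters."""
        letters = [c for c in text.lower() if c.isalpha()]
        counts = Counter(letters)
        total = len(letters)
        # Each letter's relative frequency serves as its probability estimate.
        return -sum((n / total) * log2(n / total) for n in counts.values())

    sample = "finding the truth is difficult and the road to it is rough"
    print(f"{entropy_bits_per_letter(sample):.3f} bits per letter")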

Shannon’s predecessor, Samuel Morse, based his communication theory, which enhanced the speed and efficiency with which messages could be transmitted, on studying frequently used letters. It is the communications theory I learned while serving in the United States Army. It was established by counting things, objects in the world: the numbers of the different letter types in the printer’s box.

When I entered the computer industry in 1978, I was somewhat astonished that Shannon’s theory of communications was already established in the field of information science, before word processors and “word” processing were common. I confirmed that belief by joining with information scientists for a while, as a member of the American Society for Information Science (ASIS).

While at ASIS, I found out that Shannon’s probabilities also have an origin in things, much like Morse code, although they ought in no way be considered symbols that stand for things. Instead, Shannon’s probabilities stand for proportions of things in a given environment.

This is just as true of observationally determined quantum probabilities (from which Shannon borrowed on the advice of the polymath John Von Neumann) as it is for the frequencies of words in typical English, or the numbers of different trees in a forest, or the countable blades of grass on my southern lawn.

Neither Morse code, nor Shannon’s communications theory, nor any “information” theory, save Adi’s, directly addresses the “truth” of things in or out of an environment. The closest any computer theory or program gets to “interpretation” is by interpreting the logical correspondence of statements with respect to other statements, both with respect to an undefined or unknown “meaning”: the truth or significance or unfolding of the thing in the world. It takes two uncertainties to make up one certainty, according to Shannon and Von Neumann, who had two bits of uncertainty, 1 and 0, searching for, or construing, a unity.

That is not us. That is not our scientific program. Our program was not to construe a unity, or “it” from “bit.”  That is the program of the industry, because, almost like clocks, everyone in industry marches in lock step by step, tick by tock, take-stock.

Adi began with the assumption that there is an overarching unity to “it.” He then studied how a distribution of signs of “it” (i.e., symbols that make up human languages describing “it”) manages to remain true to the unity of “it,” despite constant change. Such change, it can be argued, arrives in the guise or form of uneven or unequal adoption, selection, and retention factors, as seen in the overwhelming evidence of a continuous “morphogenesis” in the formation, change and meaning of words, facts and other things, over eons.

To determine how people interpret the intensity and sensibility or “information” projected with language by means of speech acts (with messages, composed of words), Adi investigated the sounds of symbols used to compose a real human language, at a time when most people were inventing artificial, specialized, logical and less general languages.  Adi chose to study the unambiguous sounds of Classical Arabic, which have remained unchanged for 1400 years to the present day.  That sound affects what we see is in no way incidental trivia or minutiae.

At the least, it helps truth break free of being bound to mere correspondence, a relegation reminiscent of mime or mimicry. Adi’s findings liberate truth to soar to heights more amenable, such as high fidelity, than those that burn out in the heated brilliance of spectacular failure.  In fact, in early implementations of our software we had an overt relevance measure called “fidelity” that users could set and adjust.  It speaks to the core of equilibrium that permeates this approach to conceptual modelling, analysis, searching for relevance and significance, subject and topic classification, and practical forms of text analytics in general.

Tom Adi’s semantic theory interprets the intensity, gradient trajectory and causal sensibility of an idea presumably communicated as information in the speech acts of people. This “measure” of Adi’s (or we may call it “Adi’s Measure”) can be understood as a measure of the increase in the magnitude (intensity) of a property of psychological intension (e.g., a temperature or pressure change, or a change in concentration) observable in passing from one point or moment to another. Thus, while invisible, it can be perceived as the rate of such a change.

In my view, it is in the action of amplitude, signifying a change from conceptual, cognitive or imaginative will or possibility to implementation or actualization in terminal reality. Computationally, it is, and can be used as, a vector formed by the operator ∇ acting on a scalar function at a given point in a scalar field. It has been implemented in an algorithm as an operating principle, resonating, acting and reacting (revolving, evolving) as a rule, i.e., being an operator: conditioning, i.e., coordinating and re-coordinating, a larger metric system or modelling mechanism (e.g., Readware; text analytics in general).
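
For readers who want the ∇ operator made concrete, here is a minimal numerical sketch: standard vector calculus applied to a toy scalar field of my own choosing, not a reproduction of Adi’s or Readware’s algorithm:

    import numpy as np

    # Toy scalar field f(x, y) = x**2 + 3*y, sampled on a regular grid.
    x = np.linspace(0.0, 1.0, 101)
    y = np.linspace(0.0, 1.0, 101)
    X, Y = np.meshgrid(x, y, indexing="ij")
    F = X**2 + 3 * Y

    # np.gradient approximates the partial derivatives, i.e. grad f = (df/dx, df/dy).
    dF_dx, dF_dy = np.gradient(F, x, y)

    # At the grid point nearest (0.5, 0.5) the gradient should be close to (1.0, 3.0).
    print(dF_dx[50, 50], dF_dy[50, 50])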

I mention this to contrast Adi’s work with that of Shannon, who, in order to frame information according to his theory of communications, did a thorough statistical analysis of ONLY the English language. After that analysis, Shannon defined information as entropy or uncertainty, on the advice of Von Neumann.  The communication of information (an outcome) involves things, which Shannon called messages, and probabilities for those things. Both elements were represented abstractly by Shannon: the things as symbols (binary numbers) and probability simply as a decimal number.

So you see, Shannon’s information represents abstract values based on a statistical study of English. Adi’s information, on the other hand, represents sensible and elementary natural processes that are selected, adopted and retained for particular use within conventional language (as a mediating agency) in an interpersonal or social act of communication. Adi’s information is based upon a diachronic study of the Arabic language and a confirming study in fourteen additional languages, including modern English, German, French, Japanese and Russian, all having suffered the indisputable and undeniable effects of language change, both different from and independent of the evolution of language, or the non-evolution, as it were, of Classical Arabic.

Adi’s theory is a wholly different treatment of language, meaning and information than either Shannon or Morse attempted or carried out on their own merits. It is also a different treatment of language than information statistics gives, as it represents the generation of salient and indispensable rules in something said or projected using language. It is different from NLP, or Natural Language Processing, which depends (heavily) on the ideas of uncertainty and probability.

A “concept search,” in Adi’s calculation and my estimation, is not a search in the traditional sense of matching keys in a long tail of key information.  A “concept search” seeks mathematical fidelity, resonance or equilibrium and symmetry (e.g., invariance under transformation) between a problem (a query for information) and possible solutions (i.e., “responses” to the request for information) in a stated frame or window (context) on a given information space (document stack, database).  A search is conducted by moving the window (e.g., the periscope) over the entirety of the information space in a scanning or probing motion, as sketched below.  While it ought to be obvious, we had to “prove” that this approach works, which we did in outstanding form, in NIST and DARPA reviewed performances.
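
To make the window-scanning idea concrete, the following is only a schematic sketch. It uses a plain bag-of-words cosine similarity as a stand-in for the fidelity, resonance or symmetry measure described above (which is not reproduced here); every function name and parameter is illustrative, not taken from Readware:

    from collections import Counter
    from math import sqrt

    def cosine(a: Counter, b: Counter) -> float:
        """Cosine similarity between two bag-of-words vectors."""
        dot = sum(a[w] * b[w] for w in set(a) & set(b))
        norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
        return dot / norm if norm else 0.0

    def scan(query: str, document: str, window: int = 30, step: int = 10):
        """Slide a fixed-size window over the document, scoring each window against the query."""
        q = Counter(query.lower().split())
        words = document.lower().split()
        scores = []
        for start in range(0, max(1, len(words) - window + 1), step):
            w = Counter(words[start:start + window])
            scores.append((cosine(q, w), start))
        return sorted(scores, reverse=True)  # best-matching windows first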

Adi’s theory is not entirely free of uncertainty, as it is, after all, only theoretical. But it brings a new functionality, a doctrinal functionality, to the pursuit of certainty by way of a corresponding reduction of doubt. That is really good news. In any case, this is a theory that deserves and warrants consideration as a modern information theory that stands in stark contrast to the accepted norm or status quo.
