Wednesday, September 22, 2010

Closed Time-Like Curves, Progressive Revelation, Continuous Discovery, Creative Sustainability, Free-Will, The Super-Copernican Principle

Sean Carroll On The Mysteries Of Time:

It's easy to find a dictionary definition of "time." But ask a group of theoretical physicists and the answer isn't as clear. Sean Carroll of Caltech discusses the mysteries of time in his book, From Eternity to Here: The Quest for the Ultimate Theory of Time.

"In mathematical physics, a closed timelike curve (CTC) is a worldline in a Lorentzian manifold, of a material particle in spacetime that is "closed," returning to its starting point. This possibility was first raised by Kurt Gödel in 1949, who discovered a solution to the equations of general relativity (GR) allowing CTCs known as the Gödel metric; and since then other GR solutions containing CTCs have been found, such as the Tipler cylinder and traversable wormholes. If CTCs exist, their existence would seem to imply at least the theoretical possibility of time travel backwards in time, raising the spectre of the grandfather paradox, although the Novikov self-consistency principle seems to show that such paradoxes could be avoided. Some physicists speculate that the CTCs which appear in certain GR solutions might be ruled out by a future theory of quantum gravity which would replace GR, an idea which Stephen Hawking has labeled the chronology protection conjecture. Others note that if every closed timelike curve in a given space-time passes through an event horizon, a property which can be called chronological censorship, then that space-time with event horizons excised would still be causally well behaved and an observer might not be able to detect the causal violation.[1]"

‎"Eternal return (also known as "eternal recurrence") is a concept which posits that the universe has been recurring, and will continue to recur, in a self-similar form an infinite number of times. The concept initially inherent in Indian philosophy was later found in ancient Egypt, and was subsequently taken up by the Pythagoreans and Stoics. With the decline of antiquityand the spread of Christianity, the concept fell into disuse, though Friedrich Nietzsche resurrected it.

In addition, the philosophical concept of eternal recurrence was addressed by Arthur Schopenhauer. It is a purely physical concept, involving no supernatural reincarnation, but the return of beings in the same bodies. Time is viewed as being not linear but cyclical.

According to Heidegger, it is the burden imposed by the question of eternal recurrence—whether or not such a thing could possibly be true—that is so significant in modern thought: "The way Nietzsche here patterns the first communication of the thought of the "greatest burden" [of eternal recurrence] makes it clear that this "thought of thoughts" is at the same time "the most burdensome thought."[2] The thought of eternal recurrence appears in a few of his works, in particular §285 and §341 of The Gay Science and then in Thus Spoke Zarathustra. It is also noted in a posthumous fragment.[3] The origin of this thought is dated by Nietzsche himself, via posthumous fragments, to August 1881, at Sils-Maria. In Ecce Homo (1888), he wrote that he thought of the eternal return as the "fundamental conception" of Thus Spoke Zarathustra.[4]

"History does not repeat itself, but it does rhyme." - Mark Twain

"In computer science, corecursion is a type of operation that is dual to recursion. Corecursion is often used in conjunction with lazy evaluation. Corecursion can denote both finite andinfinite data structures, and may or may not employ self-referential data structures.

The rule for primitive corecursion on codata is the dual to that for primitive recursion on data. Instead of descending on the argument by pattern-matching on its constructors, we ascend on the result by filling-in its "destructors" (or "observers"). Notice that corecursion creates (potentially infinite) codata, whereas ordinary recursion analyses (necessarily finite) data. Ordinary recursion might not be applicable to the codata because it might not terminate. Conversely, corecursion is not strictly necessary if the result type is data, because data must be finite."
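
The duality described above can be made concrete with a generator, Python's natural vehicle for codata: instead of recursing down on a finite argument, the producer repeatedly fills in the next observation of a potentially infinite result. (The Fibonacci stream is my own convenient example, not one taken from the quoted text.)

```python
from itertools import islice

def fibs():
    """Corecursively unfold the infinite Fibonacci stream.

    There is no base case to descend toward; lazy, demand-driven
    evaluation keeps the infinite codata finite in practice."""
    a, b = 0, 1
    while True:
        yield a          # "fill in" the next observation of the result
        a, b = b, a + b

# Ordinary recursion could never return this value whole; corecursion
# lets us observe any finite prefix of it.
print(list(islice(fibs(), 8)))  # -> [0, 1, 1, 2, 3, 5, 8, 13]
```

As the quote notes, running ordinary recursion on this codata would not terminate; only the act of observing a finite prefix forces any computation at all.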

Predicativity, Circularity, and Anti-Foundation

The Anti-Foundation Axiom, AFA, has turned out to be a versatile principle in set theory for modelling a plethora of circular and self-referential phenomena. This paper explores whether AFA and the most important tools emanating from it, such as the solution lemma and the co-recursion principle, can be developed on predicative grounds, that is to say, within a predicative theory of sets.

If one can show that most of the circular phenomena that have arisen in computer science do not require impredicative set existence axioms for their modeling, this would demonstrate that their circularity is of a different kind than the one which underlies impredicative definitions.
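
A minimal sketch of the kind of circularity AFA legitimizes: under AFA, the equation x = {x} has a unique solution (the Quine atom). Python containers are not well-founded by construction, so the analogous self-membership can be modeled directly; the list here stands in for a set purely for illustration.

```python
# Under the Anti-Foundation Axiom the system x = {x} has a unique
# solution. Python lists are not well-founded, so the analogous
# self-referential structure can simply be built:
omega = []
omega.append(omega)      # omega is now its own sole element

print(omega[0] is omega)           # -> True
print(omega[0][0][0] is omega)     # -> True: descent never bottoms out
```

The solution lemma generalizes this: any system of such circular equations has a unique solution, which is what makes AFA useful for modeling self-referential phenomena in computer science.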

"The vicious circle principle was suggested by Henri Poincaré (1905-6, 1908)[2] and Bertrand Russell in the wake of the paradoxes as a requirement on legitimate set specifications. Sets which do not meet the requirement are called impredicative.

The first modern paradox appeared with Cesare Burali-Forti's 1897 A question on transfinite numbers[3] and would become known as the Burali-Forti paradox. Cantor had apparently discovered the same paradox in his "naive" set theory and this became known as Cantor's paradox. Russell's awareness of the problem originated in June 1901[4] with his reading of Frege's treatise of mathematical logic, his 1879 Begriffsschrift; the offending sentence in Frege is the following:

"On the other hand, it may be also be that the argument is determinate and the function indeterminate"[5].

In other words, given f(a), the function f is the variable and a is the invariant part. So why not substitute the value f(a) for f itself? Russell promptly wrote Frege a letter pointing out that:

"You state ... that a function too, can act as the indeterminate element. This I formerly believed, but now this view seems doubtful to me because of the following contradiction. Let wbe the predicate: to be a predicate that cannot be predicated of itself. Can w be predicated of itself? From each answer its opposite follows. There we must conclude that w is not a predicate. Likewise there is no class (as a totality) of those classes which each taken as a totality, do not belong to themselves. From this I conclude that under certain circumstances a definable collection does not form a totality"[6].

Frege promptly wrote back to Russell acknowledging the problem:

"Your discovery of the contradiction caused me the greatest surprise and, I would almost say, consternation, since it has shaken the basis on which I intended to build arithmetic"[7].

While the problem had adverse personal consequences for both men (both had works at the printers that had to be emended), van Heijenoort observes that "The paradox shook the logicians' world, and the rumbles are still felt today. ... Russell's paradox, which makes use of the bare notions of set and element, falls squarely in the field of logic. The paradox was first published by Russell in The principles of mathematics (1903) and is discussed there in great detail..."[8]. Russell, after 6 years of false starts, would eventually answer the matter with his 1908 theory of types by "propounding his axiom of reducibility. It says that any function is coextensive with what he calls a predicative function: a function in which the types of apparent variables run no higher than the types of the arguments"[9]. But this "axiom" was met with resistance from all quarters.

The rejection of impredicatively defined mathematical objects (while accepting the natural numbers as classically understood) leads to the position in the philosophy of mathematics known as predicativism, advocated by Henri Poincaré and Hermann Weyl in his Das Kontinuum. Poincaré and Weyl argued that impredicative definitions are problematic only when one or more underlying sets are infinite."
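
Russell's predicate w ("to be a predicate that cannot be predicated of itself") has a computational face. Encoding predicates as functions, the question of whether w applies to itself never settles on a truth value, which is the operational form of the vicious circle. This encoding is an illustration of the idea only, not a claim about the historical formal systems.

```python
def w(p):
    """Russell's predicate: p falls under w iff p does not fall under itself."""
    return not p(p)

# Does w fall under itself? w(w) = not w(w): each answer yields its
# opposite, so evaluation regresses forever and Python's stack gives out.
try:
    w(w)
    verdict = "terminated"          # never reached
except RecursionError:
    verdict = "viciously circular"

print(verdict)  # -> viciously circular
```

The failure to terminate mirrors the letter above: from each answer its opposite follows, so no fixed truth value exists.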

"The Hebrew view of time also includes the concept that time moves from event to event in a line---not a straight line, to be sure, but towards a goal. The goal is always the future, yet the goal intended by God is always to be fulfilled in history. Bible prophecies frequently have both an immediate and a long-term fulfillment, for example. In the Bible, sins are seen to have consequences that follow inevitably, moral choices lead to measurable results for good or for ill, and history proceeds towards the definite outworking purposes of God.

"Enter by the narrow gate; for the gate is wide and the way is easy, that leads to destruction, and those who enter by it are many. For the gate is narrow and the way is hard, that leads to life, and those who find it are few." (Matthew 7:14, 15)

A consummation of the ages lies ahead, for which all else has been but a shadowy preparation. In both ancient Greek culture (among the Pythagoreans, Stoics and Neoplatonists) and in Hindu culture (especially during the Vedic period, 1500-600 BC), one runs into the concept of circular, or cyclical, time. This is sometimes symbolized by the uroboros, the snake chasing its own tail. In this view of time, the beginning leads back around to the end, and the cycle starts all over again. The Babylonians, ancient Chinese, Aztecs, Mayans, and the Norse had cyclical calendars."

Backs to the Future
Aymara Language and Gesture Point to Mirror-Image View of Time

By Inga Kiderra

"Tell an old Aymara speaker to “face the past!” and you just might get a blank stare in return – because he or she already does.

New analysis of the language and gesture of South America’s indigenous Aymara people indicates a reverse concept of time.

Contrary to what had been thought a cognitive universal among humans – a spatial metaphor for chronology, based partly on our bodies’ orientation and locomotion, that places the future ahead of oneself and the past behind – the Amerindian group locates this imaginary abstraction the other way around: with the past ahead and the future behind."

"Nietzsche’s deepest interest and admiration for the Persians manifest themselves where he discusses their notion of history and cyclical time. This Persian concept of time resembles to some degree his own concept of the circle of the Eternal Recurrence, expressed in a highly poetic and dramatic manner in his Zarathustra. Through this concept Nietzsche emphasizes the cyclical nature of cosmic time and the recurrence of all beings in every “circle”: “I must pay tribute to Zarathustra, a Persian (einem Perser): Persians were the first to have conceived of History in its full extent” (Sämtliche Werke, XI, p. 53). In this fragment Nietzsche uses the Persian word hazār referring to the millennial cycles (hazāra) in ancient Persian religious beliefs, “each one presided by a prophet; every prophet having his own hazar, his millennial kingdom.” In Also Sprach Zarathustra, he speaks of the great millennial (“grosser Hazar”) kingdom of his own Zarathustra, as “our great distant human kingdom, the Zarathustra kingdom of a thousand year,” (“Das Honigopfer”[The Honey Sacrifice,] Part IV)."

‎"Time and the Containment of Evil in Zoroastrianism

As Baha'is we hold as one of our essential principles the Oneness of Religion. Yet our conception of what a religion consists of (a Prophet bearing a Book revealed by God, containing the moral basis of how people ought to live) was born largely within the confines of the Middle East and forms part of the religious construct that Marshall Hodgson, the great Islamicist, termed the Irano-Semitic tradition.

There are a number of common elements to be found within this overall tradition, which includes Zoroastrianism, Judaism, Christianity, Islam and the Babi/Baha'i Faith. They all share the belief that there is one powerful and good Creator of the cosmos, both physical and spiritual, not reducible to any image, visible or mental, who has revealed the path for humans to follow. While God is usually conceived to be singular, the primary issue is not that there is One God, but rather that there is One Truth. When humans fail to follow this divinely revealed path, they and the cosmos itself suffer the consequences of turning away from God. Why do human beings go astray? The usual explanation within these traditions is that there lies an evil force within the cosmos which seeks to draw us away from God. This evil force is often personified as the devil, Satan or Ahriman. Because of this emphasis on the singularity of Truth and the belief in the power of the demonic, religions within the Irano-Semitic tradition have a less attractive reputation for being intolerant. Polytheistic systems, on the other hand, are quite tolerant: I'll worship your gods if you'll worship mine. (The Baha'i Faith succeeded in overcoming this *bane* of prophetic religion by (1) asserting the universality of revelation, (2) affirming the relativity of all religious truth, and (3) largely rejecting the belief in the demonic.)
Tradition has it that during the final 3000 years after Zoroaster, three Sayoshants or Benefactors will be born at 1000-year intervals. The origins of this belief are unclear, but passages in the Gathas themselves do suggest that He taught that after Him would come "the man who is better than a good man- the one who will teach us for the physical existence and for that of the mind, the straight paths of salvation to the true things with which Ahura Mazda dwells-- who is faithful and resembles You, O Mazda." But later legends spoke of this series of three saviors who would each be born of virgins, miraculously impregnated with the seed of Zoroaster, which has been preserved in Lake Kasaoya.
In this final three thousand years, lineal time has become somewhat of a spiral; there is a sense of what Widengren termed "cyclic revelation." The messenger comes, things get better for a time, but, until the coming of the last Sayoshant, they again start to decline. While these cycles appear similar to those common in other religious traditions, note that there remains a directionality to all this. We are not spinning our wheels going nowhere. This spiral conception of time and revelation will be echoed in Ismaili Shi'ism as well as the Baha'i concept of progressive revelation.

Abdu'l-Baha interpreted this prophecy:

"the first Dispensation to which it refers is the Muhammadan Dispensation during which the Sun of Truth stood still for ten days. Each day is reckoned as one centry. The Muhammadan Dispensation must have, therefore, lasted no less than one thousand years, which is precisely the period that has elapsed from the setting of the Star of the Imamate to the advent of the Dispensation proclaimed by the Bab. The second Dispensation referred to in this prophesy is the one inaugurated by the Bab Himself, which began in the year 1260 A.H. and was brought to a close in the year 1280 A.H. As to the third Dispensation--the Revelation proclaimed by Baha'u'llah--inasmuch as the Sun of Truth when attaining that station shineth in the plenitude of its meridian splendor its duration hath been fixed for a period of one whole month, which is the maxiumum time taken by the suns to pass through a sign of the Zodiac. From this thou cants imagine the magnitude of the Baha'i cycle--a cycle that must extend over a period of at least five hundered thousand years." WOB pp. 101-102.

After Iran fell under Islamic domination, the expectation of the Final Renovation came to be replaced by hopes of a liberator who would re-establish the "Good religion" and the political autonomy of Iran. This figure came to be known as Shah Bahram, said to be a descendant of the last Sassanian king, Yazdigird. Since the remnant of the Sassanian house was said to have fled eastward, the popular belief was that he would come from China or India to re-establish Iran's independence and reunite the spheres of prophethood and kingship, which had been split asunder at the time of Jamshid. Eventually the figure of Shah Bahram came to eclipse all other images of the Sayoshants. By the time of Baha'u'llah, Shah Bahram came to be equated with the Sayoshant.


The Zoroastrian conception of time, whether lineal or spiral, gave value to the present unrepeatable moment and endowed every act of humanity in history with ultimate meaning. More importantly, it gave hope for the final defeat of the forces of darkness and the Renovation of the world in which we live. Where Zoroastrianism did not spread as a religion, these ideas came nonetheless to penetrate the entire Abrahamic tradition. Even Christianity, which was more inclined to devalue the physical world and hope for a heavenly reward, carried forward this vision in its prayers: "Thy Kingdom come, Thy Will be done, on earth as it is in Heaven." Within the Baha'i Faith we see the culmination of these hopes as we endeavor to carry out that Final Renovation foretold so long ago."

Abdul Baha explained that what is meant by the “return is not return of the essence, but that of the qualities; it is not the return of the Manifestation, but that of the perfections.”

Additionally, Abdul Baha explained why the concept of reincarnation, as it exists in Eastern philosophies, is inaccurate. He used analogies of the natural world. Abdul Baha stated that everything is evolving and changing, and that uniqueness exists in all living things. For example, when the leaves on the trees die and fall off in autumn, and then when the leaves return in the spring, they are not the same exact leaves. They may be green and have the same attributes; however, they are not exactly the same as the leaves that were on the tree in the previous season. In this way, repetition of the species exists, but the complete replication of one representation of that species is impossible.

Shoghi Effendi also addresses this topic in an excerpt in “Lights of Guidance”: “Spiritually we must develop here what we will require for the life after death...It is not necessary for us to come back and be born into another body in order to advance spiritually and to grow closer to God.”

As Baha’is we need to be mindful that we must respect other people’s beliefs, even if we don’t agree or understand them. Abdul Baha prefaced his commentary on this subject by saying, “The object of what we are about to say is to explain the reality-not to deride the beliefs of other people; it is only to explain the facts; that is all. We do not oppose anyone’s ideas, nor do we approve of criticism.”

"Although Baha'is recognize that, literally, scientifically, humans are descended from animals, they recognize that humans are more than just hairless apes. Humans have an intellectual and spiritual dimension that separates them from animals, just as they share the physical dimension with animals.
The Baha'i perspective unifies the concepts of divine creation and evolution by natural selection, two concepts that other traditions cannot bring themselves to unify. The Baha'i view of knowledge offers a unique perspective on religion and science in our lives.

We need, Baha'is teach us, both religion and science to understand the theory of evolution as well as to progress as individuals and as a society. Science gives us the ability, as religion gives us the reason, to aspire to a just and peaceful world. At the risk of putting my words (and I am not a Baha'i, just someone who has a great respect for that tradition) into their mouths, it seems to me that on the issue of evolution, Baha'is would say that where we are going is more important than where we have been."

Religion in the future global civilization: globalization is intensifying religious conflicts. What will happen in the years ahead?

"First, just as the goal of the predominant Asian religions is liberation from reincarnation, monotheism itself needs to be liberated from the boundaries that each of the Abrahamic religions has erected around it.

What is needed is an expanded definition of monotheism that opens the door for constructive dialogue and leads to a new way of thinking about God's multireligion revelations as combined insights. This new form of monotheism might be called "radical monotheism." It might allow the devoted followers of all the Abrahamic faiths to see themselves as spiritual and moral partners who are equal in revealing truths about God. Their combined revelations would provide a more comprehensive understanding of God's purposes for humanity than the specific revelations of any single faith.

Second, radical monotheism might also recast the concept of revelation. One of the central tenets of radical monotheism might be "continuous" rather than "progressive" revelation. The idea of continuous revelation preserves many of the strengths of progressive revelation and also goes beyond them by stimulating greater openness to interfaith conversation. Within the Abrahamic religions, progressive revelation has been viewed as a stepladder to perfection. The concept of continuous revelation negates the belief that later revelations, even within a single tradition, are necessarily superior to earlier ones. Continuous revelation combines chronology and equality in the same way that all the pearls in a necklace are lined up next to each other and contribute equally to its beauty.

Continuous revelation implies that no single past revelation contains God's final Truth. It also assumes that God will continue to reveal new truths to humanity in the future. Accepting the concept of continuous revelation means that no religion possesses or uses God exclusively for its own purposes. It allows God to be God. Each religion's truths are part of a larger repository that includes the revealed truths of multiple religions.

Thus, even though no overarching worldview currently exists to unify the world religions, the potential to move in this direction does exist. This begins with the recognition--first by individuals, then by congregations, communities, and so on--that Truth is greater than the capacity of any single religion to grasp it in full. Starting with the idea that God does exist and employing the broad concept of radical monotheism, the Middle Eastern religions can conclude that God has provided multiple revelations, because divine transcendence surpasses any religion's capacity to contain all of Truth. If enough people are able to accept this idea, the potential exists for significant cross-religious dialogue. Thus, while it might seem counterintuitive, both the Asian religion of Jainism and Middle Eastern radical monotheism lead to the same conclusion that Truth is many, even though they start at the opposite ends of the worldview spectrum.

The parallel between Western radical monotheism and Hindu pantheism is equally striking, although once again the connection between the two is not immediately evident. Hinduism begins with the affirmation that Truth is one and there are many paths to it. This allows for widespread diversity among Hindus regarding their choice of worship rituals and deity images. Radical monotheism presupposes that the multiple revelations of the Middle Eastern religions in combination convey insights into the nature of God to a far greater extent than those of any single tradition. When the theological ground is shifted from exclusivist monotheism to radical monotheism, it is but a small step to recognize that God has provided multiple revelations. In other words, like Hinduism, the Middle Eastern religions in total provide multiple paths to God.

In sum, the goal of bringing greater peace and justice into the global village would be well served if the devoted followers of the world religions were to (1) view their collective insights as multiple pathways to understanding ultimate reality and (2) commit to identifying through open dialogue their combined, although individually limited, truths.

One of the ironies of comparing world religions is that, despite their theological and philosophical differences, they share a common core of moral values. Unlike the challenges related to integrating monotheism, pantheism, and atheism at the worldview level, combining values across interfaith boundaries is relatively easy and straightforward. Each religion values specific virtues based on an image of ideal character development. Despite dissimilarities on issues such as gender, caste, and economic equity, the world religions share and have always shared a common core of values, such as compassion, mercy, love, kindness, and justice. "

"Baha'is believe that God sends his teachers to his school, from time to time with new lessons, to help advance the people to a higher and higher level of humanness. Trouble is, they believe, that people cling to the old school-work and the old teacher and doggedly resist accepting the new teacher and his teachings. Baha'is think of God's prophets as renovators who come from time to time to tear down walls of separation and to bring God's children together in an open-air general classroom out of their own foolishly walled-in dungeons of exclusivity and ignorance.

Below are some of the Baha'i teachings that clash head on with Islam's and provoke the Islamists to do all they can to destroy the new religion.

* The people of God. Muslims believe that they are the chosen people of Allah and recognize no other system of belief as legitimate. Baha'is believe that all people are the chosen people of God: that there is only one God, one religion of God, and one people of God, the entire human race.

* Pearls on a string. Muslims contend that Muhammad is the seal of the Prophets; that God sent his best and final messenger to mankind, and any other claimant is an imposter worthy of death. Baha'is believe that God has always sent his teachers with new and updated lessons to educate humanity and shall do so in the future. There have been numberless divine teachers in the course of human history who have appeared to various people. They say that these teachers are like pearls on a string and that Baha'u'llah is the latest, but not the last pearl.

* Independent thinking. Blind imitation is anathema to Baha'is. Baha'is believe that the human mind and the gift of reason should guide the person in making decisions about all matters. To this end, they place a premium on education and independent investigation of truth."

Hajj Hassanain Rajabali speaking about the concept of God and Evolution in Islam

What this theory describes has to do with the effect of the 'future' as well as the 'past' on the present.

‎"This essay will suggest that the emergence of a novel scientific worldview that places life and intelligence at the center of the vast, seemingly impersonal physical processes of the cosmos may offer the best hope for meeting this uniquely daunting challenge.


‎"But the great naturalist would immediately recognize that there is a crucial difference between the process of natural selection as it operated in the distant past and the novel possibilities currently open to the evolutionary process. A 21st century version of Charles Darwin would conclude that, while a vision of time’s immensity remains the vital key in reaching an understanding of evolution’s radical potential, it is a realization of the fathomless magnitude of future time and future history that is of utmost importance today.

A modern Darwin would concur with the conclusion of Princeton physicist John Wheeler: most of the time available for life and intelligence to achieve their ultimate capabilities lies in the distant cosmic future, not in the cosmic past.

As cosmologist Frank Tipler has bluntly stated, “Almost all of space and time lies in the future. By focusing attention only on the past and present, science has ignored almost all of reality. Since the domain of scientific study is the whole of reality, it is about time science decided to study the future evolution of the universe.”

Although you won’t read about it in any New York Times or Wall Street Journal headlines, the disruptive potential of future evolution is the emerging leitmotif in advanced biological theorizing today. The current ID vs. Darwinism dust-up on which the popular press focuses myopically will turn out to be a minor historical footnote to the portentous evolutionary drama that is about to reveal itself in all its unnerving grandeur.


"For purposes of the present inquiry, the key perspective is offered by what physicist John Wheeler calls the super- Copernican principle. Derived from the Copenhagen interpretation of quantum physics, this “principle rejects the now-centeredness of any account of existence as firmly as Copernicus rejected here-centeredness.”

According to this principle, the future can have at least as important a role in shaping the present moment as the past.
The most important aspect of Wheeler’s insight is not that we must embrace the specific mechanism of retroactive causation favored by Wheeler and the advocates of the Copenhagen interpretation of quantum mechanics (the retroactive impact on quantum phenomena of observer-participancy), but rather that we should be open to counterintuitive notions of causation, if they appear to be consistent with novel yet mathematically plausible accounts of physical reality.

In particular, the vision of the cosmos as a closed timelike curve that allows at least limited information flow across the putative Big Bounce threshold offers a new paradigm that may allow us to formulate radically novel theoretical possibilities concerning the origin and nature of biological information and of the specified complexity it exhibits.

According to this paradigm, the process of biological information generation can be viewed as an essentially eternal autocatalytic process in which past and future temporal states are linked in a coevolutionary relationship. The wave of causation moves from what we call the past to what we call the future and back again to the past across the Big Crunch era to a new Big Bang era without disruption (but, we shall see shortly, with possible causal filtering).

Causation defines the relationship between all points on the CTC, but the relationship of cause and effect is not temporally restricted in the sense we naively perceive.

As Wheeler put it with uncanny prescience (though with a different causal mechanism in mind), the history of the cosmos “is not a history as we usually conceive history. It is not one thing happening after another after another. It is a totality in which what happens ‘now’ gives reality to what happened ‘then,’ perhaps even determines what happened then.” Because the CTC is curved and timelike and closed and unblemished by a final singularity, each point on the CTC is, to at least a limited degree, both the cause and effect of every other point. Time flows in only one direction in this scenario, but because the CTC unites past and future at the Big Crunch threshold, the two temporal states can coevolve.

The CTC that is hypothesized to be our cosmos thus may be a classic autocatalytic set, what Wheeler ventured to call a “self-excited circuit” and a “grand synthesis, pulling itself together all the time as a whole.” The implication for the origin of biological information should be apparent: not only the universe but also the life-friendly cosmic code and indeed life itself (and the specified complexity it embodies) could conceivably be its own mother under this scenario."

‎"It From Bit: Reality educes and/or produces itself in the form of information residing in quantum events.

As Wheeler summarizes in his paper Information, Physics, Quantum: The Search for Links, “...every physical quantity, every it, derives its ultimate significance from bits, binary yes-or-no indications...”

He then goes on to discuss this concept at length, offering three questions, four “no’s” and five “clues” about the quantum-informational character of reality.

The questions are as follows:

(1) How come existence?
(2) How come the quantum?
(3) How come the “one world” out of many observer-participants?

The no’s, seductive pitfalls to be avoided in answering the three questions, include

no tower of turtles,
no laws,
no continuum,
and no space or time.

And the clues, which light the way toward the true answers, include

the boundary of a boundary is zero;
No question? No answer!;
the Super-Copernican Principle;
“consciousness” (including the quotes);
and more is different."
"The Super-Copernican Principle: Just as Copernicus displaced geocentricity with heliocentricity, showing by extension that no particular place in the universe is special and thereby repudiating “here-centeredness”, the Super-Copernican Principle says that no particular point in time is special, repudiating “now-centeredness”.

Essentially, this means that where observer-participation functions retroactively, the participatory burden is effectively distributed throughout time. So although the “bit-size” of the universe is too great to have been completely generated by the observer-participants who have thus far existed, future generations of observer-participants, possibly representing modes of observer-participation other than that associated with human observation, have been and are now weighing in from the future. (The relevance of this principle to the Participatory Anthropic Principle is self-evident.)" - Langan, PCID, 2002

Information, Physics, Quantum: The Search for Links (J. A. Wheeler)

The Constants of Nature: From Alpha to Omega, the Numbers That Encode the Deepest Secrets of the Universe (John D. Barrow)

The Intelligent Universe: AI, ET, and the Emerging Mind of the Cosmos (James Gardner)

"Q: Einstein says that gravity is a result of "mass-energy" causing a curvature in the four dimensional space time continuum. At the planck scale, (10^(-33)) centimeters, is space still continuous?, or is space discontinuous? I have read books saying space time may have holes or breaks in continuity. Are these holes related in any way to "gravitons", or reverse time causality? (Question from Russell Rierson)

A: A mathematical space is continuous if it has a metric that withstands infinitesimal subdivision. To understand what this means, one must know what a "metric" is. Simplistically, a metric is just a general "distance" relationship defined on a space as follows: if a and b are two points in a space, and c is an arbitrary third point, then the distance between a and b is always less than or equal to the sum of the distances between a and c, and b and c. That is, where d(x,y) is the distance between two points x and y,

d(a,b) <= d(a,c) + d(b,c).

If this relationship continues to hold no matter how close together the points a, b and c might be, then the space is continuous. On the other hand, where the distance concept is undefined below a certain threshold, metric continuity breaks down on that scale. Since the Planck limit is such a threshold, space is discontinuous below the Planck scale...implying, of course, that it is discontinuous, period. Not only is it "granular" in a slippery kind of way, but the grains in question are effectively without spatial extent. Because space and time are undefined below quantum limits, they no longer have extensionality or directionality. But if we interpret this to mean that anything, including causality, can "flow" in any direction whatsoever, then reverse causality is conceivable on sub-Planck scales. In fact, some theorists conjecture that on these scales, continuous spacetime becomes a chaotic "quantum foam" in which distant parts of the universe are randomly connected by microscopic "wormholes". That's pretty much the party line among physicists.

Now let's bring philosophy to bear on the issue. At one time, space was considered to consist of "ether", a quasimaterial "substance" through which physical objects were thought to swim like fish through water. But since the introduction of Einstein's Theory of Relativity, nothing material remains of empty space; although it is permeated by fields and "vacuum energy", these are merely contained by space and are not equivalent to space itself. Space has instead become a mathematical abstraction called a "tensor field" that confers relative attributes like location, direction, orientation, distance, linear and angular velocity, and geometry on physical objects and energy fields. Because empty space, as abstracted from its contents, cannot be observed and has no observable effect on anything, it is not "physical" in the usual sense.
That which is immaterial is abstract, and abstraction is a mental process that "abstracts" or educes general relationships from observations. So from a philosophical viewpoint, saying that space is immaterial and therefore abstract amounts to saying that it is "mental"...that it is to some extent composed of mind rather than matter. Although this runs against the scientific grain, it is consistent with our dominant physical theories of the very large and the very small, namely relativity and quantum mechanics. In relativity, space and time are combined in an abstract manifold called "spacetime" whose "points" are physical events that can be resolved in terms of mutual behavioral transduction of material objects, a process fundamentally similar to mentation. And quantum mechanics characterizes matter in terms of abstract, immaterial wave functions that are physically actualized by interactions of an equally immaterial nature.

What does this mean regarding the continuity of spacetime? Simply that like spacetime itself, continuity and its quantum-scale breakdown are essentially mental rather than material in character. As Berkeley observed centuries ago, reality is ultimately perceptual, and as we know from the subsequent debate between Hume and Kant, perception conforms to mental categories... categories like space and time. So rather than being purely objective and "physical" in a materialistic sense, space has a subjective aspect reflecting the profoundly mental nature of our reality. Gravitons, though subject to some of the same reasoning, are another matter.
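The triangle inequality in the quoted definition is easy to check numerically. The sketch below is my own illustration, not part of the original answer; the choice of the Euclidean metric and the floating-point tolerance are assumptions. It verifies that the inequality keeps holding as the test points are scaled down, which is the "withstands infinitesimal subdivision" property the answer describes:

```python
import itertools
import math

def euclidean(p, q):
    """Euclidean distance, a standard example of a metric."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def satisfies_triangle_inequality(d, points):
    """Check d(a,b) <= d(a,c) + d(b,c) for every ordered triple of points."""
    return all(
        d(a, b) <= d(a, c) + d(b, c) + 1e-15  # small tolerance for rounding
        for a, b, c in itertools.permutations(points, 3)
    )

# The inequality keeps holding however close together the points are drawn.
for scale in (1.0, 1e-6, 1e-12):
    pts = [(0.0, 0.0), (scale, 0.0), (scale / 2, scale)]
    assert satisfies_triangle_inequality(euclidean, pts)
print("triangle inequality holds at every tested scale")
```

Below a physical threshold like the Planck scale, the quoted argument goes, `d` is simply undefined, so no such check can even be posed there.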

Q: Does the CTMU allow for the existence of souls and reincarnation?

A: From the CTMU, there emerge multiple levels of consciousness. Human temporal consciousness is the level with which we're familiar; global (parallel) consciousness is that of the universe as a whole. The soul is the connection between the two...the embedment of the former in the latter.

In the CTMU, reality is viewed as a profoundly self-contained, self-referential kind of "language", and languages have syntaxes. Because self-reference is an abstract generalization of consciousness - consciousness is the attribute by virtue of which we possess self-awareness - conscious agents are "sublanguages" possessing their own cognitive syntaxes. Now, global consciousness is based on a complete cognitive syntax in which our own incomplete syntax can be embedded, and this makes human consciousness transparent to it; in contrast, our ability to access the global level is restricted due to our syntactic limitations.

Thus, while we are transparent to the global syntax of the global conscious agency "God", we cannot see everything that God can see. Whereas God perceives one total act of creation in a parallel distributed fashion, with everything in perfect superposition, we are localized in spacetime and perceive reality only in a succession of locally creative moments. This parallelism has powerful implications. When a human being dies, his entire history remains embedded in the timeless level of consciousness...the Deic level. In that sense, he or she is preserved by virtue of his or her "soul". And since the universe is a self-refining entity, that which is teleologically valid in the informational construct called "you" may be locally re-injected or redistributed in spacetime. In principle, this could be a recombinative process, with the essences of many people combining in a set of local injections or "reincarnations" (this could lead to strange effects...e.g., a single person remembering simultaneous "past lifetimes").

In addition, an individual human sublanguage might be vectored into an alternate domain dynamically connected to its existence in spacetime. In this scenario, the entity would emerge into an alternate reality based on the interaction between her local level of consciousness and the global level embedding it...i.e., based on the state of her "soul" as just defined. This may be the origin of beliefs regarding heaven, hell, purgatory, limbo and other spiritual realms.

Q: If I have interpreted you correctly, you maintain that the universe created itself. How did this come about? What existed before the Universe and when did the Universe create itself or come into being? - Celia Joslyn

A: You're asking three distinct but related questions about cosmology: how, when and as what did the universe self-create?

The universe can be described as a cybernetic system in which freedom and constraint are counterbalanced. The constraints function as structure; thus, the laws of physics are constraints which define the structure of spacetime, whereas freedom is that which is bound or logically quantified by the constraints in question. Now, since there is no real time scale external to reality, there is no extrinsic point in time at which the moment of creation can be located, and this invalidates phrases like "before reality existed" and "when reality created itself". So rather than asking "when" the universe came to be, or what existed "before" the universe was born, we must instead ask "what would remain if the structural constraints defining the real universe were regressively suspended?" First, time would gradually disappear, eliminating the "when" question entirely. And once time disappears completely, what remains is the answer to the "what" question: a realm of boundless potential characterized by a total lack of real constraint. In other words, the real universe timelessly emerges from a background of logically unquantified potential to which the concepts of space and time simply do not apply.

Now let's attend to your "how" question. Within a realm of unbound potential like the one from which the universe emerges, everything is possible, and this implies that "everything exists" in the sense of possibility. Some possibilities are self-inconsistent and therefore ontological dead ends; they extinguish themselves in the very attempt to emerge into actuality. But other possibilities are self-consistent and potentially self-configuring by internally defined evolutionary processes. That is, they predicate their own emergence according to their own internal logics, providing their own means and answering their own "hows". These possibilities, which are completely self-contained not only with respect to how, what, and when, but why, have a common structure called SCSPL (Self-Configuring Self-Processing Language). An SCSPL answers its own "why?" question with something called teleology; where SCSPL is "God" to whatever exists within it, teleology amounts to the "Will of God"."

"Infocognition is the monic substance that results from removing the Cartesian distinction between self (mind) and other (external reality)

space and time are generalized information and cognition, respectively; subjectivity is simply reflexive infocognition, i.e., an infocognitive domain for which the spatiotemporal radius is minimal and thus coherent

Enlarge the radius relative to a given conspansive layer of spacetime, and the system decoheres into subject-object interaction.

phenomenon is explained by something called "syndiffeonesis", describing a paradoxiform identity relation (where paradoxiform means "having the nature of a self-resolving paradox")

That's one of three primary metalogical principles adjoined to pure logic by the theory.

CTMU conspansion is a logical operation with simultaneous deductive and inductive aspects corresponding to directions of time.

cosmology closes around a primal event called "incoversion" at which the most general physical restriction of SCSPL syntax is spatiotemporally expressed

a physical spacetime singularity can still possess a special kind of abstract structure

CTMU rescues most of standard big bang cosmology using special dualization principles reminiscent of those utilized in membrane theory.

The universe becomes an infocognitive endomorphism; space is a logical form that evolves (literally) by logical substitution, and time the implementation of substitutive grammar.

Instead of particles being "transmitted" from one point of space to another through the familiar kinematic osmosis based on a dissociation of logic and geometry, the motion of a particle is locally expressed within its own prior image, i.e., wavefunction.

CTMU self-creative reality

NeST provides a framework in which free will and freedom can exist, but to see it, one needs to subject it to a distributed involution effecting spatiotemporal closure.

"You mention Bill Dembski’s 3-way distinction between determinacy, nondeterminacy (chance) and design. In the CTMU, this distinction comes down to the 3-way distinction between determinacy, nondeterminacy and self-determinacy, the last being associated with telic recursion and the others being secondarily defined with respect to it. Telic recursion is just another term for "metacausation"; instead of simply outputting the next state of a system, it outputs higher-order relationships between state and law (or state and syntax).Regarding the distinction between origins and evolution, not too many people are clear on it. This distinction is based on the standard view of causality, in which there seems to be a clean distinction between the origin and application of causal principles, specifically first-order Markovian laws of nature. In the CTMU, origins distribute over causes in a new kind of structure called a conspansive manifold, and are therefore not cleanly distinguishable from causality. Both are products of a higher-order process, telic recursion. To put it in simpler terms, evolution consists of events which originate in causes which originate in (teleological) metacauses. So in the CTMU, to talk about evolution is to talk about metacausal origins by ontogenic transitivity." ... "Entitled "The Resolution of Newcomb's Paradox", it utilized a (then brand new) computational model of the universe based on nested virtual realities. The model was called NeST, short for Nested Simulation Tableau. Subsequently, other papers on the CTMU were published in that journal and elsewhere, some developing its cosmological implications. This can all be documented."

"Newcomb's problem calls for one to infer, from given a set of well-defined conditions, which of two alternatives should be selected in order to maximize a certain monetary expectation. It is apparently the impression of some members that the correct solution is "obvious" unless a certain condition ("omniscience") is suspended, at which point all possible solutions are trivial conversions of unknowns into other unknowns. This, however, is where Newcomb's paradox enters the picture. The paradox evolves from higher level (meta-linguistic) consideration of mechanisms implied by the "obvious" solution, whatever that may be to a given solver; it is the upper floor of a split-level maze. The controversy exists solely among those who wander its lower corridors without being able to reach the ledges above, More's the pity, for there resides all meaning."

"The question posed by Newcomb's problem involves the computative analysis, by a predictive agency with computative characteristics, of the computative analysis undertaken by a transducer on a given input. That input is the problem itself, presented in the manner prescribed by the formulation. This situation, which defines a computative regression, is recursive and inductively extensible. The regression in turn defines the only soluble context for the higher-level "paradox" generated by the problem. This context translates as mechanism. The mechanism is a stratified automaton G containing both the predictor and its object-transducer as sub-automata. Whether "free will" is defined determlnistically as mere outside non-interference in m and d, or nondeterministically as the ability of MN to override any exogenous restriction of mn or dn, its mechanism is contained in that of G.

Logical diagonalization of the formal computational language generated by the accepting syntax of MN directly implies that certain structural aspects of G may be unrecognizable to MN. In particular, those aspects involving MN-relativized nondeterminacy, as well as those involving certain higher-order predicates of the nondistributive, nonlocal organizations involving mM and dM, are formally undecidable to it and need not be recognized directly by it with any degree of specificity. To understand why, consider the extent to which a common computer "recognizes" the extended system including its cpu, its other components, its programmers, and the environment it inhabits. In fact, it can recognize nothing that does not conform to its input-to-output transformational grammar. Even if it were self-analytic, such analysis could be limited to a schematic electronic syntax which overlooks the material elements of which it is constructed. In any case, it can make sense of nothing but strings of input translated and rearranged according to the internal stratification of its hard and soft programming.

You, your purposes, and your dependencies are undecidable to it, and so are the mechanisms by which you can predict and control its behavior. It matters not who formulates this undecidability; if the machine's internal logic is inadequate to do so, yours surely is not (currently, most mechanical acceptors are nongeneralistic, treating complementation as negation and negation as rejection; this bars the tools of diagonalization from their computations). Should it ignore your higher prerogatives, you could "diagonalize" it - if nothing extrinsic to the machine were to stop you - with a sledgehammer whose effects on it do not depend on its acceptance. By analogy, Newcomb's object-transducer MN cannot preclude G on grounds of "insensibility". Nor, for that matter, can we.

There are many self-styled experts on undecidability who have expressed the opinion that all attempts to reify Gödel's theorem along paranormal lines reflect a misunderstanding of its "real nature". Such experts are quite correct in that a misunderstanding exists, but the misunderstanding is all theirs. What the theorem forces by juxtaposing truth and derivability (or consistency and completeness) is a hierarchical stratification of classes of truth functions and the inferential syntaxes which parametrize them. This stratification follows that of G, fractionating computative reality along with the "truth" to which it corresponds.

The stratification of G induces stratum-relativizations of computative time and space. Thus, the timetype in which MN computes recognition and output is a mere subtype of that in which it is programmed. Dynamical "arrows of determinacy" which are inviolable to MN, being programmed into its accepting syntax, have no force whatsoever to the programmatic agencies themselves. This applies just as well to "metrical" restrictions embodied in the MN-syntax; these may allow MN to recognize nothing but an artificial submetric of the metric in which these agencies define their own existence. MN and its reality might consist of quanta with higher-dimensional interpretations as the termini of channels for the transmission of information between strata. Metatemporal predicates may exist with respect to which those of MN are definite only in a mutual sense; predicates which MN accepts as "before" and "after" could be the programmatic projections of "in front of" and "in back of", or any other G-consistent higher-prepositional relationships.

There can thus exist a mechanism x ⊂ G through which a predictor like ND could measure and/or control the mappings d, m ⊂ MN in ways directly insensible to MN. Where in G relative to MN would such a predictor have to be located? Precisely where access is available. Simplistically, we might characterize the predictor-MN relationship as one of proper inclusion, where it is understood that prediction is direct rather than second-hand, and programmatic in the passive and active senses. That is, a programmer mentally internalizes the structure of that which he programs, and this internalization amounts to computative inclusion. The fine structure of G, while to a degree analytic, is a matter of some complexity. For now, it will suffice to have demonstrated the possibility of x and its utility to well-situated G-subautomata. Because G is structured to allow for relativized determinacy and nondeterminacy, the solution is invariant with respect to argumentation involving mind-brain dichotomy. That is, such dichotomies reduce to distinctions of determinacy and nondeterminacy, and may be treated in kind.

Restricted dominance, which relies on probabilistic independence derived from the lower-order, localistic dynamical timetype of MN's artificially restricted "reality", is revealed under G-extension to be itself dominated by utility. That is, the subjective utility of MN forces the assimilation by dM of this entire demonstration, which disables restricted dominance and thus frees the strategic component to recognize higher patterns among observed data. The principle of restricted dominance, though valid as long as the reality of MN remains unbreached, loses all force in the presence of exodynamic influence.

Let's sum it up. You can be modeled as a deterministic or nondeterministic transducer with an accepting syntax that can be diagonalized, or complemented by logical self-negation. ND can be modeled as a metalogical, metamechanistic programmatic agency, some insensible part of whom surrounds you in a computational space including physical reality, but not limited thereto. This space is the mechanistic equivalent of the computative regression around which Newcomb's problem is essentially formulated. The existence of this space cannot be precluded by you on the grounds that you cannot directly observe it, nor can it be said by you to deny ND a mechanism of control and prediction of your thought and behavior. Additionally, you have an open-ended run of data which lowers to 1/8 the probability that ND is "just lucky". This implies that mechanism does indeed exist, and warrants the adjunction to the axioms of physics of an independent, empirical physical axiom affirming that mechanism. This then implies that ND can predict or control human thought and behavior (a somewhat weaker implication, you will notice, than "omniscience"). ND possesses means, motive, opportunity...and you. You are "possessed" by Newcomb's Demon, and whatever self-interest remains to you will make you take the black box only. (Q.E.D.)"
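The one-box conclusion can also be motivated by a routine expected-utility comparison. This sketch is mine, not part of the quoted resolution; it assumes the conventional $1,000 and $1,000,000 stakes, which the text above does not specify:

```python
def expected_value(p_correct, box_a=1_000, box_b=1_000_000):
    """Expected payoffs of the two strategies given predictor accuracy.

    One-boxing wins box_b only if the predictor foresaw it; two-boxing
    always takes box_a and gets box_b only when the predictor erred.
    """
    one_box = p_correct * box_b
    two_box = box_a + (1 - p_correct) * box_b
    return one_box, two_box

# Break-even accuracy: one-boxing dominates once
# p > (box_a + box_b) / (2 * box_b) = 0.5005 for these stakes.
for p in (0.5, 0.9, 0.999):
    one, two = expected_value(p)
    winner = "one-box" if one > two else "two-box"
    print(f"p = {p}: one-box {one:,.0f} vs two-box {two:,.0f} -> {winner}")
```

Any reliable predictive mechanism of the kind the quoted argument establishes pushes the accuracy well past that break-even point, which is the expected-utility route to the same one-box answer.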

‎"Thus choice…is a mechanical process compatible with determinism: choice is a process of examining assertions about what would be the case if this or that action were taken, and then selecting an action according to a preference about what would be the case. The objection The agent didn’t really make a choice, because the outcome was already predetermined is as much a non sequitur as the objection The motor didn’t really exert force, because the outcome was already predetermined…Both choice making and motor spinning are particular kinds of mechanical processes. In neither case does the predetermination of the outcome imply that the process didn’t really take place. (p. 192, original italics)"
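Drescher's account of choice as a mechanical process can be sketched directly: examine what would be the case under each candidate action, then select by preference over the predicted outcomes. The example below is a toy illustration of my own (the thermostat scenario and all names are invented), not code from the book:

```python
def choose(actions, predict_outcome, utility):
    """Mechanical 'choice' in Drescher's sense: examine assertions about
    what would be the case under each action, then select by preference."""
    return max(actions, key=lambda action: utility(predict_outcome(action)))

# Toy scenario: a thermostat-like agent picking a heater setting.
predicted_temp = {"off": 15, "low": 19, "high": 26}  # degrees C per action
target = 21

best = choose(
    actions=list(predicted_temp),
    predict_outcome=predicted_temp.get,
    utility=lambda temp: -abs(temp - target),  # prefer temps near the target
)
print(best)  # -> low
```

The process is fully deterministic, yet it is still recognizably a choice: outcomes were compared and one was selected according to a preference, which is exactly the point of the quoted passage.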

"The idea of superrationality is that two logical thinkers analyzing the same problem will think of the same correct answer. For example, if two persons are both good at arithmetic, and both have been given the same complicated sum to do, it can be predicted that both will get the same answer before the sum is known. In arithmetic, knowing that the two answers are going to be the same doesn't change the value of the sum, but in game theory, knowing that the answer will be the same might change the answer itself."

Good and Real: Demystifying Paradoxes from Physics to Ethics by Gary Drescher

"This book tries to derive ought from is. The more important steps explain why we should choose the one-box answer to Newcomb’s problem, then argue that the same reasoning should provide better support for Hofstadter’s idea of superrationality than has previously been demonstrated, and that superrationality can be generalized to provide morality."

"In Good and Real, Gary Drescher examines a series of provocative paradoxes about consciousness, choice, ethics, quantum mechanics, and other topics, in an effort to reconcile a purely mechanical view of the universe with key aspects of our subjective impressions of our own existence.Many scientists suspect that the universe can ultimately be described by a simple (perhaps even deterministic) formalism; all that is real unfolds mechanically according to that formalism. But how, then, is it possible for us to be conscious, or to make genuine choices? And how can there be an ethical dimension to such choices? Drescher sketches computational models of consciousness, choice, and subjunctive reasoning—what would happen if this or that were to occur?—to show how such phenomena are compatible with a mechanical, even deterministic universe. Analyses of Newcomb's Problem (a paradox about choice) and the Prisoner's Dilemma (a paradox about self-interest vs. altruism, arguably reducible to Newcomb's Problem) help bring the problems and proposed solutions into focus. Regarding quantum mechanics, Drescher builds on Everett's relative-state formulation—but presenting a simplified formalism, accessible to laypersons—to argue that, contrary to some popular impressions, quantum mechanics is compatible with an objective, deterministic physical reality, and that there is no special connection between quantum phenomena and consciousness.In each of several disparate but intertwined topics ranging from physics to ethics, Drescher argues that a missing technical linchpin can make the quest for objectivity seem impossible, until the elusive technical fix is at hand."


We must believe in free will. We have no choice. -- Isaac B. Singer

"Conway and Kochen do not prove that free will does exist. The definition of "free will" used in the proof of this theorem is simply that an outcome is "not determined" by prior conditions, and some philosophers strongly dispute the equivalence of "not determined" with free will."

‎"Logical axioms are usually statements that are taken to be universally true (e.g., A and B implies A), while non-logical axioms (e.g., a + b = b + a) are actually defining properties for the domain of a specific mathematical theory (such as arithmetic).

When used in the latter sense, "axiom," "postulate", and "assumption" may be used interchangeably. In general, a non-logical axiom is not a self-evident truth, but rather a formal logical expression used in deduction to build a mathematical theory.
Outside logic and mathematics, the term "axiom" is used loosely for any established principle of some field."

"As Hintikka says, "Gödel's incompleteness result does not touch directly on the most important sense of completeness and incompleteness, namely, descriptive completeness and incompleteness," the sense in which an axiom systems describes a given field. In particular, the result "casts absolutely no shadow on the notion of truth. All that it says is that the whole set of arithmetical truths cannot be listed, one by one, by a Turing machine." Equivalently, there is no algorithm which can decide the truth of all arithmetical propositions. And that is all."

"The Conway-Kochen proof of the Freewill Theorem relies on three axioms they call SPIN, TWIN and FIN:

SPIN: Particles have the 101-property. This means whenever you measure the squared spin of a spin-1 particle in any three mutually perpendicular directions, the measurements will be two 1s and a 0 in some order.

TWIN: If two particles together have a total angular momentum of 0, then if one particle has an angular momentum of s, the other must necessarily have an angular momentum of -s.

FIN: There is a finite upper bound to the speed at which information can be transmitted.

Conway expanded on each of these axioms during his talk. He insisted that given his proof, if you disagreed with his conclusion, you must necessarily also disagree with one of these axioms. These are axioms and so they are stated without proof; however, the two axioms SPIN and TWIN can be experimentally tested and verified. Moreover, some of these experiments have actually been performed, and they support SPIN and TWIN.

Conway stated that although he believes FIN to be true, it is experimentally the most contentious of the three axioms: it cannot be verified experimentally. The theories of relativity state that the speed of light c is the upper bound on the speed at which information transfer occurs. FIN does not require the theory of relativity to be correct (any finite upper bound will do, not necessarily c), although relativity would be sufficient. "We do not know if some unknown method allows for instantaneous transfer of information", Conway laughed, "almost by definition."
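The SPIN and TWIN constraints themselves are simple enough to state as checks. The following is a toy encoding of mine, illustrating only what the axioms assert about measurement outcomes, not any part of the Conway-Kochen proof:

```python
from collections import Counter

def satisfies_spin(squared_spins):
    """SPIN (the 101-property): the squared spin of a spin-1 particle
    measured along three mutually perpendicular axes is two 1s and a 0,
    in some order."""
    return Counter(squared_spins) == Counter((1, 1, 0))

def satisfies_twin(s1, s2):
    """TWIN, for a pair with total angular momentum 0: the two measured
    components must be opposite (s and -s)."""
    return s1 == -s2

assert satisfies_spin((1, 0, 1)) and not satisfies_spin((1, 1, 1))
assert satisfies_twin(1, -1) and not satisfies_twin(1, 1)
print("SPIN and TWIN constraints encoded")
```

FIN, by contrast, is a claim about information transfer and has no comparable finite check, which is why Conway singled it out as the axiom that cannot be verified experimentally.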

"Composition is of three kinds.

1. Accidental composition.
2. Involuntary composition.
3. Voluntary composition.

There is no fourth kind of composition. Composition is restricted to these three categories."

"Diagram 1: 1. Indeterminacy 2. External determinacy 3a. Self-determinacy 3b. Intrinsic self-determinacy (The effectual aspect of the object or event has simply been moved inside the causal aspect, permitting the internalization of the blue arrow of determinacy and making causality endomorphic.)
Determinacy and indeterminacy…at first glance, there seems to be no middle ground. Events are either causally connected or they are not, and if they are not, then the future would seem to be utterly independent of the past. Either we use causality to connect the dots and draw a coherent picture of time, or we settle for a random scattering of independent dots without spatial or temporal pattern and thus without meaning. At the risk of understatement, the philosophical effects of this assumed dichotomy have been corrosive in the extreme. No universe that exists or evolves strictly as a function of external determinacy, randomness or an alternation of the two can offer much in the way of meaning. Where freedom and volition are irrelevant, so is much of human experience and individuality.

But there is another possibility after all: self-determinacy. Self-determinacy is like a circuitous boundary separating the poles of the above dichotomy…a reflexive and therefore closed boundary, the formation of which involves neither preexisting laws nor external structure. Thus, it is the type of causal attribution suitable for a perfectly self-contained system. Self-determinacy is a deep but subtle concept, owing largely to the fact that unlike either determinacy or randomness, it is a source of bona fide meaning. Where a system determines its own composition, properties and evolution independently of external laws or structures, it can determine its own meaning, and ensure by its self-configuration that its inhabitants are crucially implicated therein."

"Question: If God does in fact continuously create reality on a global level such that all prior structure must be relativized and reconfigured, is there any room for free-will?

Answer: Yes, but we need to understand that free will is stratified. As a matter of ontological necessity, God, being ultimately identified with UBT, has "free will" on the teleological level...i.e., has a stratified choice function with many levels of coherence, up to the global level (which can take all lower levels as parameters). Because SCSPL local processing necessarily mirrors global processing - there is no other form which it can take - secondary telors also possess free will. In the CTMU, free will equates to self-determinacy, which characterizes a closed stratified grammar with syntactic and telic-recursive levels; SCSPL telors cumulatively bind the infocognitive potential of the ontic groundstate on these levels as it is progressively "exposed" at the constant distributed rate of conspansion."
"More is different: The potential for complexity increases with cardinality; with large numbers of elements comes combinatorial variety and the potential for the sort of multilevel logical structure that typifies biological organisms and modern computers alike. This is a fundamental precept of complexity theory. Wheeler poses a question: “Will we someday understand time and space and all the other features that distinguish physics—and existence itself—as the self-generated organs of a self-synthesized information system?” ... And the CTMU describes the universe as just the sort of complex, teleologically self-variegating, self-synthesized information system prescribed by more is different, telic-recursively explicating multiplicity and diffeonesis from the unity and synesis of distributed SCSPL syntax, the (unique) CTMU counterpart of what has sometimes been called “the Implicate Order”."

"This profound concept is the transactional choice principle. The same non-linearity that makes mass become infinite at the speed of light in special relativity causes particles to have two solutions, one travelling in each direction in space-time - forwards and backwards with opposite energies. All the forces are mediated by virtual particles which appear transiently out of the 'fabric' of quantum uncertainty and thus must have both an emitter (creator) and an absorber (annihilator). The transactional principle asserts that all particles, including the real ones which make up radiation and matter, are similarly linked and that a space-time handshaking occurs between emitter and absorber. The essential difference between real and virtual particles is that every possible virtual particle coexists, but only certain real outcomes occur in our experience. In this case, the boundary conditions say only one real interaction out of all the many possibilities can occur. The Feynman diagram and transaction are illustrated in fig 6 centre.

The transactional principle asserts that this choice is made through space-time handshaking and that the apparent 'randomness' of quantum uncertainty may mask a very complex web of such handshaking interactions across space-time, in which quantum 'information' is exchanged, both from future to past and from past to future. This principle neatly explains all the known mysteries of quantum non-locality. It may also unearth free will.

Far from being an accidental irrelevancy in the universe at large, biomolecules are the final interaction in the cosmogenic wave-particle hierarchy, beginning with cosmic symmetry-breaking. The most energetic of these forces interact first to form composites like the proton and then in stars to form the atomic nuclei. These then combine chemically at lower energies (because the electromagnetic force is weaker) to form the next fractal hierarchy of interaction, atoms and molecules, and finally at lower energies still to form the global weak bonding associations of large biomolecular assemblies."

"In contrast to that, Baas pointed out that more naturally the above situations are thought of from the beginning in terms of hierarchies of what he calls bonds, where, quite generally, a bond is an object equipped with information of how a collection of sub-bonds sits inside it, bound by the bond."

"CTMU >> CAMU in Camo

Before we explore the conspansive SCSPL model in more detail, it is worthwhile to note that the CTMU can be regarded as a generalization of the major computation-theoretic current in physics, the CAMU. Originally called the Computation-Theoretic Model of the Universe, the CTMU was initially defined on a hierarchical nesting of universal computers, the Nested Simulation Tableau or NeST, which tentatively described spacetime as stratified virtual reality in order to resolve a decision-theoretic paradox put forth by Los Alamos physicist William Newcomb (see Noesis 44, etc.). Newcomb's paradox is essentially a paradox of reverse causality with strong implications for the existence of free will, and thus has deep ramifications regarding the nature of time in self-configuring or self-creating systems of the kind that MAP shows reality must be. Concisely, it permits reality to freely create itself from within by using its own structure, without benefit of any outside agency residing in any external domain. Although the CTMU subjects NeST to metalogical constraints not discussed in connection with Newcomb's Paradox, NeST-style computational stratification is essential to the structure of conspansive spacetime. The CTMU thus absorbs the greatest strengths of the CAMU - those attending quantized distributed computation - without absorbing its a priori constraints on scale or sacrificing the invaluable legacy of Relativity. That is, because the extended CTMU definition of spacetime incorporates a self-referential, self-distributed, self-scaling universal automaton, the tensors of GR and its many-dimensional offshoots can exist within its computational matrix. An important detail must be noted regarding the distinction between the CAMU and CTMU. By its nature, the CTMU replaces ordinary mechanical computation with what might better be called protocomputation. Whereas computation is a process defined with respect to a specific machine model, e.g. 
a Turing machine, protocomputation is logically "pre-mechanical". That is, before computation can occur, there must (in principle) be a physically realizable machine to host it. But in discussing the origins of the physical universe, the prior existence of a physical machine cannot be assumed. Instead, we must consider a process capable of giving rise to physical reality itself...a process capable of not only implementing a computational syntax, but of serving as its own computational syntax by self-filtration from a realm of syntactic potential. When the word "computation" appears in the CTMU, it is usually to protocomputation that reference is being made. It is at this point that the theory of languages becomes indispensable. In the theory of computation, a "language" is anything fed to and processed by a computer; thus, if we imagine that reality is in certain respects like a computer simulation, it is a language. But where no computer exists (because there is not yet a universe in which it can exist), there is no "hardware" to process the language, or for that matter the metalanguage simulating the creation of hardware and language themselves. So with respect to the origin of the universe, language and hardware must somehow emerge as one; instead of engaging in a chicken-or-egg regress involving their recursive relationship, we must consider a self-contained, dual-aspect entity functioning simultaneously as both. By definition, this entity is a Self-Configuring Self-Processing Language or SCSPL. Whereas ordinary computation involves a language, protocomputation involves SCSPL. Protocomputation has a projective character consistent with the SCSPL paradigm. 
Just as all possible formations in a language - the set of all possible strings - can be generated from a single distributed syntax, and all grammatical transformations of a given string can be generated from a single copy thereof, all predicates involving a common syntactic component are generated from the integral component itself. Rather than saying that the common component is distributed over many values of some differential predicate - e.g., that some distributed feature of programming is distributed over many processors - we can say (to some extent equivalently) that many values of the differential predicate - e.g. spatial location - are internally or endomorphically projected within the common component, with respect to which they are "in superposition". After all, difference or multiplicity is a logical relation, and logical relations possess logical coherence or unity; where the relation has logical priority over the reland, unity has priority over multiplicity. So instead of putting multiplicity before unity and pluralism ahead of monism, CTMU protocomputation, under the mandate of a third CTMU principle called Multiplex Unity or MU, puts the horse sensibly ahead of the cart. To return to one of the central themes of this article, SCSPL and protocomputation are metaphysical concepts. Physics is unnecessary to explain them, but they are necessary to explain physics. So again, what we are describing here is a metaphysical extension of the language of physics. Without such an extension linking the physical universe to the ontological substrate from which it springs - explaining what physical reality is, where it came from, and how and why it exists - the explanatory regress of physical science would ultimately lead to the inexplicable and thus to the meaningless.
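The computation-theoretic sense of "language" used in the passage above - a set of strings, every member of which is generated by one finite syntax - can be sketched concretely. The grammar below (the language of balanced parentheses) is an invented toy example for illustration only, not part of SCSPL:

```python
# One finite syntax (a membership rule for balanced parentheses)
# determines the entire, unbounded set of strings in the language.

from itertools import product


def balanced(s: str) -> bool:
    """Membership test: is s a balanced string over '(' and ')'?"""
    depth = 0
    for ch in s:
        depth += 1 if ch == "(" else -1
        if depth < 0:          # a ')' with no matching '(' so far
            return False
    return depth == 0          # every '(' must be closed


def language_up_to(n: int) -> list[str]:
    """Enumerate all members of the language with length <= n."""
    members = [""]             # the empty string is balanced
    for length in range(2, n + 1, 2):
        members += ["".join(p) for p in product("()", repeat=length)
                    if balanced("".join(p))]
    return members


print(language_up_to(4))  # → ['', '()', '(())', '()()']
```

The point of the analogy is that the generator (`balanced`) is a single distributed rule, while the many strings it admits are its "formations"; multiplicity is derived from a prior unity, not the other way around.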

Spacetime Requantization and the Cosmological Constant The CTMU, and to a lesser extent GR itself, posits certain limitations on exterior measurement. GR utilizes (so-called) intrinsic spacetime curvature in order to avoid the necessity of explaining an external metaphysical domain from which spacetime can be measured, while MAP simply states, in a more sophisticated way consistent with infocognitive spacetime structure as prescribed by M=R and MU, that this is a matter of logical necessity (see Noesis/ECE 139, pp. 3-10). Concisely, if there were such an exterior domain, then it would be an autologous extrapolation of the Human Cognitive Syntax (HCS) that should properly be included in the spacetime to be measured. [As previously explained, the HCS, a synopsis of the most general theoretical language available to the human mind (cognition), is a supertautological formulation of reality as recognized by the HCS. Where CTMU spacetime consists of HCS infocognition distributed over itself in a way isomorphic to NeST - i.e., of a stratified NeST computer whose levels have infocognitive HCS structure - the HCS spans the laws of mind and nature. If something cannot be mapped to HCS categories by acts of cognition, perception or reference, then it is HCS-unrecognizable and excluded from HCS reality due to nonhomomorphism; conversely, if it can be mapped to the HCS in a physically-relevant way, then it is real and must be explained by reality theory.] Accordingly, the universe as a whole must be treated as a static domain whose self and contents cannot "expand", but only seem to expand because they are undergoing internal rescaling as a function of SCSPL grammar. The universe is not actually expanding in any absolute, externally measurable sense; rather, its contents are shrinking relative to it, and to maintain local geometric and dynamical consistency, it appears to expand relative to them. 
Already introduced as conspansion (contraction qua expansion), this process reduces physical change to a form of "grammatical substitution" in which the geometrodynamic state of a spatial relation is differentially expressed within an ambient cognitive image of its previous state. By running this scenario backwards and regressing through time, we eventually arrive at the source of geometrodynamic and quantum-theoretic reality: a primeval conspansive domain consisting of pure physical potential embodied in the self-distributed "infocognitive syntax" of the physical universe...i.e., the laws of physics, which in turn reside in the more general HCS. Conspansion consists of two complementary processes, requantization and inner expansion. Requantization downsizes the content of Planck's constant by applying a quantized scaling factor to successive layers of space corresponding to levels of distributed parallel computation. This inverse scaling factor 1/R is just the reciprocal of the cosmological scaling factor R, the ratio of the current apparent size dn(U) of the expanding universe to its original (Higgs condensation) size d0(U)=1. Meanwhile, inner expansion outwardly distributes the images of past events at the speed of light within progressively-requantized layers. As layers are rescaled, the rate of inner expansion, and the speed and wavelength of light, change with respect to d0(U) so that relationships among basic physical processes do not change...i.e., so as to effect nomological covariance. The thrust is to relativize space and time measurements so that spatial relations have different diameters and rates of diametric change from different spacetime vantages."
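As a purely arithmetical sketch of the scaling relation just quoted - R defined as the ratio dn(U)/d0(U) with d0(U) = 1, and the requantization factor as its reciprocal 1/R - the following Python shows the two factors are exact inverses. The sample values of dn(U) are invented for illustration, not physical data:

```python
# Conspansive scaling per the quoted passage: R = dn(U) / d0(U),
# with d0(U) = 1, and the requantization factor is 1/R. The two
# factors multiply to exactly 1 at every scale.

def scaling_factor(d_n: float, d_0: float = 1.0) -> float:
    """Cosmological scaling factor R = dn(U) / d0(U)."""
    return d_n / d_0


def requantization_factor(d_n: float, d_0: float = 1.0) -> float:
    """Inverse scaling factor 1/R applied in requantization."""
    return d_0 / d_n


for d_n in (1.0, 2.0, 10.0):   # illustrative apparent sizes of U
    R = scaling_factor(d_n)
    print(f"dn(U)={d_n}: R={R}, 1/R={requantization_factor(d_n)}")
```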

Physics meets philosophy at the Planck scale:

contemporary theories in quantum gravity

"The greatest challenge in fundamental physics attempts to reconcile quantum mechanics and general relativity in a theory of "quantum gravity." The project suggests a profound revision of the notions of space, time and matter. It has become a key topic of debate and collaboration between physicists and philosophers. This volume collects classic and original contributions from leading experts in both fields for a provocative discussion of the issues. It contains accessible introductions to the main and less-well-known known approaches to quantum gravity. It includes exciting topics such as the fate of spacetime in various theories, the so-called "problem of time" in canonical quantum gravity, black hole thermodynamics, and the relationship between the interpretation of quantum theory and quantum gravity. This book will be essential reading for anyone interested in the profound implications of trying to marry the two most important theories in physics."

On space and time

"What is the true nature of space and time? These concepts are at the heart of science, but they remain deeply wrapped in mystery. Both house their structure at the smallest pre-subatomic and the largest cosmological levels continues to defy modern physics and may require revolutionary new ideas for which science is still grasping. This unique volume brings together world leaders in cosmology, particle physics, quantum gravity, mathematics, philosophy and theology, to provide fresh insights into the deep structure of space and time. Andrew Taylor, Shahn Majid, Roger Penrose, Alain Connes, Michael Heller, and John Polkinghorne all experts in their respective fields, explain their theories in this outstanding compiled text."
