Monday, September 27, 2010

Situations, Formal Concept Analysis, Mereotopological Measurement, Ontologic Refinement, Sheaf Semantics, Syntactic Coverings, Biextensional Collapse

Image: http://www.ctmu.net/

"In situation analysis (SA), an agent observing a scene receives information...Considering the logical connection between belief and knowledge, the challenge for the designer is to transform the raw, imprecise, conflictual and often paradoxical information received from the different sources into statements understandable by both man and machines."
http://www.fusion2004.foi.se/papers/IF04-0400.pdf


Decision support through constraint propagation in collaborative distributed command and control

"In this paper we develop a conceptual model of the interdependences among plans that can be expected to emerge in a collaborative, distributed command and control center. The foundations of the model are the problem space representation of problem solving and analyses of the nature of constraints and their propagation and of the task of planning. The model has informed the development of a series of empirical studies of the propagation of constraints in a simulated command and control center. The CBFire microworld is the test bed for the studies. Analysis of the behavioral data captured by C3Fire would serve to inform the design of an intelligent interface for decision support in command and control that highlights constraints on action and facilitates human decision making"
http://tinyurl.com/2bkdapg
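The "propagation of constraints" at the heart of this model can be made concrete with a standard arc-consistency pass over a small constraint network. The sketch below is illustrative only: it is ordinary AC-3, and the fire-unit variables, domains and ordering constraint are invented for the example, not taken from the C3Fire studies.

from collections import deque

def revise(domains, constraint, x, y):
    """Remove values of x that have no support in y under the binary constraint."""
    removed = False
    for vx in set(domains[x]):
        if not any(constraint(vx, vy) for vy in domains[y]):
            domains[x].discard(vx)
            removed = True
    return removed

def ac3(domains, arcs):
    """Propagate binary constraints until every arc is consistent (AC-3)."""
    queue = deque(arcs)
    incoming = {}
    for (x, y, c) in arcs:
        incoming.setdefault(y, []).append((x, y, c))   # arcs to re-check when y shrinks
    while queue:
        x, y, c = queue.popleft()
        if revise(domains, c, x, y):
            if not domains[x]:
                return False               # a plan variable has no feasible value left
            queue.extend(incoming.get(x, []))
    return True

# Hypothetical planning example: two fire-fighting units must be assigned to
# sectors 1-3, with unit A required to sit in a lower-numbered sector than unit B.
domains = {"A": {1, 2, 3}, "B": {1, 2, 3}}
ab = lambda a, b: a < b
ba = lambda b, a: b > a
print(ac3(domains, [("A", "B", ab), ("B", "A", ba)]), domains)
# After propagation: A can no longer be 3 and B can no longer be 1.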

"Research on dynamic systems was motivated partly because traditional IQ tests turned out to be weak predictors in non-academic environments (see Rigas & Brehmer, 1999, p. 45). According to their proponents, computer-simulated “microworlds ” seem to possess what is called “ecological validity”. Simulations of (simplified) industrial production (e.g., Moray, Lootsteen, & Pajak, 1986), medical systems (e.g., Gardner & Berry, 1995), or political processes (e.g., Dörner, 1987) have the appeal of bringing “real-world tasks” to the laboratory. Brehmer and Dörner (1993) argue that these scenarios escape both the narrow straits of the laboratory and the deep blue sea of the field study, because scenarios allow for a high degree of fidelity with respect to reality and at the same time allow systematic control of influential factors.
...
Subjects acting in these scenarios did indeed face a lot more tasks than in the IQ tests: (a) the complexity of the situation and (b) the connectivity between a huge number of variables forced the actors to reduce a large amount of information and anticipate side effects; (c) the dynamic nature of the problem situation required the prediction of future developments (a kind of planning) as well as long-term control of decision effects; (d) the intransparency (opaqueness) of the scenarios required the systematic collection of information; (e) the existence of multiple goals (polytely) required the careful elaboration of priorities and a balance between contradicting, conflicting goals."
http://archiv.ub.uni-heidelberg.de/volltextserver/volltexte/2008/8234/pdf/Funke_2001_TR.pdf

Complex problem solving: the European perspective
http://books.google.com/books?id=36k6JR2_8nkC&lr=&source=gbs_navlinks_s

Komplexes Problemlösen vor dem Hintergrund der Theorie finiter Automaten (Complex Problem Solving Against the Background of the Theory of Finite Automata):

"The theory of finite state automata is presented as a useful tool for problem solving research. For investigations of how people interact with discrete dynamic systems the approach suggests hypotheses about system knowledge acquisition and representation, it serves to deduce knowledge measures, and it enables researchers to describe and to construct entire classes of discrete dynamic task environments. A number of experimental studies are described in order to illustrate the approach."
http://www.psycontent.com/content/26p537x67836411x/
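A minimal sketch of the finite-state-automaton view of a discrete dynamic task environment: the participant chooses inputs, the system steps through states according to a fixed transition table, and "system knowledge" amounts to recovering that table from observed behaviour. The two-state machine below is made up for illustration and is not one of the systems used in the cited studies.

# A Mealy-style finite automaton: next state and output depend on (state, input).
TRANSITIONS = {                      # (state, input) -> next state
    ("idle", "start"): "running",
    ("idle", "stop"): "idle",
    ("running", "stop"): "idle",
    ("running", "start"): "running",
}
OUTPUTS = {                          # (state, input) -> observable output
    ("idle", "start"): "motor on",
    ("idle", "stop"): "nothing",
    ("running", "stop"): "motor off",
    ("running", "start"): "nothing",
}

def run(state, inputs):
    """Drive the automaton with a sequence of interventions, as a participant would."""
    for i in inputs:
        nxt = TRANSITIONS[(state, i)]
        print(f"{state:8s} --{i}--> {nxt:8s} ({OUTPUTS[(state, i)]})")
        state = nxt
    return state

run("idle", ["start", "start", "stop"])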

Goal specificity effects on hypothesis testing in problem solving:

"Previous research has found that having a nonspecific goal (NSG) leads to better problem solving and transfer than having a specific goal (SG). To distinguish between the various explanations of this effect requires direct evidence showing how a NSG affects a participant's behaviour. Therefore we collected verbal protocols from participants learning to control a linear system consisting of 3 outputs by manipulating 3 inputs. This system was simpler than the one we had used previously, so in Exp. 1 we generalized our earlier goal specificity findings to this system. In Exp. 2 protocol analysis confirmed our prediction (based on dual-space theories of problem solving) that NSG participants focused on hypothesis testing whereas SG participants focused on the goal. However, this difference only emerged over time. We also replicated the goal specificity effect on performance and showed that giving participants a hypothesis to test improved performance."
http://www.informaworld.com/smpp/content~db=all~content=a713756030

Quinn: Why so many big words? Here's the elevator pitch version: "In Situational Analysis -- people see s**t and try to figure it out."

Hamid: Not just people, machines too...Canada's defense force of the future?

Barwise and Perry

Situations, unlike worlds, are not complete in the sense that every proposition or its negation holds in a world. According to Situations and Attitudes, meaning is a relation between a discourse situation, a connective situation and a described situation. The original theory of Situations and Attitudes soon ran into foundational difficulties. A reformulation based on Peter Aczel's non-well-founded set theory was proposed by Barwise before this approach to the subject petered out in the early 1990s.

Angelika Kratzer

Barwise and Perry's system was a top-down approach that foundered on practical issues identified early on by Angelika Kratzer and others. She subsequently developed a considerable body of theory bottom-up by addressing a variety of issues in the areas of context dependency in discourse and the syntax-semantics interface. Because of its practical nature and ongoing development, this body of work "with possible situations as parts of possible worlds, now has much more influence than Barwise and Perry’s ideas".
http://en.wikipedia.org/wiki/Situation_semantics

Situation Theory and Situation Semantics:

“The world consists not just of objects, or of objects, properties and relations, but of objects having properties and standing in relations to one another. And there are parts of the world, clearly recognized (although not precisely individuated) in common sense and human language. These parts of the world are called situations. Events and episodes are situations in time, scenes are visually perceived situations, changes are sequences of situations, and facts are situations enriched (or polluted) by language.
...
Constraints

Types and the type abstraction procedures provide a mechanism for capturing the fundamental process whereby a cognitive agent classifies the world. Constraints provide the situation theoretic mechanism that captures the way that agents make inferences and act in a rational fashion. Constraints are linkages between situation types. They may be natural laws, conventions, logical (i.e., analytic) rules, linguistic rules, empirical, law-like correspondences, etc. For example, humans and other agents are familiar with the constraint"
http://www.stanford.edu/~kdevlin/HHL_SituationTheory.pdf

The Development of Categorical Logic:
Topological Completeness for Higher-Order Logic:

"Using recent results in topos theory, two systems of higher-order logic are shown to be complete with respect to sheaf models over topological spaces—so-called “topological semantics”. The first is classical higher-order logic, with relational quantification of finitely high type; the second system is a predicative fragment thereof with quantification over functions between types, but not over arbitrary relations. The second theorem applies to intuitionistic as well as classical logic."

Sheaf Semantics for Concurrent Interacting Objects:

Constructive Sheaf Semantics:

"Sheaf semantics is developed within a constructive and predicative framework, Martin-Lof’s type theory. We prove strong completeness of many sorted, first order intuitionistic logic with respect to this semantics, by using sites of provably functional relations."

Classifying Toposes for First Order Theories:

Lemuel:
How refined is the measurement?

Hamid: Alternatively, how does the measurement refine?

Lemuel: I am referring to the measurement process as it has been shown in some other studies that the mathematical model is adjusted along quantum lines.

Lemuel: Hamid, how would one handle false signals? Say, a false missile launch? I am thinking along widely varying issues, sort of tricks, once you know the system.

Hamid: One approach to dealing with false signals or "Error-Correction" is a form of hierarchical redundancy which globally "sums-over" local-deviations from perfectly complementary attributive (state-syntax or topological-descriptive) mappings between objects having properties sharing mutual relations.

"Short answer:
An ontology is a specification of a conceptualization.

The word "ontology" seems to generate a lot of controversy in discussions about AI. It has a long history in philosophy, in which it refers to the subject of existence. It is also often confused with epistemology, which is about knowledge and knowing.

In the context of knowledge sharing, I use the term ontology to mean a specification of a conceptualization. That is, an ontology is a description (like a formal specification of a program) of the concepts and relationships that can exist for an agent or a community of agents. This definition is consistent with the usage of ontology as set-of-concept-definitions, but more general. And it is certainly a different sense of the word than its use in philosophy.

What is important is what an ontology is for. My colleagues and I have been designing ontologies for the purpose of enabling knowledge sharing and reuse. In that context, an ontology is a specification used for making ontological commitments. The formal definition of ontological commitment is given below. For pragmatic reasons, we choose to write an ontology as a set of definitions of formal vocabulary. Although this isn't the only way to specify a conceptualization, it has some nice properties for knowledge sharing among AI software (e.g., semantics independent of reader and context). Practically, an ontological commitment is an agreement to use a vocabulary (i.e., ask queries and make assertions) in a way that is consistent (but not complete) with respect to the theory specified by an ontology. We build agents that commit to ontologies. We design ontologies so we can share knowledge with and among these agents.
...
Ontologies as a specification mechanism

A body of formally represented knowledge is based on a conceptualization: the objects, concepts, and other entities that are assumed to exist in some area of interest and the relationships that hold among them (Genesereth & Nilsson, 1987). A conceptualization is an abstract, simplified view of the world that we wish to represent for some purpose. Every knowledge base, knowledge-based system, or knowledge-level agent is committed to some conceptualization, explicitly or implicitly.

An ontology is an explicit specification of a conceptualization. The term is borrowed from philosophy, where an Ontology is a systematic account of Existence. For AI systems, what "exists" is that which can be represented. When the knowledge of a domain is represented in a declarative formalism, the set of objects that can be represented is called the universe of discourse. This set of objects, and the describable relationships among them, are reflected in the representational vocabulary with which a knowledge-based program represents knowledge. Thus, in the context of AI, we can describe the ontology of a program by defining a set of representational terms. In such an ontology, definitions associate the names of entities in the universe of discourse (e.g., classes, relations, functions, or other objects) with human-readable text describing what the names mean, and formal axioms that constrain the interpretation and well-formed use of these terms. Formally, an ontology is the statement of a logical theory.[1]

We use common ontologies to describe ontological commitments for a set of agents so that they can communicate about a domain of discourse without necessarily operating on a globally shared theory. We say that an agent commits to an ontology if its observable actions are consistent with the definitions in the ontology. The idea of ontological commitments is based on the Knowledge-Level perspective (Newell, 1982) . The Knowledge Level is a level of description of the knowledge of an agent that is independent of the symbol-level representation used internally by the agent. Knowledge is attributed to agents by observing their actions; an agent "knows" something if it acts as if it had the information and is acting rationally to achieve its goals. The "actions" of agents---including knowledge base servers and knowledge-based systems--- can be seen through a tell and ask functional interface (Levesque, 1984) , where a client interacts with an agent by making logical assertions (tell), and posing queries (ask).

Pragmatically, a common ontology defines the vocabulary with which queries and assertions are exchanged among agents. Ontological commitments are agreements to use the shared vocabulary in a coherent and consistent manner. The agents sharing a vocabulary need not share a knowledge base; each knows things the other does not, and an agent that commits to an ontology is not required to answer all queries that can be formulated in the shared vocabulary.

In short, a commitment to a common ontology is a guarantee of consistency, but not completeness, with respect to queries and assertions using the vocabulary defined in the ontology.

Notes

[1] Ontologies are often equated with taxonomic hierarchies of classes, class definitions, and the subsumption relation, but ontologies need not be limited to these forms. Ontologies are also not limited to conservative definitions, that is, definitions in the traditional logic sense that only introduce terminology and do not add any knowledge about the world (Enderton, 1972). To specify a conceptualization one needs to state axioms that do constrain the possible interpretations for the defined terms."
http://www-ksl.stanford.edu/kst/what-is-an-ontology.html
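A toy rendering of Gruber's "specification of a conceptualization": a vocabulary of classes and relations plus axioms constraining how the vocabulary may be used, with an agent "committing" to the ontology insofar as its assertions respect those axioms. Everything here (the domain, the class names, the single typing axiom) is invented for illustration; real ontologies would be written in a language such as KIF or OWL.

# A hypothetical mini-ontology: vocabulary plus an axiom that constrains its use.
CLASSES = {"Agent", "Sensor", "Situation"}
RELATIONS = {"observes": ("Agent", "Situation"),   # declared domain and range
             "carries":  ("Agent", "Sensor")}

def well_typed(fact, types):
    """Axiom schema: every asserted relation must respect its declared domain/range."""
    rel, subj, obj = fact
    dom, rng = RELATIONS[rel]
    return types.get(subj) == dom and types.get(obj) == rng

types = {"ada": "Agent", "radar1": "Sensor", "scene42": "Situation"}
facts = [("observes", "ada", "scene42"), ("carries", "ada", "radar1")]

# The agent "commits" to the ontology if its assertions are consistent with the axioms.
print(all(well_typed(f, types) for f in facts))   # True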

"The first paper uses the channel logic of Barwise and Seligman to define ontology morphisms, which can be used to integrate ontologies. The second paper uses the local logics of channel theory and Chu spaces to formalize a duality between ontologies and their instantiantiations; this paper also cites my early work on using colimits (from category theory) for knowledge integration, and also includes some interesting examples of integrating knowledge in ontologies over different logics. The third paper discusses composition operations on ontologies languages, again using the local logics, and also discusses the akt editor for applying such operations. The fourth paper discusses a tool for evolving large distributed knowledge based on ontologies, making use of histories of transformations.

The first three papers are closely related to my work on "institutions," an abstract axiomatization of the notion of "logical system," much of it with Rod Burstall of Edinburgh (see section 1.4). Local logics and Chu spaces are both special cases of institutions, and the duality is also a special case of the syntax-semantics duality that is built into institutions. Moreover, the ontology morphisms of the first paper are a special case of theory morphisms over an institution. (For the cognoscenti, V-institutions generalize Chu spaces, and were proposed for similar applications long before Chu spaces.) Local logics do not appear to allow a sufficiently strong distinction between the object level of ontologies and the meta level of ontology languages; this distinction is much clearer with institution theory, and also, it is known how to obtain much more powerful composition operations in that framework, because composition of parameterized software modules is one of its major applications."
http://cseweb.ucsd.edu/groups/tatami/seek/

Von Schweber Living Systems
http://www.slideshare.net/Annie05/von-schweber-living-systems-presentation

"The name Synsyta stems from a term in cell biology, syncytium : Tissue in which multiple cells come together and operate as a single cell, as is the case with neurons and muscle cells. The central image to the right is a syncytium.

“It is ironic that gap junctions connect together neurons and glia, at least transiently, into a sort of reticular syncytium— Golgi’s idea overthrown by Cajal’s demonstration of discrete neurons and chemical synapses. Gap junction assemblies of transiently woven-together neurons have been termed “hyper-neurons” and suggested as a neural correlate of consciousness.” (Consciousness, the Brain, and Spacetime Geometry, Hameroff)

“As few as three gap junction connections per cortical neuron (with perhaps thousands of chemical synapses) to neighboring neurons and glia which in turn have gap junction connections elsewhere may permit spread of cytoplasmic quantum states throughout significant regions of the brain, weaving a widespread syncytium whose unified interior hosts a unified quantum state or field (Hameroff & Penrose, 1996; Woolf & Hameroff, 2001).”

A Syncytium, shown below, is a group of cells that come together and act as one cell. Neurons and muscle cells form syncytiums."
http://www.synsyta.com/

"In recent years gamma synchrony has indeed been shown to derive not from axonal spikes and axonal-dendritic synapses, but from post-synaptic activities of dendrites. Specifically, gamma synchrony/40 Hz is driven by networks of cortical in...ter-neurons connected by dendro-dendritic “electrotonic” gap junctions, windows between adjacent cells. Groups of neurons connected by gap junctions share a common membrane and fire synchronously, behaving (as Eric Kandel says) “like one giant neuron.” Gap junctions have long been recognized as prevalent and important in embryological brain development, but gradually diminish in number (and presumably importance) as the brain matures. Five years ago gap junctions were seen as irrelevant to cognition and consciousness. However more recently, relatively sparse gap junction networks in adult brain have been appreciated and shown to mediate gamma synchrony/40 Hz.1-11 Such networks are transient, coming and going like the wind (and Hebbian assemblies), as gap junctions form, open, close and reform elsewhere (regulated by intraneuronal activities). Therefore neurons (and glia) fused together by gap junctions form continually varying syncytia, or Hebbian “hyper-neurons” whose common membranes depolarize coherently and may span wide regions of cortex. (The internal cytoplasm of hyper-neurons is also continuous, prompting suggestions they may host brain-wide quantum states.) By virtue of their relation to gamma synchrony, gap junction hyper-neurons may be the long-sought neural correlate of consciousness (NCC)."
http://www.quantumconsciousness.org/EEGmeditation.htm

"In formal ontology, a branch of metaphysics, and in ontological computer science, mereotopology is a first-order theory, embodying mereological and topological concepts, of the relations among wholes, parts, parts of parts, and the boundaries between parts."
http://en.wikipedia.org/wiki/Mereotopology
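For readers who want the formal flavor: in connection-based mereotopologies (e.g., Clarke-style theories, of the kind the Wikipedia article surveys), a single primitive C(x, y), read "x is connected with y", suffices to define parthood and the other basic relations. A minimal fragment, stated here for orientation rather than as any particular author's axiom set:

$$\forall x\, C(x,x), \qquad \forall x\,\forall y\,\big(C(x,y) \rightarrow C(y,x)\big)$$
$$P(x,y) \;:\Leftrightarrow\; \forall z\,\big(C(z,x) \rightarrow C(z,y)\big) \qquad \text{(parthood)}$$
$$O(x,y) \;:\Leftrightarrow\; \exists z\,\big(P(z,x) \wedge P(z,y)\big) \qquad \text{(overlap)}$$
$$EC(x,y) \;:\Leftrightarrow\; C(x,y) \wedge \neg O(x,y) \qquad \text{(external connection: touching without overlap)}$$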

Measurement, Mereotopology, and the Nature of Nature:

‎"
To summarize: the moment we take seriously the fact that all measurement is a form of abstraction, we find ourselves compelled to question the logical possibility of measurement in the context of general relativity. Nowhere in that theory can we find the projective and mereotopological relations necessary to give meaning to measurement. On the other hand, bimetric theories become attractive both for their logical characteristics and their potential empirical consequences. Entire avenues are opened up for the possible reconciliation of macro and micro physics. If we elect to do this in the context of a Whiteheadian theory of nature, extension and abstraction, then we will additionally connect these physical theories within a mereotopological framework that is itself intimately connected to research in the area of general spatial reasoning. If we do not choose to do this in a Whiteheadian theory of nature, then we are obliged to say how the extensive deliverances of concrete experience link up with our abstractive processes of measurement to tell us what this “other” nature might be, and how this “other” nature can nevertheless be “really Real” despite the fact that our only access to it is through an abstract process of our own making. It should surprise no one to learn that I prefer the Whiteheadian solution."
"That is, the circular boundaries of the Venn circles can be construed as those of “potentialized” objects in the process of absorbing their spatiotemporal neighborhoods. Since the event potentials and object potentials coincide, potential instantiations of law can be said to reside “inside” the objects, and can thus be regarded as functions of their internal rules or “object syntaxes”.

Objects thus become syntactic operators, and events become intersections of nomological syntax in the common value of an observable state parameter, position.

The circle corresponding to the new event represents an attribute consisting of all associated nomological relationships appropriate to the nature of the interaction including conserved aggregates, and forms a pointwise (statewise) “syntactic covering” for all subsequent potentials.
...
The telic-recursive cross-refinement of syntax and content is implicit in the “seed” of Γ-grammar, the MU form, which embodies the potential for perfect complementarity of syntax and state, law and matter.

Since this potential can only be specifically realized through the infocognitive binding of telesis, and localized telic binding is freely and independently effected by localized, mutually decoherent telic operators, deviations from perfect complementarity are ubiquitous. SCSPL evolution, which can be viewed as an attempt to help this complementarity emerge from its potential status in MU, incorporates a global (syntactic) invariant that works to minimize the total deviation from perfect complementarity of syntax and state as syntactic operators freely and independently bind telesis.

This primary SCSPL invariant, the Telic Principle, takes the form of a selection function with a quantitative parameter, generalized utility, related to the deviation. The Telic Principle can be regarded as the primary component of SCSPL syntax…the spatiotemporally distributed self-selective “choice to exist” coinciding with MU.
...
SCSPL incorporates the concepts of syntactic stratification and syntactic distribution. For example, because the laws of mathematics everywhere apply with respect to the laws of physics, the former distribute over the latter in the syntactic sense. Thus, where the laws of mathematics and physics are denoted by S1=LMS and S2 respectively, S1 distributes over S2, i.e. forms a syntactic covering for S2.

Essentially, this means that the laws of mathematics are everywhere a required syntactic component of the language of physics. With S2 is associated an SCSPL “sublanguage” called LO (with a letter O subscript). LO constitutes the world of perception, the classical objective universe of sense data traditionally studied by science. LO is contained in the telic-recursive, pre-informational phase of SCSPL, LS, which encompasses the cross-refinement of LO syntax and LO content from the pre-infocognitive aspect of SCSPL. The part of SCSPL grammar confined to LO incorporates certain restrictions to which LS is not subject; e.g., the grammatical portion of LO (S2) is fixed, distributed and supposedly continuous, while that of LS can also be mutable, local and discrete…in a word, telic.

Γ grammar is the generative grammar of SCSPL = (LS⊃LO). Γ grammar is unlike an ordinary grammar in that its processors, products and productions coincide and are mutually formed by telic recursion. Syntax and state, loosely analogous to form and content (or productions and products), are mutually refined from telesis through telic recursion by infocognitive processors. Production rules include the Telic Principle, distributed elements of syntax formed in the primary phase of telic recursion, and more or less polymorphic telons formed by agent-level telors. The corresponding modes of production are global telic recursion, informational recursion by distributed syntax, and local telic recursion.

The “words” produced by Γ grammar are not strings of symbols, but LO spatial relationships among parallel processors that can read and write to each other’s states. In effect, the states of its processors are roughly analogous to the symbols and strings of an ordinary language. The processors of Γ grammar thus function not only as transducers but as symbolic placeholders for observables and values, while their external states correspond to products and their state transitions realize the productions of the grammar. In other words, the states and state transitions of the processors of Γ grammar comprise a representation of Γ grammar, rendering SCSPL a dynamic self-modeling language or “interactive self-simulation”.
...
Γ grammar generates SCSPL according to the utility of its sentient processors, including the self-utility of Γ and the utility of its LO relations to telors in A. Γ and A generate telons on the global and local level respectively; thus, they must be capable of recognizing and maximizing the selection parameter υ (in the case of human telors, for example, this requires the QPS (qualio-perceptual syntax) and ETS (emo-telic syntax) components of the HCS (Human Cognitive-Perceptual Syntax)).

As such, they are responsible for telic recursion and may be regarded as the “generators” of Γ grammar, while the set Q of elementary physical objects are freely and competitively acquired by telons and thus occupy an ontologically secondary position.

Γ grammar is conspansive. Non-global processors alternate between the generation and selective actualization of possible productions, and thus between the generative and selective (inner expansive and requantizative) phases of conspansion.

The selective phase of an operator coincides with interactive mutual-acquisition events, while the generative phase coincides with the generation and selective actualization of possible productions through hological multiplexing. In conjunction with extended spatiotemporal superposition, conspansion provides the means of local (telic and informational) recursion.
...
It is instructive to experiment with the various constructions that may be placed on LS and LO. For example, one can think of LS as “L-sim”, reflecting its self-simulative, telic-recursive aspect, and of LO as “L-out”, the output of this self-simulation. One can associate LO with observable states and distributed-deterministic state-transition syntax, and LS with the metasyntactic Telic Principle.

One can even think of LS and LO as respectively internal and (partially) external to SCSPL syntactic operators, and thus as loosely correspondent to the subjective and objective aspects of reality. Where LS and LO are associated with the coherent inner expansion and decoherent requantization phases of conspansion, so then are subjective and objective reality, simulation and output, “wave and particle”.

In other words, the subjective-objective distinction, along with complementarity, can be viewed as functions of conspansive duality.
...
Where space and time correspond to information and generalized cognition respectively, and where information and cognition are logically entwined in infocognitive SCSPL syntactic operators intersecting in states and state-transition events, space and time are entwined in a conspansive event-lattice connected by syntax and evolving through mutual absorption events among syntactic operators, symmetric instances of generalized observation influenced by telic recursion.

Thus, time is not “fed into” the explanation of existence, but is a function of conspansive, telic-recursive SCSPL grammar.
...
To see how information can be beneficially reduced when all but information is uninformative by definition, one need merely recognize that information is not a stand-alone proposition; it is never found apart from syntax. Indeed, it is only by syntactic acquisition that anything is ever “found” at all.

That which is recognizable only as syntactic content requires syntactic containment, becoming meaningful only as acquired by a syntactic operator able to sustain its relational structure; without attributive transduction, a bit of information has nothing to quantify.

This implies that information can be generalized in terms of “what it has in common with syntax”, namely the syndiffeonic relationship between information and syntax."
http://www.megafoundation.org/CTMU/Articles/Langan_CTMU_092902.pdf

Subjective and Objective:

"Recent, important developments in mathematical physics that are closely related to both mathematics and quantum physics have been considered as a strong indication of the possibility of a `fusion between mathematics and theoretical physics'; this viewpoint emerges from current results obtained at IHES in Paris, France, the international institute that has formerly served well the algebraic geometry and category theory community during Alexander Grothendieck's tenure at this institute. Brief excerpts of published reports by two established mathematicians, one from France and the other from UK, are presented next together with the 2008 announcement of the Crafoord prize in mathematics for recent results obtained at IHES and in the US in this `fusion area' between mathematics and theoretical physics (quantum theory and AQFT). Time will tell if this `fusion' trend will be followed by many more mathematicians and/or theoretical physicists elsewhere, even though a precedent already exists in the application of non-commutative geometry to SUSY extension in modern physics that was initiated by Professor A. Connes.


Pierre Cartier : ``On the Fusion of Mathematics and Theoretical Physics at IHES''

A verbatim quote from : ``The Evolution of Concepts of Space and Symmetry- A Mad Day's Work: From Grothendieck to Connes and Kontsevich*:''

``...I am in no way forgetting the facilities for work provided by the Institut des Hautes Études Scientifiques (IHES) for so many years, particularly the constantly renewed opportunities for meetings and exchanges. While there have been some difficult times, there is no point in dwelling on them. One of the great virtues of the institute was that it erected no barriers between mathematics and theoretical physics. There has always been a great deal of interpenetration of these two areas of interest, which has only increased over time. From the very beginning Louis Michel was one of the bridges due to his devotion to group theory. At present, when the scientific outlook has changed so greatly over the past forty years, the fusion seems natural and no one wonders whether Connes or Kontsevich are physicists or mathematicians. I moved between the two fields for a long time when to do so was to run counter to the current trends, and I welcome the present synthesis. Alexander Grothendieck dominated the first ten years of the institute, and I hope no one will forget that. I knew him well during the 50s and 60s, especially through Bourbaki, but we were never together at the institute, he left it in September 1970 and I arrived in July 1971. Grothendieck did not derive his inspiration from physics and its mathematical problems. Not that his mind was incapable of grasping this area; he had thought about it secretly before 1967, but the moral principles that he adhered to relegate physics to the outer darkness, especially after Hiroshima. It is surprising that some of Grothendieck's most fertile ideas regarding the nature of space and symmetries have become naturally wed to the new directions in modern physics. ''

S. Majid: On the Relationship between Mathematics and Physics:

In ref. [7], S. Majid presents the following `thesis' : ``(roughly speaking) physics polarises down the middle into two parts, one which represents the other, but that the latter equally represents the former, i.e. the two should be treated on an equal footing. The starting point is that Nature after all does not know or care what mathematics is already in textbooks. Therefore the quest for the ultimate theory may well entail, probably does entail, inventing entirely new mathematics in the process. In other words, at least at some intuitive level, a theoretical physicist also has to be a pure mathematician. Then one can phrase the question `what is the ultimate theory of physics ?' in the form `in the tableau of all mathematical concepts past present and future, is there some constrained surface or subset which is called physics ?' Is there an equation for physics itself as a subset of mathematics? I believe there is and if it were to be found it would be called the ultimate theory of physics. Moreover, I believe that it can be found and that it has a lot to do with what is different about the way a physicist looks at the world compared to a mathematician...We can then try to elevate the idea to a more general principle of representation-theoretic self-duality, that a fundamental theory of physics is incomplete unless such a role-reversal is possible. We can go further and hope to fully determine the (supposed) structure of fundamental laws of nature among all mathematical structures by this self-duality condition. Such duality considerations are certainly evident in some form in the context of quantum theory and gravity. The situation is summarised to the left in the following diagram. For example, Lie groups provide the simplest examples of Riemannian geometry, while the representations of similar Lie groups provide the quantum numbers of elementary particles in quantum theory. Thus, both quantum theory and non-Euclidean geometry are needed for a self-dual picture. Hopf algebras (quantum groups) precisely serve to unify these mutually dual structures.'' (The reader may also wish to see the original document on line.)

** Maxim Kontsevich received the Crafoord Prize in 2008: ``Maxim Kontsevich, Daniel Iagolnitzer Prize, Henri Poincaré Prize in 1997, Fields Medal in 1998, member of the Academy of Sciences in Paris, is a French mathematician of Russian origin and is a permanent professor at IHES (since 1995). He belongs to a new generation of mathematicians who have been able to integrate in their area of work aspects of quantum theory, opening up radically new perspectives. On the mathematical side, he drew on the systematic use of known algebraic structure deformations and on the introduction of new ones, such as the `triangulated categories' that turned out to be relevant in many other areas, with no obvious link, such as image processing.''

`The Crafoord Prize in astronomy and mathematics, biosciences, geosciences or polyarthritis research is awarded by the Royal Swedish Academy of Sciences annually according to a rotating scheme. The prize sum of USD 500,000 makes the Crafoord one of the world's largest scientific prizes'.

``Mathematics and astrophysics were in the limelight this year, with the joint award of the Mathematics Prize to Maxim Kontsevitch, (French mathematician), and Edward Witten, (US theoretical physicist), `for their important contributions to mathematics inspired by modern theoretical physics', and the award of the Astronomy Prize to Rashid Alievich Sunyaev (astrophysicist) `for his decisive contributions to high-energy astrophysics and cosmology'.''


Majid, Shahn (2007) Algebraic Approach to Quantum Gravity: relative realism.

"In the first of three articles, we review the philosophical foundations of an approach to quantum gravity based on a principle of representation-theoretic duality and a vaguely Kantian-Buddist perspective on the nature of physical reality which I have called `relative realism'. Central to this is a novel answer to the Plato's cave problem in which both the world outside the cave and the `set of possible shadow patterns' in the cave have equal status. We explain the notion of constructions and `co'constructions in this context and how quantum groups arise naturally as a microcosm for the unification of quantum theory and gravity. More generally, reality is `created' by choices made and forgotten that constrain our thinking much as mathematical structures have a reality created by a choice of axioms, but the possible choices are not arbitary and are themselves elements of a higher-level of reality. In this way the factual `hardness' of science is not lost while at the same time the observer is an equal partner in the process. We argue that the `ultimate laws' of physics are then no more than the rules of looking at the world in a certain self-dual way, or conversely that going to deeper theories of physics is a matter of letting go of more and more assumptions. We show how this new philosophical foundation for quantum gravity leads to a self-dual and fractal like structure that informs and motivates the concrete research reviewed in parts II,III. Our position also provides a kind of explanation of why things are quantized and why there is gravity in the first place, and possibly why there is a cosmological constant.

Keywords: quantum gravity, Plato's cave, Kant, Buddhism, physical reality, quantum logic, quantum group, monoidal category, T-duality, Fourier transform, child development

Subjects: General Issues: Laws of Nature
Specific Sciences: Physics: Cosmology
General Issues: Realism/Anti-realism
...
This ‘reversal of point of view’ is an example of observer-observed duality. In a nutshell, while Plato’s conclusion was that his cave was an allegory for a pure reality of which we can only ever see a shadow, our conclusion is exactly the opposite, that there is no fundamental difference between ‘real’ in this platonic sense and the world of shadows since one could equally well consider X as $\hat{\hat{X}}$, in other words as ‘shadows’ of what we previously thought as shadows, and the latter as ‘real’ in the platonic sense.

The key question is, is such a reversed point of view physics or is it just a mathematical curiosity? The physical world would have to have the feature that the dual structures $\hat{X}$ would also have to be identifiable as something reasonable and part of it. This could be achieved for example by first of all convincing the prisoners to take $\hat{X}$ seriously, to think about its structure, to take on the view that x was a representation of this structure. Over time they might grudgingly allow that both X and $\hat{X}$ should be considered ‘real’ and that each represents the other. They would arrive in this way at a self-dual position as to what was ‘real’, namely $X \times \hat{X}$. This is often the simplest but not the only way to reach a self-dual picture. We take this need for a self-dual overall picture as a fundamental postulate for physics, which we called [12] the principle of representation-theoretic self-duality:

(Postulate) a fundamental theory of physics is incomplete unless self-dual in the sense that such a role-reversal is possible. If a phenomenon is physically possible then so is its observer-observed reversed one.

One can also say this more dynamically: as physics improves its structures tend to become self-dual in this sense. This has in my view the same status as the second law of thermodynamics: it happens tautologically because of the way we think about things. In the case of thermodynamics it is the very concept of probability which builds in a time asymmetry (what we can predict given what we know now) and the way that we categorise states that causes entropy to increase (when we consider many outcomes ‘the same’ then that situation has a higher entropy by definition and is also more likely). In the case of the self-duality principle the reason is that in physics one again has the idea that something exists and one is representing it by experiments. But experimenters tend to think that the set $\hat{X}$ of experiments is the ‘real’ thing and that a theoretical concept is ultimately nothing but a representation of the experimental outcomes. The two points of view are forever in conflict until they agree that both exist and one represents the other.
...
3. Relative Realism

What the bicrossproduct models illustrate is the following general proposition:

there is no absolute physical reality as usually considered but rather it is we, in order to make sense of the Universe, who artificially impose a division into ‘abstract laws of nature’ (in the form of abstract structures deemed to exist) and ‘measurements’ made or experiments done to illustrate them. I believe this division is necessary but it is also arbitrary in that the true picture of physical reality should be formulated in such a way as to be independent of this division."
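The $X$ and $\hat{X}$ of the excerpt above can be read on the model of Pontryagin duality, the textbook instance of representation-theoretic role reversal: for a locally compact abelian group $X$, the dual is its group of characters, and dualizing twice returns $X$, so neither side is more "real" than the other. This is offered only as orientation; Majid's bicrossproduct and quantum-group examples generalize it to the noncommutative setting.

$$\hat{X} = \mathrm{Hom}(X, U(1)), \qquad \hat{\hat{X}} \cong X.$$

For example $\widehat{\mathbb{R}^n} \cong \mathbb{R}^n$ via the plane-wave characters $x \mapsto e^{i p \cdot x}$, and the Fourier transform interchanges functions on $X$ with functions on $\hat{X}$.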

Bicrossproduct Hopf Algebras:
http://tinyurl.com/25hqwca

The Duality of the Universe:

It is proposed that the physical universe is an instance of a mathematical structure which possesses a dual structure, and that this dual structure is the collection of all possible knowledge of the physical universe. In turn, the physical universe is then the dual space of the latter.
http://philsci-archive.pitt.edu/archive/00004039/01/Dualiverse.pdf

"A Self-Explaining Universe?

Next week I'm attending a symposium at the Royal Society on the origin of time and the universe, in honour of Michael Heller's 2008 Templeton Prize. There'll be talks by Michael himself, John Barrow, Andreas Döring, Paul Tod, and Shahn Majid. Majid proposed some years ago that the mathematical concept of self-duality can be used to provide an ultimate explanation of the universe in terms of the universe itself; a type of self-explanation. Michael Heller wrote a lovely paper on Majid's ideas, (Algebraic self-duality as the 'ultimate explanation', Foundations of Science, Vol. 9, pp. 369-385) in 2004, and I've been reviewing this ahead of next week's symposium.

Majid illustrates his idea with the notion of a self-dual bicrossproduct Hopf algebra, (although he doesn't believe that this specific mathematical structure is a candidate for a theory of everything). To understand what this is, we first need to understand what a Hopf algebra is."
http://mccabism.blogspot.com/2008_04_01_archive.html
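For reference, and with no claim to track Heller's or Majid's exposition in detail, the defining data of a Hopf algebra can be stated compactly: it is an algebra and a coalgebra whose structures are compatible, together with an antipode playing the role of an inverse.

A Hopf algebra over a field $k$ is a tuple $(H, m, \eta, \Delta, \varepsilon, S)$ in which $m : H \otimes H \to H$ (multiplication) and $\eta : k \to H$ (unit) make $H$ an algebra, $\Delta : H \to H \otimes H$ (comultiplication) and $\varepsilon : H \to k$ (counit) make $H$ a coalgebra, $\Delta$ and $\varepsilon$ are algebra homomorphisms, and the antipode $S : H \to H$ satisfies

$$m \circ (S \otimes \mathrm{id}) \circ \Delta \;=\; \eta \circ \varepsilon \;=\; m \circ (\mathrm{id} \otimes S) \circ \Delta.$$

The axioms are symmetric under exchanging $(m, \eta)$ with $(\Delta, \varepsilon)$, which is exactly what makes it meaningful to ask for self-dual examples such as Majid's bicrossproducts.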

"Rather, I think that this deepest and most long-standing of all problems in fundamental physics still needs a revolutionary new idea or two for which we are still grasping. More revolutionary even than time-reversal. Far more revolutionary and imaginative than string theory. In this post I’ll take a personal shot at an idea — a new kind of duality principle that I think might ultimately relate gravity and information.

The idea that gravity and information should be intimately related, or if you like that information is not just a semantic construct but has a physical aspect is not new. It goes back at least some decades to works of Bekenstein and Hawking showing that black holes radiate energy due to quantum effects. As a result it was found that black holes could be viewed as having a certain entropy, proportional to the surface area of the black hole. Let's add to this a `black-hole no-hair theorem’ which says that black holes, within General Relativity, have no structure other than their total mass, spin and charge (in fact, surprisingly like an elementary particle).

What this means is that when something falls into a black hole all of its finer information content is lost forever. This is actually a bit misleading because in our perception of time far from the black hole the object never actually falls in but hovers forever at the edge (the event horizon) of the black hole. But let's gloss over that technicality. Simply put, then, black holes gobble up information and turn it into raw mass, spin and charge. This in turn suggests a kind of interplay between mass or gravity, and information. These are classical gravity or quantum particle arguments, not quantum gravity, but a true theory of quantum gravity should surely explain all this much more deeply."


Reflections on a Self-Representing Universe:

"To put some flesh on this, the kind of duality I am talking about in this post is typified in physics by position-momentum or wave-particle duality. Basically, the structure of addition in flat space X is represented by waves f. Here f is expressed numerically as the momentum of the wave. But the allowed f themselves form a space X*, called ‘momentum space’. The key to the revolution of quantum mechanics was to think of X and X* as equally real, allowing Heisenberg to write down his famous Heisenberg commutation relations between position and momentum. They key was to stop thinking of waves as mere representations of a geometrical reality X but as elements in their own right of an equally real X*. The idea that physics should be position-momentum symmetric was proposed by the philosopher Max Born around the birth of quantum mechanics and is called Born Reciprocity. This in turn goes back (probably) to ideas of Ernst Mach.

What I am going to argue is that such a principle also seems to hold and is key to quantum gravity at a wider and more general level. We have seen this for quantum gravity in three spacetime dimensions last week but the principle as I have explained above is even deeper."

http://www.cambridgeblog.org/2008/12/reflections-on-a-self-representing-universe/
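The position-momentum duality invoked here is the familiar Fourier/Heisenberg pairing; a minimal statement of what "equally real X and X*" amounts to:

$$[\hat{x}, \hat{p}] = i\hbar, \qquad \psi(x) = \frac{1}{\sqrt{2\pi\hbar}} \int \tilde{\psi}(p)\, e^{i p x/\hbar}\, dp.$$

The plane waves $e^{i p x/\hbar}$ are the characters of translations on position space $X$, and the Fourier transform carries a complete description on $X$ to a complete description on momentum space $X^{*}$; Born reciprocity is the demand that fundamental laws be symmetric under the interchange $x \leftrightarrow p$.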

"Thus, conspansive duality relates two complementary views of the universe, one based on the external (relative) states of a set of objects, and one based on the internal structures and dynamics of objects considered as language processors. The former, which depicts the universe as it is usually understood in physics and cosmology, is called ERSU, short for Expanding Rubber Sheet Universe, while the latter is called USRE (ERSU spelled backwards), short for Universe as a Self-Representational Entity. Simplistically, ERSU is like a set, specifically a topological-geometric point set, while USRE is like a self-descriptive nomological language. Whereas ERSU expands relative to the invariant sizes of its contents, USRE “conspands”, holding the size of the universe invariant while allowing object sizes and time scales to shrink in mutual proportion, thus preserving general covariance." - Langan, PCID, 2002

Self-Representing Information Processing Systems:

When to Take an Observation?

• A utility maximizing agent will take an observation when the value of information is positive

• As time since last observation increases agent becomes more uncertain of what will occur

• Utility distribution becomes more spread out

• Take observation when predicted utility gain from taking observation exceeds predicted utility cost of observation

• Ask question for which answer has highest expected utility
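A minimal sketch, in made-up numbers, of the rule stated in the bullets above: compare the expected utility of acting on the current belief with the expected utility after a (hypothetical) observation, and take the observation only if the gain exceeds its cost. The two-state, two-action decision problem and its payoffs are invented for illustration, and the observation is treated as perfect for simplicity.

# Hypothetical two-state ("good"/"bad"), two-action ("act"/"wait") decision problem.
UTILITY = {("act", "good"): 10, ("act", "bad"): -20,
           ("wait", "good"): 0, ("wait", "bad"): 0}

def best_expected_utility(p_good):
    """Expected utility of the best action under belief p_good."""
    return max(
        sum(UTILITY[(a, s)] * p for s, p in [("good", p_good), ("bad", 1 - p_good)])
        for a in ("act", "wait")
    )

def value_of_information(p_good, cost):
    """Expected gain from a perfect observation, minus the cost of observing."""
    with_obs = p_good * best_expected_utility(1.0) + (1 - p_good) * best_expected_utility(0.0)
    return with_obs - best_expected_utility(p_good) - cost

# The gain is largest near the decision boundary between acting and waiting,
# i.e. where the agent is most uncertain about what it should do.
for p in (0.9, 0.7, 0.5):
    print(p, round(value_of_information(p, cost=1.0), 2))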

Communication

• Learning is faster when agents exchange information

• Communicating agents exchange messages
– Agent 1 asks a question of Agent 2
– Agent 1 takes an observation of Agent 2’s state

• Agents can learn each other’s representations
– Efficient communication is the expression of the difference between my knowledge and yours in the language of your representation
– Hypothesis: communication evolves toward greater efficiency

• Communication respects laws of physics
– Intrinsic randomness
– Local reversibility
– Prevents “freezing” at locally optimal representations

Theoretical Framework for Learnable Universe

• The framework described here does not depend on the specifics of the laws of physics in our universe

• Requirements for dynamics (Stapp’s terminology)
– Schrödinger evolution between observations
– Dirac probabilities to answer questions posed to Nature
– Heisenberg process evolves toward

» Choose to observe using value of information
» Choose what to observe using maximum expected utility

• Requirements for structure of Hamiltonian
– Infinite dimensional
– Can be approximated by sequence of finite dimensional models
– Self-similar structure

Summary

• Conscious agents construct representations

• Conscious agents learn better representations over time

• Common mathematics and algorithms for
– Simulating physical systems
– Learning complex representations

» Many parameters
» High degree of conditional independence
» High degree of self-similarity

• As physical system evolves to minimize free energy its conscious subsystems evolve to construct better representations of the system they inhabit
– Maximum physical entropy corresponds to maximum simultaneous knowledge of (UT,ET)

"Notes on Self-Representing, and Other, Information Structures:

Is The Universe Deterministic?

Some people spend many hours in their youth wondering if the universe is deterministic. Some of these people end their pondering with the thought that, if we believe the universe is deterministic, then we behave differently than if we believe it isn’t, or than if we had never thought about the question in the first place.

It is well known in computation theory that there does not exist a Turing Machine M that can predict, for every Turing Machine M′ and every input i to M′, whether M′ will eventually halt on i. Yet the reason is not that the instructions in M or M′ are vaguely defined. In keeping with one of the themes of these notes, we may say, it is as though there isn’t enough room in the space of Turing Machines for Machines such as M.

Thus, even if God does not play dice, there may be non-determinism in the universe, simply because there are not enough bits for the left hand always to represent what the right is doing."
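The undecidability claim above is the standard diagonal argument. A sketch of why no universal halting predictor can exist, written around a hypothetical function `halts` that, by assumption, cannot actually be implemented:

# Suppose, for contradiction, we had a perfect predictor. This function is hypothetical;
# the argument shows it cannot be both total and correct.
def halts(program, argument):
    """Pretend oracle: return True iff program(argument) eventually halts."""
    raise NotImplementedError("no total, correct halting predictor can exist")

def contrarian(program):
    """Do the opposite of whatever the oracle predicts the program does on itself."""
    if halts(program, program):
        while True:          # predicted to halt, so loop forever
            pass
    return "halted"          # predicted to loop, so halt immediately

# contrarian(contrarian) has no consistent answer: if halts(contrarian, contrarian)
# is True it loops forever, and if False it halts -- so no such 'halts' can exist.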


"I’ve traced out what seemed to me an interesting path. First I stumbled upon Bart Jacob’s book Introduction to Coalgebra: Towards Mathematics of States and Observations. This I’d thoroughly recommend. Let me give you some nuggets from it:

The duality with algebras forms a source of inspiration and of opposition: there is a “hate-love” relationship between algebra and coalgebra. (p. v)

As already mentioned, ultimately, stripped to its bare minimum, a programming language involves both a coalgebra and an algebra. A program is an element of the algebra that arises (as so-called initial algebra) from the programming language that is being used. Each language construct corresponds to certain dynamics, captured via a coalgebra. The program’s behaviour is thus described by a coalgebra acting on the state space of the computer. (p. v)"

http://golem.ph.utexas.edu/category/2008/11/coalgebraically_thinking.html
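A small illustration of Jacobs' "state space plus dynamics" picture: a stream coalgebra is just a state set S with a map S → A × S (observe one output, move to a new state), and the behaviour of a state is the stream obtained by unfolding that map. The example below is generic and not taken from the book.

from itertools import islice

def unfold(step, state):
    """Behaviour of a state under a coalgebra step : S -> (A, S): the observed stream."""
    while True:
        output, state = step(state)
        yield output

# A concrete coalgebra on natural-number states: observe the parity, then increment.
step = lambda n: ("even" if n % 2 == 0 else "odd", n + 1)

print(list(islice(unfold(step, 0), 5)))   # ['even', 'odd', 'even', 'odd', 'even']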

"Any set that can be constructed by adding elements to the space between two brackets can be defined by restriction on the set of all possible sets. Restriction involves the Venn-like superposition of constraints that are subtractive in nature; thus, it is like a subtractive color process involving the stacking of filters. Elements, on the other hand, are additive, and the process of constructing sets is thus additive; it is like an additive color process involving the illumination of the color elements of pixels in a color monitor. CF duality simply asserts the general equivalence of these two kinds of process with respect to logico-geometric reality.

CF duality captures the temporal ramifications of TD duality, relating geometric operations on point sets to logical operations on predicates. Essentially, CF duality says that any geometric state or continuous transformation is equivalent to an operation involving the mutual “filtration” of intersecting hological state-potentials. States and objects, instead of being constructed from the object level upward, can be regarded as filtrative refinements of general, internally unspecified higher-order relations.

CF duality is necessary to show how a universe can be “zero-sum”; without it, there is no way to refine the objective requisites of constructive processes “from nothingness”. In CTMU cosmogony, “nothingness” is informationally defined as zero constraint or pure freedom (unbound telesis or UBT), and the apparent construction of the universe is explained as a self-restriction of this potential. In a realm of unbound ontological potential, defining a constraint is not as simple as merely writing it down; because constraints act restrictively on content, constraint and content must be defined simultaneously in a unified syntax-state relationship." - Langan, 2002, PCID, pg. 26-27
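Stripped of the cosmological gloss, the set-theoretic contrast Langan draws is the familiar one between building a set by enumerating its elements and carving it out of a larger universe by superposing restrictions (filters). A plain illustration, with a made-up universe and constraints:

# Additive construction: put the elements in one by one.
built = set()
for x in (2, 4, 6, 8):
    built.add(x)

# Subtractive construction: start from a universe and stack restrictions (filters).
universe = set(range(10))
even = {x for x in universe if x % 2 == 0}
positive = {x for x in universe if x > 0}
filtered = universe & even & positive      # superposed constraints

print(built == filtered)                   # True: the two processes pick out the same set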


"By way of introduction, let us review certain fundamentals of computational linguistics. A grammar consists of a set of rules according to which a body of text, e.g. a paragraph from a book for children, can be derived from a set of compact syntactic structures. As every student learns, the sentence “Suzy loves Jamie” can be derived from the compact syntactic structure (subject, verb, direct object), which can in turn be derived from the unary syntactic structure S (standing for Sentence or Start). Broken down into their smallest independent units, these rules of derivation are called production rules. Productions may be concatenated in the familiar parse trees encountered in grammar school, with some of them contingent on others. When reversed, they are called reductions.

The process of grammatically producing text from syntax also works backwards; given a body of meaningful text, one can attempt to reduce it to syntax. A parser is an algorithm or computer program that decides whether a body of text can be derived from a specific grammar and lexicon assigning words to parts of speech. If so, then the text is classified as a sentence with respect to the grammar and lexicon, and the parser produces a representation of one or more ways in which it can be derived by substitution ... for example, a set of parse trees.

Where the proper syntax is not initially known— e.g. where one is dealing with a body of text from some strange language with an unknown grammar—one may also consider ways of deducing the grammar from the text. In this case, however, one has no rules of production, and thus cannot simply reverse them to obtain rules of reduction.

Instead, one needs a generalized metagrammar consisting of rules by which the appropriate rules of grammar, and their accompanying syntactic structures, can be derived ad hoc from the input text. Whereas a grammar derives a language from syntax, a metagrammar derives a grammar from text. Essentially, the metagrammar examines part of the text, guesses about the grammatical rule(s) that produced it, and then modifies its guesses on the basis of what it subsequently encounters in the text. Where these guesses are based on experience—for example, on exposure to a sample of the input language on which the metagrammar has initially been “tutored”—they are subject to errors based on the limited nature of the sample.

Therefore, the metagrammar must refine its guesses as errors arise and its experience grows. Because its guesses adapt to what it finds in the text, it can be described as an adaptive grammar. The words, phrases, sentences and punctuation symbols in a meaningful sentence combine to form syntactic relations or predicates. Some predicates involve more or less general or specific relations among words; others involve relations among phrases and their boundaries or phrase markers. These relations, which mirror those parts or aspects of the mental or objective world described by the sentence, combine in turn to form a network of references among various components of the sentence. Our learned and innate understanding of this syntactic network is what enables us to comprehend the text, and on that basis, to draw appropriate inferences and display appropriate behaviors.
...
On being fed a strange body of text as input, such a metagrammatically-programmed computer attempts to derive an adaptive grammar with respect to which the text can be parsed en route to extracting its meaning.

This grammar is adaptively derived as pieces of input are examined and tentatively assigned, or bound, to provisional syntactic classes according to whether or not the pieces possess certain predicates, with predicates and classes subject to adaptation as the input is progressively scanned.
...
Works such as A Clockwork Orange play on this human ability to work at discerning meaning by preceding, local, and following context, becoming manageable to understand with effort in what we have called linguistic time-space (q.v. [Jackson 2000b]).

Moreover, with only a bit of thought experiment, taken outside of natural language parsing and into pure logic, we believe that this formalism may have utility in traversing self-negating logic constructs as presented by [Langan]."
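
Purely to pin down the quoted terminology (productions, reductions, parse trees), here is a minimal Python sketch under my own naming; nothing in it comes from the quoted paper. It treats the "Suzy loves Jamie" example: the grammar derives the sentence from S, and run in reverse as a naive parser it recovers a parse tree.

# Minimal sketch (my example, not from the quoted paper): a toy grammar,
# lexicon and parser for "Suzy loves Jamie", illustrating productions,
# reductions and a parse tree.

GRAMMAR = {                    # production rules, e.g. S -> Subject Verb Object
    "S": [["Subject", "Verb", "Object"]],
    "Subject": [["NounPhrase"]],
    "Object": [["NounPhrase"]],
}
LEXICON = {                    # assigns words to parts of speech
    "Suzy": "NounPhrase",
    "loves": "Verb",
    "Jamie": "NounPhrase",
}

def parse(symbol, words):
    """Return a parse tree if `words` reduces to `symbol`, else None."""
    if len(words) == 1 and LEXICON.get(words[0]) == symbol:
        return (symbol, words[0])                      # lexical reduction
    for production in GRAMMAR.get(symbol, []):
        if len(production) == len(words):              # naive: one word per symbol
            children = [parse(s, [w]) for s, w in zip(production, words)]
            if all(children):
                return (symbol, children)              # grammatical reduction
    return None

print(parse("S", "Suzy loves Jamie".split()))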

On Einstein's Razor: Telesis-Driven Introduction of Complexity into Apparently Sufficiently Non-Complex Linguistic Systems

"The notion that a linguistic system that is powerful enough to accept any acceptable language but insufficiently complex to meet specific goals or needs is explored. I nominate Chomsky’s generative grammar formalism as the least complex formalism required to describe all language, but show how without the addition of further complexity, little can be said about the formalism itself. I then demonstrate how the O(n) parsing of pseudoknots, a previously difficult to solve problem, becomes tractable by the more complex §-Calculus, and finally close with a falsifiable
hypothesis with implications in epistemological complexity.

Keywords: parsing, scientific inquiry, telesis, necessary complexity"
...
Ultimately, in an attempt to avoid this axiomatic approach, we decide as scientific inquirers to allow the introduction of more complexity, either by introduction into the formalism, or by encasing the original system in the safe cocoon of a meta-model. Meta-models cannot be used to defer complexity; that would be akin to formalized prestidigitation. We need to add complexity in order to meet our goal of avoiding unnecessary tautology. This need is goal driven, and the goal in this case is simply stated: avoid tautology unless absolutely required to resort to it."

"Graph transformation, or Graph rewriting, concerns the technique of creating a new graph out of an original graph using some automatic machine. It has numerous applications, ranging from software verification to layout algorithms.

Graph transformations can be used as a computation abstraction. The basic idea is that the state of a computation can be represented as a graph, further steps in that computation can then be represented as transformation rules on that graph. Such rules consist of an original graph, which is to be matched to a subgraph in the complete state, and a replacing graph, which will replace the matched subgraph.

Formally, a graph rewriting system consists of a set of graph rewrite rules of the form L → R, with L being called the pattern graph (or left-hand side) and R being called the replacement graph (or right-hand side of the rule). A graph rewrite rule is applied to the host graph by searching for an occurrence of the pattern graph (pattern matching, thus solving the subgraph isomorphism problem) and by replacing the found occurrence by an instance of the replacement graph.

Sometimes graph grammar is used as a synonym for graph rewriting system, especially in the context of formal languages; the different wording is used to emphasize the goal of enumerating all graphs from some starting graph, i.e. describing a graph language - instead of transforming a given state (host graph) into a new state."
http://en.wikipedia.org/wiki/Graph_rewriting
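
The mechanics described above fit in a few lines. The sketch below is illustrative only: the rule, the labels and the edge-set encoding are my own, and the pattern is a single labelled edge rather than a general subgraph, so no real subgraph-isomorphism search is needed. It replaces every edge labelled "step" by a two-edge path through a fresh node, copying the rest of the host graph unchanged.

# Minimal sketch (illustrative only): a host graph as a set of labelled edges,
# and one rewrite rule L = (a -step-> b)  ==>  R = (a -half-> m -half-> b).

from itertools import count

fresh = count()

def rewrite_step_edges(edges):
    """Apply the rule to every occurrence of the pattern graph."""
    out = set()
    for (src, label, dst) in edges:
        if label == "step":                      # match an occurrence of L
            mid = "m{}".format(next(fresh))      # fresh node introduced by R
            out.add((src, "half", mid))          # splice in the replacement graph R
            out.add((mid, "half", dst))
        else:
            out.add((src, label, dst))           # the context is copied unchanged
    return out

host = {("a", "step", "b"), ("b", "loop", "b")}
print(sorted(rewrite_step_edges(host)))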

Grammar Formalisms Viewed as Evolving Algebras:
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.40.5187

Relational Grammar for Computational Psycholinguistics:
http://static.vodenski.com/django-sites/vodenskicom/cms_page_media/5/paper.pdf

"Garp: Graph Grammars for Concurrent Programming:

Several research projects are investigating parallel processing languages where dynamic process topologies can be constructed. Failure to impose abstractions on interprocess connection patterns can result in arbitrary interconnection topologies that are difficult to understand. We propose the use of a graph-grammar based formalism to control the complexities arising from trying to program such dynamic networks."
http://www.springerlink.com/content/y8n46m8406m27j71/

Context-free Hypergraph Grammars: Node and Hyperedge Rewriting with an Application to Petri Nets. Dissertation.
http://books.google.com/books?id=g915HdmZJe8C&output=html_text&source=gbs_navlinks_s

Attributed Context-Free Hypergraph Grammars:
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.51.2458

Normal Forms for Context-Free Node-Rewriting Hypergraph Grammars:
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.33.1315

A Computational Path to the Nilpotent Dirac Equation

What is the universal nilpotent computational rewrite system?

"It differs from traditional rewrite systems (of computational semantic language description with a fixed or finite alphabet) in that the rewrite rules allow new symbols to be added to the initial alphabet. In fact D&R start with just one symbol representing "nothing" and two fundamental rules; create a process which adds new symbols and conserve a process that examines the effect of any new symbol on those that currently exist to ensure “a zero sum” again. In this way at each step a new sub-alphabet of an infinite universal alphabet is created. However the system may also be implemented in an iterative way, so that a sequence of mathematical properties is required of the emerging sub-alphabets. D&R show that one such sequential iterative path proceeds from nothing (corresponding to the mathematical condition nilpotent) through conjugation, complexification, and dimensionalization to a stage in which no fundamentally new symbol is needed. At this point the alphabet is congruent with the nilpotent generalization of Dirac’s famous quantum mechanical equation showing that it defines the quantum mechanical “machine order code” for all further (universal) computation corresponding to the infinite universal alphabet ++ . This rewrite system with its nilpotent bootstrap methodology from “nothing/the empty set” thus defines the requirement for universal quantum computation to constitute a semantic model of computation with a universal grammar. ++ since a new symbol can stand for itself, a sub-alphabet or the infinite universal alphabet, the universal nilpotent rewrite system may thus rewrite itself, ontologically at a higher (hierarchical) level of quantum physical structure."
http://www.bcs.org/upload/pdf/nucrs.pdf
http://arxiv.org/pdf/cs.OH/0209026
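
As a loose caricature (my own construction, not the D&R system and not drawn from the cited papers), the following Python sketch shows only the bare idea of a rewrite system whose alphabet grows as it runs: a create step adds a fresh symbol together with a balancing "negative", and a conserve step checks that the zero-sum condition still holds.

# Loose illustration (not the D&R system): a rewrite system that starts from a
# single symbol for "nothing" and whose rules may add new symbols to the alphabet.

alphabet = {"0"}                 # "0" stands for nothing / zero totality

def create(alphabet):
    """Add a fresh symbol together with its formal negative."""
    s = "s{}".format(len(alphabet))
    return alphabet | {s, "-" + s}

def conserve(alphabet):
    """Check that every new symbol is balanced by its negative ("zero sum")."""
    return all(("-" + s) in alphabet for s in alphabet
               if s != "0" and not s.startswith("-"))

for _ in range(3):
    alphabet = create(alphabet)
    assert conserve(alphabet), "zero-sum condition violated"

print(sorted(alphabet))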

"Rowlands (U. of Liverpool, UK) proposes a new approach to the investigation of physics in which a zero totality is used to create a "universal rewrite system," which then allows the restructuring of mathematics without first assuming the number system or discreteness. He then derives by induction the foundational components of physics and shows them as analogous to the mathematical structure. He next seeks to demonstrate that the most convenient packaging of the of the mathematical structure is the one that provides the shortest route to zero totality, which he argues also leads to the fundamental equation that drives the whole of physics. Much of the rest of the text is spent working out the consequences of this in order to generate the results considered foundational to physics but two chapters are included that are devoted to showing that these structures are also applicable (in a fractal sense) to biological and other large-scale systems."
http://books.google.com/books?id=cOnjDfQQX0UC&pg=PA558&lpg=PA558&dq=nilpotent+rewrite+system&source=bl&ots=mw9xHVlt4O&sig=8Kc0y5XCm29KgM3BgDjMLcvoSVU&hl=en

Hypersets and Hology:

"Hology is a logical analogue of holography characterizing the most general relationship between reality and its contents. It is a form of self-similarity whereby the overall structure of the universe is everywhere distributed within it as accepting and transductive syntax, resulting in a homogeneous syntactic medium.

By the Principle of Linguistic Reducibility, reality is a language. Because it is self-contained with respect to processing as well as configuration, it is a Self-Configuring Self-Processing Language or SCSPL whose general spatiotemporal structure is hologically replicated everywhere within it as self-transductive syntax. This reduces the generative phase of reality, including physical cosmogony, to the generative grammar of SCSPL."
http://megafoundation.org/Teleologic/

Generation of fractals from incursive automata, digital diffusion and wave equation systems:

"This paper describes modelling tools for formal systems design in the fields of information and physical systems. The concept and method of incursion and hyperincursion are first applied to the fractal machine, an hyperincursive cellular automata with sequential computations with exclusive or where time plays a central role. Simulations show the generation of fractal patterns. The computation is incursive, for inclusive recursion, in the sense that an automaton is computed at future time t + 1 as a function of its neighbouring automata at the present and/or past time steps but also at future time t + 1. The hyperincursion is an incursion when several values can be generated for each time step. External incursive inputs cannot be transformed to recursion. This is really a practical example of the final cause of Aristotle. Internal incursive inputs defined at the future time can be transformed to recursive inputs by self-reference defining then a self-referential system. A particular case of self-reference with the fractal machine shows a non deterministic hyperincursive field. The concepts of incursion and hyperincursion can be related to the theory of hypersets where a set includes itself. Secondly, the incursion is applied to generate fractals with different scaling symmetries. This is used to generate the same fractal at different scales like the box counting method for computing a fractal dimension. The simulation of fractals with an initial condition given by pictures is shown to be a process similar to a hologram. Interference of the pictures with some symmetry gives rise to complex patterns. This method is also used to generate fractal interlacing. Thirdly, it is shown that fractals can also be generated from digital diffusion and wave equations, that is to say from the modulo N of their finite difference equations with integer coefficients.

Keywords: Computer Simulation, Fractals, Information Systems, Mathematics, Models, Biological, Philosophy, Physics"
http://pubget.com/paper/9231908
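
To make the notion of incursion slightly more concrete, here is a minimal Python sketch in the spirit of the quoted description (the specific rule and parameters are mine, not Dubois's equations): during a left-to-right sweep, each cell at time t+1 is the exclusive-or of a neighbour at the present time t and a neighbour that has already been computed at the future time t+1.

# Illustrative sketch only: an "incursive" XOR automaton; cells at time t+1
# depend both on a neighbour at time t and on a neighbour already computed
# at time t+1.  (Not Dubois's exact rule.)

WIDTH, STEPS = 32, 16
row = [0] * WIDTH
row[WIDTH // 2] = 1

for _ in range(STEPS):
    new = [0] * WIDTH
    for i in range(WIDTH):                  # left-to-right sequential update
        present = row[(i + 1) % WIDTH]      # neighbour at the present step t
        ahead = new[(i - 1) % WIDTH]        # neighbour already computed at t+1
        new[i] = present ^ ahead            # exclusive-or rule
    row = new
    print("".join("#" if c else "." for c in row))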

"This entry is about two kinds of circularity: object circularity, where an object is taken to be part of itself in some sense; and definition circularity, where a collection is defined in terms of itself. Instances of these two kinds of circularity are sometimes problematic, and sometimes not. We are primarily interested in object circularity in this entry, especially instances which look problematic when one tries to model them in set theory. But we shall also discuss circular definitions.

The term non-wellfounded set refers to sets which contain themselves as members, and more generally which are part of an infinite sequence of sets each term of which is an element of the preceding set. So they exhibit object circularity in a blatant way. Discussion of such sets is very old in the history of set theory, but non-wellfounded sets are ruled out of Zermelo-Fraenkel set theory (the standard theory) due to the Foundation Axiom (FA). As it happens, there are alternatives to this axiom FA. This entry is especially concerned with one of them, an axiom first formulated by Marco Forti and Furio Honsell in a 1983 paper. It is now standard to call this principle the Anti-Foundation Axiom (AFA), following its treatment in an influential book written by Peter Aczel in 1988.

The attraction of using AFA is that it gives a set of tools for modeling circular phenomena of various sorts. These tools are connected to important circular definitions, as we shall see. We shall also be concerned with situating both the mathematics and the underlying intuitions in a broader picture, one derived from work in coalgebra. Incorporating concepts and results from category theory, coalgebra leads us to concepts such as corecursion and coinduction; these are in a sense duals to the more standard notions of recursion and induction.

The topic of this entry also has connections to work in game theory (the universal Harsanyi type spaces), semantics (especially situation-theoretic accounts, or others where a “world” is allowed to be part of itself), fractals sets and other self-similar sets, the analysis of recursion, category theory, and the philosophical side of set theory."
http://plato.stanford.edu/entries/nonwellfounded-set-theory/index.html

"So the iterative method is giving us the terms of the iteration sequence beginning with [0,1]. Finally, the iteration sequence beginning with [0,1] is a shrinking sequence of sets, and then by the way limits are calculated in X, the limit is exactly the intersection of the shrinking sequence."
http://plato.stanford.edu/entries/nonwellfounded-set-theory/modeling-circularity.html
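
A small Python sketch may help fix the idea (the graphs and the naive algorithm are my own illustration, not from the Stanford Encyclopedia entry): under AFA a set is pictured by a pointed graph whose edges run from a set to its elements, and two pictures depict the same set exactly when they are bisimilar. The greatest-fixpoint test below confirms that a one-node self-loop and a two-node cycle both depict Omega = {Omega}.

# Minimal sketch: a naive greatest-fixpoint bisimulation test on small
# child-relation graphs, used here as pictures of non-wellfounded sets.

def bisimilar(g1, a, g2, b):
    """g1, g2 map each node to the list of its children (its 'elements')."""
    rel = {(x, y) for x in g1 for y in g2}          # start from the full relation
    changed = True
    while changed:
        changed = False
        for (x, y) in list(rel):
            forth = all(any((c, d) in rel for d in g2[y]) for c in g1[x])
            back = all(any((c, d) in rel for c in g1[x]) for d in g2[y])
            if not (forth and back):
                rel.discard((x, y))
                changed = True
    return (a, b) in rel

omega1 = {"o": ["o"]}                    # Omega = {Omega} as a single self-loop
omega2 = {"p": ["q"], "q": ["p"]}        # another picture of the same hyperset
print(bisimilar(omega1, "o", omega2, "p"))   # True: both depict Omega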

"The CTMU seeks to answer such existential questions as “What came before the beginning?”, “Into what meta-space is our universe expanding?”, and “Our universe seems to be impossibly improbable. How did our life-supporting continuum manage to come into being in the face of impossible odds?” The CTMU postulates that,

(A) Our Universe Only Appears to Be Expanding
Our universe, instead of expanding, is shrinking uniformly in such a way that from the inside, it appears to be expanding. (This sidesteps the question, “If our universe is expanding, into what ‘meta-space’ is it expanding?”)

(B) An Unbounded Sea of Possibilities
Our universe has evolved from a seed in an initial, “informational sea” of unbounded possibilities to its current state through a “filtration process” in which various possibilities and classes of possibilities have been eliminated. A minute subset of possibilities is eliminated every time a quantum wave function “collapses” to produce a single outcome. Before the “collapse” of the ψψ* probability function occurs, an entire envelope of possibilities exists; but once a measurement is made to determine the outcome, the ψψ* distribution collapses to a single, actual outcome. In other words, before the wave function collapses, there exists a seemingly infinite (or at least very large) number of outcomes that is consistent with that quantum-mechanical wave function. After the quantum-mechanical system is forced to “make a decision”, those seemingly infinite potentialities are eliminated. This is just like what happens to us when we have to choose among a number of possible courses of action.
Applied to the universe as a whole, this constitutes an infinitesimal reduction in the set of potential ways the universe can branch.

(C) The Telic Principle
One of the problems in cosmogony is the problem of the extreme improbability of finding a universe suitable for the evolution of intelligent life. One proposed answer is the Anthropic Principle. The Anthropic Principle assumes that an enormous number of parallel universes exist that differ minutely in the values of their various physical constants or initial conditions. Virtually all of these parallel universes are devoid of intelligent life, but we’re in one that is the incomparably rare exception. But we’re also in one of the only universes that harbors life that can ask such questions.
The CTMU posits instead a “Telic Principle”. The Telic Principle postulates that intelligence in our universe can retroactively influence quantum-mechanical outcomes at the myriad times that a wave-function collapses to a single outcome. This intelligence influences quantum-mechanical condensations in ways that are calculated to generate our universe. (Talk about “the fall of every sparrow”!) In other words, our universe, acting as its own architect, creates itself.

(D) Instant Communication of Quantum-Mechanical State Information
Once a quantum-mechanical wave function condenses to a single outcome (makes a decision), news of that outcome spreads at the speed of light. However, within that spreading sphere, quantum-mechanical wave functions can instantaneously exchange wave function information at great distances. "
http://www.iscid.org/boards/ubb-get_topic-f-6-t-000397-p-8.html

"Thus the repetitively alternative absorption and emission of force fields backs up an idea that submicroscopic deterministic quantum mechanics (Krasnoholovets, 2002a) is the origin for anticipatory processes revealed in electrodynamics and quantum mechanics by Dubois (2000a,b) who considered anticipation as an inner property of any quantum system, which should naturally be embedded in the system. Dubois constructed discrete forward and backward derivatives d ± / dt that thus represented the absorption/emission of force fields.

In summary, the lattice space is the space for wavefunction in the space for individual particles, while the lattice space is the space for gauge symmetry in the space in between the core particle and the gauge force field. The space structure is not absolute. It depends on mass-energy, coherence-decoherence of individual particles and fractality of force fields.

Krasnoholovets (2002b) studied the appearance of gravity as a contraction of the tessellation space due to the propagation of inertons around an object: The object’s inertons induce a mass field, i.e. distribute deformations of space resulting in its contraction, because by definition (Bounias and Krasnoholovets, 2003b) the notion of mass is associated with a local deformation of a tessellattice’s cell."

"This feature of contextuality concerning the attribution of properties in QM is also present in Bohm’s ontological interpretation of the theory by clearly putting forward that

“quantum properties cannot be said to belong to the observed system alone and, more generally, that such properties have no meaning apart from the total context which is relevant in any particular situation. In this case, this includes the overall experimental arrangement so that we can say that measurement is context dependent” (Bohm and Hiley 1993, p. 108).
...
Consequently, the said ‘objects’, being context-dependent, cannot be conceived of as ‘things in themselves’, as ‘absolute’ building blocks of reality (see also Pauri 2003).

Instead, they represent carriers of patterns or properties which arise in interaction with their experimental environment, or more generally, with the rest of the world; the nature of their existence depends on the context into which they are embedded and on the abstractions we are forced to make in any scientific discussion.

Hence, instead of picturing entities populating the mind-independent reality, they determine the possible manifestations of these entities within a concrete experimental context. In this sense, contextual quantum objects may be viewed as ‘constructed’, ‘phenomenal’ entities brought forward by the theory. They do not constitute, however, arbitrary ‘personal constructions’ of the human mind (as individualist constructivists consider), nor do they form products of a ‘social construction’ (as sociological constructivists assume).

By reflecting the inherent potentialities of a quantum entity with respect to a certain preselected experimental arrangement, the resulting contextual object may be thought of as a ‘construction’, as an abstraction-dependent existence, that presents nonetheless real structural aspects of the physical world (Karakostas 2004b)."
http://arxiv.org/ftp/arxiv/papers/0904/0904.2859.pdf

"But mathematics has its own problems. Whereas science suffers from the problems just described – those of indiscernability and induction, nonreplicability and subjectivity - mathematics suffers from undecidability. It therefore seems natural to ask whether there might be any other inherent weaknesses in the combined methodology of math and science. There are indeed. Known as the Lowenheim-Skolem theorem and the Duhem-Quine thesis, they are the respective stock-in-trade of disciplines called model theory and the philosophy of science (like any parent, philosophy always gets the last word). These weaknesses have to do with ambiguity…with the difficulty of telling whether a given theory applies to one thing or another, or whether one theory is “truer” than another with respect to what both theories purport to describe.

But before giving an account of Lowenheim-Skolem and Duhem-Quine, we need a brief introduction to model theory. Model theory is part of the logic of “formalized theories”, a branch of mathematics dealing rather self-referentially with the structure and interpretation of theories that have been couched in the symbolic notation of mathematical logic…that is, in the kind of mind-numbing chicken-scratches that everyone but a mathematician loves to hate. Since any worthwhile theory can be formalized, model theory is a sine qua non of meaningful theorization.

Let’s make this short and punchy. We start with propositional logic, which consists of nothing but tautological, always-true relationships among sentences represented by single variables. Then we move to predicate logic, which considers the content of these sentential variables…what the sentences actually say. In general, these sentences use symbols called quantifiers to assign attributes to variables semantically representing mathematical or real-world objects. Such assignments are called “predicates”. Next, we consider theories, which are complex predicates that break down into systems of related predicates; the universes of theories, which are the mathematical or real-world systems described by the theories; and the descriptive correspondences themselves, which are called interpretations. A model of a theory is any interpretation under which all of the theory’s statements are true. If we refer to a theory as an object language and to its referent as an object universe, the intervening model can only be described and validated in a metalanguage of the language-universe complex.

Though formulated in the mathematical and scientific realms respectively, Lowenheim-Skolem and Duhem-Quine can be thought of as opposite sides of the same model-theoretic coin. Lowenheim-Skolem says that a theory cannot in general distinguish between two different models; for example, any true theory about the numeric relationship of points on a continuous line segment can also be interpreted as a theory of the integers (counting numbers). On the other hand, Duhem-Quine says that two theories cannot in general be distinguished on the basis of any observation statement regarding the universe.

Just to get a rudimentary feel for the subject, let’s take a closer look at the Duhem-Quine Thesis. Observation statements, the raw data of science, are statements that can be proven true or false by observation or experiment. But observation is not independent of theory; an observation is always interpreted in some theoretical context. So an experiment in physics is not merely an observation, but the interpretation of an observation. This leads to the Duhem Thesis, which states that scientific observations and experiments cannot invalidate isolated hypotheses, but only whole sets of theoretical statements at once. This is because a theory T composed of various laws {Li}, i=1,2,3,… almost never entails an observation statement except in conjunction with various auxiliary hypotheses {Aj}, j=1,2,3,… . Thus, an observation statement at most disproves the complex {Li+Aj}."
http://www.megafoundation.org/CTMU/Articles/Theory.html
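
Using the notation of the quoted passage, the Duhem point can be compressed into one schema (my gloss, in LaTeX, not Langan's own formula): if the laws entail an observation only jointly with auxiliary hypotheses, then a failed observation refutes the whole conjunction rather than any single law.

\[
\Bigl(\bigwedge_i L_i \;\wedge\; \bigwedge_j A_j\Bigr) \rightarrow O,
\qquad
\neg O \;\Longrightarrow\; \neg\Bigl(\bigwedge_i L_i \;\wedge\; \bigwedge_j A_j\Bigr),
\]

so the refutation lands on the complex \(\{L_i + A_j\}\) as a whole and never isolates a particular \(L_i\).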

"On a note of forbearance, there has always been comfort in the belief that the standard hybrid empirical-mathematical methods of physics and cosmology will ultimately suffice to reveal the true heart of nature. However, there have been numerous signals that it may be time to try a new approach. With true believers undaunted by the (mathematically factual) explanatory limitations of the old methods, we must of course empathize; it is hard to question one’s prior investments when one has already invested all the faith that one has. But science and philosophy do not progress by regarding their past investments as ends in themselves; the object is always to preserve that which is valuable in the old methods while adjoining new methods that refine their meaning and extend their horizons. The new approach that we will be exploring in this paper, which might be colorfully rendered as “reality theory is wedded to language theory and they beget a synthesis”, has the advantage that it leaves the current picture of reality virtually intact. It merely creates a logical mirror image of the current picture (its conspansive dual), merges the symmetric halves of the resulting picture, and attempts to extract meaningful implications. Science as we now know it is thereby changed but little in return for what may, if fate smiles upon us, turn out to be vast gains in depth, significance and explanatory power.
...
As complexity rises and predicates become theories, tautology and truth become harder to recognize. Because universality and specificity are at odds in practice if not in principle, they are subject to a kind of “logical decoherence” associated with relational stratification. Because predicates are not always tautological, they are subject to various kinds of ambiguity; as they become increasingly specific and complex, it becomes harder to locally monitor the heritability of consistency and locally keep track of the truth property in the course of attribution (or even after the fact). Undecidability, LSAT intractability and NP-completeness, predicate ambiguity and the Lowenheim-Skolem theorem, observational ambiguity and the Duhem-Quine thesis…these are some of the problems that emerge once the truth predicate “decoheres” with respect to complex attributive mappings. It is for reasons like these that the philosophy of science has fallen back on falsificationist doctrine, giving up on the tautological basis of logic, effectively demoting truth to provisional status, and discouraging full appreciation of the tautological-syntactic level of scientific inquiry even in logic and philosophy themselves.

In fact, the validity of scientific theories and of science as a whole absolutely depends on the existence of a fundamental reality-theoretic framework spanning all of science…a fundamental syntax from which all scientific and mathematical languages, and the extended cognitive language of perception itself, can be grammatically unfolded, cross-related and validated. Tautology, the theoretical basis of truth as embodied in sentential logic, is obviously the core of this syntax. Accordingly, reality theory must be developed through amplification of this tautological syntax by adjunction of additional syntactic components, the principles of reality theory, which leave the overall character of the syntax invariant. Specifically, in order to fashion a reality theory that has the truth property in the same sense as does logic, but permits the logical evaluation of statements about space and time and law, we must adjoin principles of extension that lend meaning to such statements while preserving the tautology property.

According to the nature of sentential logic, truth is tautologically based on the integrity of cognitive and perceptual reality. Cognition and perception comprise the primitive (self-definitive) basis of logic, and logic comprises the rules of structure and inference under which perception and cognition are stable and coherent. So when we say that truth is heritable under logical rules of inference, we really mean that tautology is heritable, and that the primitive cognitive-perceptual basis of sentential logic thus maintains its primary status. By converting tautologies into other tautologies, the rules of inference of sentential logic convert cognitive-perceptual invariants into other such invariants. To pursue this agenda in reality theory, we must identify principles that describe how the looping structure of logical tautology is manifest in various reality-theoretic settings and contexts on various levels of description and interpretation; that way, we can verify its preservation under the operations of theoretic reduction and extension. I.e., we must adjoin generalized principles of loop structure to logical syntax in such a way that more and more of reality is thereby explained and comprehensiveness is achieved."
http://www.megafoundation.org/CTMU/Articles/Langan_CTMU_092902.pdf

"Now let's see if we can recap all of this.

Aristotelian metaphysics is universal, containing in principle all Ui-relevant information (Ui-potential) U*. A theory of metaphysics M is an open inferential system which, because necessarily universal, reduces to a Ui-recognizable tautology T on U* heritable in M via generalized rules of inference (where "generalized inference" is just logical substitution). As specific information equates inductively to ancestral generalisms, and U* is both unique and Ui-indiscernible from T, the identification M = T = U* is practically unconditional. Now suppose that there exist two Ui-distinguishable true metaphysical theories M and M’; i.e., two Ui-distinguishable Ui-tautologies T and T’. These can only be Ui-distinguishable by virtue of a nonempty Ui-informational disjunction: i.e., disjoint information d = (T ∪ T’) - (T ∩ T’) > ∅ recognizable in/by Ui (where the information in T or T’ equals the scope (image) of its universal quantifier, and ∅ is the null set). This information d, being the distinction between two Ui-perceptible truths, exists in Ui and thus U*. But as it is disjoint information, one member of the pair (T, T’) does not contain it. So this member does not cover U*, is not a U* tautology, and thus is not a theory of metaphysics. On the other hand, M = ∪j=1,2,... Mj, where the jointly U*-exhaustive Mj are all "true", Ui-distinct, and M-nonexcluded, does and is.

So the assumption fails, and there can be only one correct theory of metaphysics at the tautological level. This, by definition, is the CTMU. I.e., the CTMU takes this existential proof of metaphysical uniqueness and uses the implied system as the identity of a transductive algebra meeting the conditions for human cognition by its homomorphic relationship to the human cognitive syntax. So for the human cognitive equivalency-class, the universe is generalistically identical to the CTMU tautology.

Soi-disant "metaphysicians" have been debating the merits of so-called metaphysical theories for centuries, usually claiming to argue from "logical" standpoints. The only accord they have been able to reach is an "agreement to disagree". Sadly, this has left the uncloistered masses with a level of metaphysical understanding not far above that which guided them through the last Ice Age, and science without a clue as to the meaning of what it is doing. If this is not a monumental injustice to humanity, then humanity has vastly overestimated its own importance."

"So far we have only considered mental states. What about physical states? Aha, here we take the position that there is no such thing. States are states, regardless of whether the points they are states of are part of a brain, part of a computer, or part of a swinging hammer. All states are mental, and the points that coordinatize state vectors are physical.

The counterparts of point and state for physics are time and energy respectively (along with space and momentum respectively but let's keep things simple here). Points in time can reasonably be considered physical, but surely energy is physical too (it's all physics after all)."
http://chu.stanford.edu/
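
Concretely (a minimal sketch with my own naming, not code from the Chu project pages), a Chu space over {0,1} is just a matrix recording which points satisfy which states, and its dual is obtained by transposing the matrix, which is exactly the point/state symmetry the quote gestures at with the swinging hammer.

# Minimal sketch: a Chu space as a satisfaction matrix between points (rows)
# and states (columns); the dual space swaps the two roles by transposition.

class ChuSpace:
    def __init__(self, points, states, matrix):
        self.points, self.states, self.matrix = points, states, matrix

    def dual(self):
        """Swap the roles of points and states by transposing the matrix."""
        transposed = {s: {p: self.matrix[p][s] for p in self.points}
                      for s in self.states}
        return ChuSpace(self.states, self.points, transposed)

hammer = ChuSpace(
    ["t0", "t1", "t2"],                  # points: instants of the swing
    ["raised", "falling"],               # states attributed to those instants
    {"t0": {"raised": 1, "falling": 0},
     "t1": {"raised": 0, "falling": 1},
     "t2": {"raised": 0, "falling": 1}})

print(hammer.dual().matrix)              # states now index the rows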

Chu spaces as a semantic bridge between linear logic and mathematics:

"The motivating role of linear logic is as a “logic behind logic.” We propose a sibling role for it as a logic of transformational mathematics via the self-dual category of Chu spaces, a generalization of topological spaces. These create a bridge between linear logic and mathematics by soundly and fully completely interpreting linear logic while fully and concretely embedding a comprehensive range of concrete categories of mathematics. Our main goal is to treat each end of this bridge in expository detail. In addition we introduce the dialectic lambda calculus, and show that dinaturality semantics is not fully complete for the Chu interpretation of linear logic.

Keywords: Chu spaces, linear logic, universal mathematics."

"What branch of mathematics is sufficiently general to apply both to quantum physics and linguistics? You guessed it:

ANNOUNCEMENT

The Quantum and Computational Linguistics groups of the Oxford University Computing Laboratory will host a three-day workshop on the interplay between algebra and coalgebra that can be thought of as information flow, and its applications to quantum physics and linguistics.

TOPIC

The aim of the workshop is to bring people together from the fields of quantum groups, categorical quantum mechanics, logic, and linguistics, to exchange talks and ideas of a (co)algebraic nature, about the interaction between algebras (monoids) and coalgebras (comonoids) that can be thought of as “information flow”. Many such structures have been found useful across these fields, such as Frobenius algebras and bialgebras such as Hopf algebras. They have also shown up in grammatical and vector space models of natural language to, for example, encode the meaning of verbs and logical connectives."


"In logic, a logical connective (also called a logical operator) is a symbol or word used to connect two or more sentences (of either a formal or a natural language) in a grammatically valid way, such that the compound sentence produced has a truth value dependent on the respective truth values of the original sentences.


Each logical connective can be expressed as a function, called a truth function. For this reason, logical connectives are sometimes called truth-functional connectives. The most common logical connectives are binary connectives (also called dyadic connectives) which join two sentences whose truth values can be thought of as the function's operands. Also commonly, negation is considered to be a unary connective.


Logical connectives along with quantifiers are the two main types of logical constants used in formal systems such as propositional logic and predicate logic."
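
A minimal sketch of the "truth function" reading (standard material, my own code): each binary connective is a function from the truth values of its two operand sentences to the truth value of the compound sentence.

# Minimal sketch: the common binary connectives as truth functions, and the
# truth table computed from the truth values of the component sentences.

from itertools import product

CONNECTIVES = {
    "and":     lambda p, q: p and q,
    "or":      lambda p, q: p or q,
    "implies": lambda p, q: (not p) or q,
    "iff":     lambda p, q: p == q,
}

for p, q in product([True, False], repeat=2):
    row = {name: f(p, q) for name, f in CONNECTIVES.items()}
    print(p, q, row)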


"The hypersets are shown to play the truth values of stable properties of nondeterministc dynamical systems. In fact, the universe of hereditory nite hypersets, with the truth value as an atom added, is shown to be the subobject classi er of the category of simulations of nondeterministic dynamical systems."

Langan begins by recognizing the pioneering work of Physicist John Wheeler who united the world of information theory with quantum mechanics ("It from Bit").


"All things physical are information-theoretic in origin and this is a participatory universe... Observer participancy gives rise to information; and information gives rise to physics."[1]


In "How come existence", John Wheeler wrote:


"No escape is evident from four conclusions: (1) The world cannot be a giant machine, ruled by any pre-established continuum physical law. (2) There is no such thing at the microscopic level as space or time or spacetime continuum. (3) The familiar probability function or functional, and wave equation or functional wave equation, of standard quantum theory provide mere continuum idealizations and by reason of this circumstance conceal the information-theoretic source from which they derive. (4) No element in the description of physics shows itself as closer to primordial than the elementary quantum phenomenon, that is, the elementary device-intermediated act of posing a yes-no physical question and eliciting an answer or, in brief, the elementary act of observer participancy. Otherwise stated, every physical quantity, every it, derives its ultimate significance from bits, binary yes-or-no indications, a conclusion which we epitomize in the phrase, it from bit."[2]


The CTMU builds upon this work by retooling the information concept to incorporate reflexive self-processing in a reality-theoretic context so as to make it "self-transducing information": information and cognition are recursively inter-defined, and their nexus is infocognition. A quantum of infocognition is a syntactic operator, or noeon.


"As readers of Noesis will recall, this crucial redefinition begins with a mutual, recursive interdefinition of information and cognition within a "reified tautology" called a quantum transducer. The quantum transducer, being paradoxiform by direct analogy with tautologically-based inference, models the way subjectively-tautological cognitive syntaxes transduce information in time. The universality of this model allows reality to be reduced to it, and thus to (cognitive) information. "Information" is the objective aspect of the quantum transducer for itself and for all others; it is cognition-for-cognition, equating generalistically to a cognitive identity relation on that part of reality to which it corresponds (i.e., the part containing all the transducers playing active and passive roles in it)."Langan, 1992, Noesis 76


"Because cognition and generic information transduction are identical up to isomorphism – after all, cognition is just the specific form of information processing that occurs in a mind – information processing can be described as “generalized cognition”, and the coincidence of information and processor can be referred to as infocognition. Reality thus consists of a single “substance”, infocognition, with two aspects corresponding to transduction and being transduced. Describing reality as infocognition thus amounts to (infocognitive) dual aspect monism. Where infocognition equals the distributed generalized self-perception and self-cognition of reality, infocognitive monism implies a stratified form of “panpsychism” in which at least three levels of self-cognition can be distinguished with respect to scope, power and coherence: global, agentive and subordinate.

[...]

Retooling the information concept consists of three steps. First, it must be equipped with the means of its own transduction or transformative processing. Where information transduction is (cognitively) recognized as generalized cognition, this amounts to replacing it with a dual-aspect quantum of reflexivity, infocognition, which embodies telic feedback. Second, its bit structure, a simplistic and rather uninspired blend of 2-valued propositional logic and probability theory, must be extended to accommodate logic as a whole, including (1) predicate logic, (2) model theory and (3) language theory, broadly including the theories of mathematical languages, metalanguages and generative grammars. After all, since information does nothing but attribute linguistically-organized predicates to objects in the context of models, its meaning involves the mathematics of predicates, languages and models. And third, it must be generalized to an ultimate ancestral medium, telesis, from which cognitive syntax and its informational content arise by specificative feedback as part of a unified complex…a recursive coupling of information and metainformation, or transductive syntax.

[...]

The answer is hiding in the question. Laws do not stand on their own, but must be defined with respect to the objects and attributes on which they act and which they accept as parameters. Similarly, objects and attributes do not stand on their own, but must be defined with respect to the rules of structure, organization and transformation that govern them. It follows that the active medium of cross-definition possesses logical primacy over laws and arguments alike, and is thus pre-informational and pre-nomological in nature…i.e., telic. Telesis, which can be characterized as “infocognitive potential”, is the primordial active medium from which laws and their arguments and parameters emerge by mutual refinement or telic recursion." (Langan, 2002, PCID, pp. 33-35)


"The similarity between the ideas of John Wheeler and other leading scientists and some ancient philosophies and scriptures are striking. [...] And as we do, let us not forget that knowledge is not all objective. Subject knowledge (call it intuitive, instinctive or revelationary) and objective knowledge (call it rational or scientific) go hand in hand. Observers, whether they be fundamental particles or human beings need to be brought into the picture too for it is them and their sensory and perceptional tools that give rise to the illusion of time. The digitization of perception in terms of bits of information may well be the way to go to complete the picture. Quite relevant to this effort are the provocative ideas contained Stephen Wolfram's book, A New Kind of Science. That is because, he has laid out the foundations for a program for a renewed understanding of all aspects of Nature recognizing that everything in Nature is ultimately digital and therefore the best tools to probe into its secrets are digital concepts and computer algorithms. This is especially appealing to the Hindu mind for it recognizes that the very first step in creation from a state of non-duality (Advaita) to a state of duality (Dvaita) is a binary process."[3]


"Reality as a Cellular Automaton: Spacetime Trades Curves for Computation

At the dawn of the computer era, the scientific mainstream sprouted a timely alternative viewpoint in the form of the Cellular Automaton Model of the Universe, which we hereby abbreviate as the CAMU. First suggested by mathematician John von Neumann and later resurrected by salesman and computer scientist Ed Fredkin, the CAMU represents a conceptual regression of spacetime in which space and time are re-separated and described in the context of a cellular automaton. Concisely, space is represented by (e.g.) a rectilinear array of computational cells, and time by a perfectly distributed state transformation rule uniformly governing cellular behavior. Because automata and computational procedures are inherently quantized, this leads to a natural quantization of space and time. Yet another apparent benefit of the CAMU is that if it can be made equivalent to a universal computer, then by definition it can realistically simulate anything that a consistent and continually evolving physical theory might call for, at least on the scale of its own universality.

But the CAMU, which many complexity theorists and their sympathizers in the physics community have taken quite seriously, places problematic constraints on universality. E.g., it is not universal on all computational scales, does not allow for subjective cognition except as an emergent property of its (assumedly objective) dynamic, and turns out to be an unmitigated failure when it comes to accounting for relativistic phenomena. Moreover, it cannot account for the origin of its own cellular array and is therefore severely handicapped from the standpoint of cosmology, which seeks to explain not only the composition but the origin of the universe. Although the CAMU array can internally accommodate the simulations of many physical observables, thus allowing the CAMU’s proponents to intriguingly describe the universe as a “self-simulation”, its inability to simulate the array itself precludes the adequate representation of higher-order physical predicates with a self-referential dimension.

Reality as Reality Theory: Spacetime Turns Introspective

Now let us backtrack to the first part of this history, the part in which René Descartes physically objectivized Cartesian spaces in keeping with his thesis of mind-body duality. Notice that all of the above models sustain the mind-body distinction to the extent that cognition is regarded as an incidental side effect or irrelevant epiphenomenon of objective laws; cognition is secondary even where space and time are considered non-independent. Yet not only is any theory meaningless in the absence of cognition, but the all-important theories of relativity and quantum mechanics, without benefit of explicit logical justification, both invoke higher-level constraints which determine the form or content of dynamical entities according to properties not of their own, but of entities that measure or interact with them. Because these higher-level constraints are cognitive in a generalized sense, GR and QM require a joint theoretical framework in which generalized cognition is a distributed feature of reality.

Let’s try to see this another way. In the standard objectivist view, the universe gives rise to a theorist who gives rise to a theory of the universe. Thus, while the universe creates the theory by way of a theorist, it is not beholden to the possibly mistaken theory that results. But while this is true as far as it goes, it cannot account for how the universe itself is created. To fill this gap, the CTMU Metaphysical Autology Principle or MAP states that because reality is an all-inclusive relation bound by a universal quantifier whose scope is unlimited up to relevance, there is nothing external to reality with sufficient relevance to have formed it; hence, the real universe must be self-configuring. And the Mind-Equals-Reality (M=R) Principle says that because the universe alone can provide the plan or syntax of its own self-creation, it is an "infocognitive" entity loosely analogous to a theorist in the process of introspective analysis. Unfortunately, since objectivist theories contain no room for these basic aspects of reality, they lack the expressive power to fully satisfy relativistic, cosmological or quantum-mechanical criteria. The ubiquity of this shortcoming reflects the absence of a necessary and fundamental logical feature of physical analysis, a higher order of theorization in which theory cognitively distributes over theory, for which no conventional theory satisfactorily accounts.

In view of the vicious paradoxes to which this failing has led, it is only natural to ask whether there exists a generalization of spacetime that contains the missing self-referential dimension of physics. The answer, of course, is that one must exist, and any generalization that is comprehensive in an explanatory sense must explain why. In Noesis/ECE 139, the SCSPL paradigm of the CTMU was described to just this level of detail. Space and time were respectively identified as generalizations of information and cognition, and spacetime was described as a homogeneous self-referential medium called infocognition that evolves in a process called conspansion. Conspansive spacetime is defined to incorporate the fundamental concepts of GR and QM in a simple and direct way that effectively preempts the paradoxes left unresolved by either theory alone. Conspansive spacetime not only incorporates non-independent space and time axes, but logically absorbs the cognitive processes of the theorist regarding it. Since this includes any kind of theorist cognitively addressing any aspect of reality, scientific or otherwise, the CTMU offers an additional benefit of great promise to scientists and nonscientists alike: it naturally conduces to a unification of scientific and nonscientific (e.g. humanistic, artistic and religious) thought."

http://www.megafoundation.org/CTMU/Articles/Supernova.html


"All maps must be of a certain scale or combination of scales, just a
s every grid must have a certain resolution or granularity of cells. And since reality itself (as Gibson 1979 emphasizes) contains entities accessible at many different scales, it follows that no single grid can be complete.

Rather, as scientific practice shows, we need grids of many different resolutions if we are to do justice to reality in its many aspects. This implies, as the enemies of realism are fond of pointing out, that there is no ‘God’s eye perspective’ or ‘view from nowhere’. This does not, however, mean that we are justified in drawing the conclusion that every single one of the myriad perspectives which we have at our disposal embodies a false view of reality. The inference from partiality to falsehood might indeed be valid, but only in a world without windows – a world in which no single one of our grids enjoys the condition of transparency.

The fact that there are maps which deviate, for whatever reason, from the strictly veridical representation of reality does not take away from the fact that – leaving aside any small errors which may have been made in the application of the relevant projection system – almost all maps are true of the corresponding portion of reality.

This applies to Mercator’s map, and it even applies to Saul Steinberg’s View of the World from Ninth Avenue. Maps must of course embody some projection system in representing three dimensions on a planar surface. Yet those who see in this an argument to the effect that all maps must necessarily involve some form of systematic distortion are simply revealing their own misunderstanding of the nature of projection."
http://ontology.buffalo.edu/smith/articles/truegrid.pdf

"The core idea is that supervaluationistic semantics can be reconstructed, not on the basis of mappings between sentences and abstract context-free models, but rather on the basis of mappings between partitions determined by real-world contexts at different levels of granularity.

Further open problem-domains for the extension of GOL relate to the ontology of measurement and of quantity, and to the question of how to develop formal methods which will enable mappings between quantitative and qualitative data and information. This problem-domain, too, is connected with the factors of vagueness and granularity."
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.23.4454&rep=rep1&type=pdf

Whitehead and the measurement problem of cosmology

"Einstein's General Theory of Relativity links the metrical structure of the cosmic order (or 'cosmology') to the contingent distributions of matter and energy throughout the universe, one of the chief areas of investigation in astrophysics. However, presently we have neither devised nor discovered a system of uniform relations whereby we can make our cosmological measurements intelligible. This is 'the measurement problem of cosmology'. Using both historical ideas (such as A N Whitehead's work in the 1920s) and contemporary evidence and theories, I argue that the measurement problem has neither been fully understood nor rightly interpreted. With a better grasp of this problem, such as I am attempting to provide, the prospects for a solution look brighter."
http://books.google.com/books?id=hFg05lytuAkC&output=html_text&source=gbs_navlinks_s

Quantum Logical Causality, Category Theory, and the Metaphysics of
Alfred North Whitehead

Connecting Zafiris’ Category Theoretic Models of Quantum Spacetime
and the Logical-Causal Formalism of Quantum Relational Realism

"This is one of a series of focused workshops exploring the phenomenon of logical causality in quantum mechanics and the relevance of this phenomenon to the philosophy of nature more broadly. In this meeting, we will investigate the ways in which the work of Elias Zafiris on category theoretic models of spacetime quantum event structures might find a robust philosophical and physical foundation in the Relational Realist approach to quantum logical causality—a modern rehabilitation of the process event-ontology of Whitehead. Among the topics to be explored will be the relationship between a) Zafiris’s category theoretic / topos model and b) the decoherent histories interpretation of quantum mechanics in the context of the Whiteheadian mereotopological model of spatiotemporal extension.

The macro-scale (spatiotemporal, cosmological) implications of these explorations of decoherence at the micro-scale will be examined by way of a Whiteheadian/Relational Realist interpretation of the decoherent histories QM formalism (cf. Epperson, 2004). The decoherent histories formalism allows for a spacetime formulation of quantum theory that is highly compatible with the mereotopological model of spatiotemporal extension proposed by Alfred North Whitehead. This is crucial because the mathematical rigidity of the Hilbert space (as a topological vector space) does not allow a relativization analogous with the one of classical relativity theory on smooth manifolds. By contrast, a mereotopological/category-theoretic reformulation of quantum logic in terms of Boolean localization systems (Zafiris’s conception of ‘Boolean sheaves’) achieves precisely such an objective by generalizing the smooth manifold construction in generic algebraic/categorical terms.

This makes it possible to formulate a framework of local/global or part/whole relations without the intervention of a spacetime classically conceived. On the contrary, the usual spacetime manifold and its metrical relations (imposing extra conditions like the light-cone causality relations) appear as emergent at a higher level than the more fundamental level of logical and algebraic part-whole relations. Epperson’s distinction between a) mereological/topological/logical relations, and b) physical/metrical/causal relations, is reflected precisely in Zafiris’ distinction between these two different levels (the algebraic part-whole relations and the metrical spacetime relations, respectively), where the latter is just an emergent/metrical specialization of the former. The key issue to be explored is the problem of localization and the problem of passing from the local to the global in an extensive continuum.

III. Speakers

Elias Zafiris, Senior Research Fellow in Theoretical and Mathematical Physics, Institute of Mathematics, University of Athens, Greece

Elias Zafiris holds an M.Sc. (Distinction) in `Quantum Fields and Fundamental Forces' from Imperial College, University of London, and a Ph.D. in `Theoretical Physics' from Imperial College as well. He has published research papers on the following areas: Generalized spacetime quantum theory and the decoherent histories approach to quantum theory, symmetries and exact solutions in general relativity, covariant kinematics of relativistic strings and branes, foundations of quantum physics, quantum event and quantum observable structures, category-theoretic methods in quantum physics and complex systems theories, topological localization and modern differential geometry in quantum field theory and quantum gravity. His current research focus is on the development of a functorial sheaf-theoretic approach to quantum mechanics, quantum logic and quantum gravity using concepts and techniques of mathematical category theory and algebraic differential geometry, as well as on the study of its conceptual and interpretational implications.

Michael Epperson, Center for Philosophy and the Natural Sciences, California State University Sacramento

Michael Epperson did his doctoral work in philosophy of science and philosophy of religion at the University of Chicago, and earned his Ph.D. there in 2003. His dissertation, Quantum Mechanics and the Philosophy of Alfred North Whitehead, was written under the direction of philosopher David Tracy and physicist Peter Hodgson, Head of the Nuclear Physics Theoretical Group at the University of Oxford. It was published the following year by Fordham University Press. His current research explores the philosophical implications of recent innovations in quantum mechanics, cosmology, and complexity theory. This exploration is ultimately a speculative metaphysical enterprise intended to contribute to the framework of a suitable bridge by which scientific, philosophical, and even theological concepts might not only be cross-joined, but mutually supported. His forthcoming book, co-edited with David Ray Griffin and Timothy E. Eastman, is entitled, Physics and Speculative Philosophy: The Rehabilitation of Metaphysics in 21st Century Science. Epperson is the founder and director of the Center for Philosophy and the Natural Sciences at California State University, Sacramento, and Principal Investigator of his current research project, “Logical Causality in Quantum Mechanics: Relational Realism and the Evolution of Ontology to Praxiology in the Philosophy of Nature.”

Karim Bschir, Chair of Philosophy, Swiss Federal Institute of Technology, Zurich

Karim Bschir studied biochemistry and philosophy at the University of Zurich. In 2003 he received an M.Sci. in biochemistry with a thesis on protein chemistry (thesis title: “In vitro Arginine Methylation of Recombinant Ewing Sarcoma (EWS) Protein”. See: Proteins 61 (1): 164-175, 2005.) In 2003 and 2004 he continued his studies in philosophy. During that time, he was also working as a high school teacher for philosophy and as a subject specialist in an exhibition of the Swiss National Museum in Zurich about recent developments in the Life Sciences. In November 2004, he commenced a Ph.D. project in philosophy at the University of Zurich. Since January 2007, he has been continuing his Ph.D. studies at ETH Zurich. His philosophical work focuses on scientific realism and the question whether and how empirical sciences relate to reality. Karim Bschir has strong interests in general philosophy and history of science, epistemology and metaphysics, as well as in the philosophy of biology.

Kelly John Rose, Institute for Biocomplexity and Informatics, University of Calgary

Kelly John Rose received his M.Sci. in Applied Mathematics from the Department of Mathematics and Statistics at the University of Calgary in 2009. He is affiliated with the Institute for Biocomplexity and Informatics as well as the Haskayne School of Business in Calgary. He currently resides in Toronto, where he is a senior partner in a software development consulting firm and works regularly with the complex systems group at the Perimeter Institute. His previous experience includes working as a research assistant and software developer at the Institute for Quantum Computing at the University of Waterloo and performing cryptographic research for the Canadian government. At present, Rose's interests focus on the relations between information-theoretic measures and input-output matrices in economic and ecological systems, and on developing new mathematical tools for understanding such complex systems."
Elias Zafiris, Complex Systems From the Perspective of Category Theory: II. Covering Systems and Sheaves.

"Using the concept of adjunctive correspondence, for the comprehension of the structure of a complex system, developed in Part I, we introduce the notion of covering systems consisting of partially or locally defined adequately understood objects. This notion incorporates the necessary and sufficient conditions for a sheaf theoretical representation of the informational content included in the structure of a complex system in terms of localization systems. Furthermore, it accommodates a formulation of an invariance property of information communication concerning the analysis of a complex system."

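To make the covering-system idea a little more concrete, here is a minimal sketch (my own Python illustration, not anything from Zafiris' paper; the cover names and variables are invented): locally defined observations on overlapping parts of a system are checked for agreement on their overlaps and, when compatible, glued into a single global description, which is the basic pattern behind a sheaf-theoretic representation of locally defined informational content.

# Minimal sketch (illustration only): local observations on overlapping
# parts of a system, checked for agreement on overlaps and glued globally.

def compatible(local_sections):
    """Check that any two local sections agree on their common domain."""
    sections = list(local_sections.values())
    for i in range(len(sections)):
        for j in range(i + 1, len(sections)):
            s, t = sections[i], sections[j]
            overlap = s.keys() & t.keys()
            if any(s[k] != t[k] for k in overlap):
                return False
    return True

def glue(local_sections):
    """Glue pairwise-compatible local sections into one global section."""
    if not compatible(local_sections):
        raise ValueError("local data disagree on an overlap; no gluing exists")
    global_section = {}
    for s in local_sections.values():
        global_section.update(s)
    return global_section

# Two 'local' views of a system state, overlapping on the variable 'b':
cover = {
    "U1": {"a": 1, "b": 2},
    "U2": {"b": 2, "c": 3},
}
print(glue(cover))   # {'a': 1, 'b': 2, 'c': 3}
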
"This paper serves as the meeting point of three “parallel worlds”: Chu spaces, Domain Theory, and Formal Concept Analysis. It brings the three independent areas together and establishes fundamental connections among them, leaving open opportunities for the exploration of cross-disciplinary influences.

We begin with an overview of each of the three areas, followed by an account of the background of each area from a unified perspective. We then move to basic connections among them and point to topics of immediate interest and opportunities for further development, including applications in data-mining and knowledge discovery.

Due to its interdisciplinary nature, the paper is written in a way that does not assume specific background knowledge for each area."
http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.94.6897&rep=rep1&type=pdf
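
Since a two-valued Chu space is essentially a formal context (objects, attributes, and an incidence relation), the shared machinery is easy to exhibit. The following Python sketch, with a made-up toy context, implements the two derivation operators and the resulting concept-closure operator; it is an illustration of standard FCA, not code from the survey.

# Minimal FCA sketch (illustration only): a formal context (G, M, I) given as a
# dict from objects to their attribute sets, with the two derivation operators.

context = {
    "frog": {"aquatic", "limbed"},
    "fish": {"aquatic"},
    "dog":  {"limbed", "furred"},
}
attributes = set().union(*context.values())

def intent(objs):
    """Attributes shared by all objects in objs (A -> A')."""
    return set.intersection(*(context[g] for g in objs)) if objs else set(attributes)

def extent(attrs):
    """Objects having all attributes in attrs (B -> B')."""
    return {g for g, row in context.items() if attrs <= row}

def closure(objs):
    """Concept closure A -> A'': the smallest concept extent containing objs."""
    return extent(intent(objs))

print(intent({"frog", "fish"}))   # {'aquatic'}
print(closure({"fish"}))          # {'frog', 'fish'}, i.e. all aquatic objects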

‎"Apart from minimizing the number of required measurements, the computational effort of signal reconstruction is important for numerical efficiency. Hierarchical structures mimicking our cognitive system seem to offer a good trade-off between performance and computational effort. Wavelets, which are used for the encoding of images for High-Definition Television, incorporate such an efficient hierarchical structure, but their redundancy-free design does not provide much flexibility to realize desirable symmetry properties, leading to commonly known block artifacts in images. Within the last few years, efficient image encoding techniques have been emerging, which avoid directional preferences with the help of frame representations.

Certain situations require a more refined notion of redundancy, for example, when sensors have been somewhat randomly scattered across a terrain or, in medical applications, a patient body. Assuming a fixed monitoring range for each sensor, they may overlap to varying degrees in different locations, which means they report with a varying amount of repetitive information. The flexible architecture of fusion frames offers a general setting to explore optimal designs in this context. In fact, the underlying concept may be a more realistic model for our cognitive process, and allow us to realize its versatility in many applications of signal analysis and communication."
http://www.birs.ca/events/2009/5-day-workshops/09w5082
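
As a toy illustration of how redundancy in a frame still permits stable reconstruction (a NumPy sketch of my own, not one of the workshop's constructions): for the three equally spaced unit vectors of the 'Mercedes-Benz' frame in the plane, the frame operator is 3/2 times the identity, so a signal is recovered from its three inner products by a single rescaling.

import numpy as np

# 'Mercedes-Benz' tight frame: three unit vectors in R^2 at 120-degree spacing.
angles = np.array([np.pi / 2, np.pi / 2 + 2 * np.pi / 3, np.pi / 2 + 4 * np.pi / 3])
F = np.stack([np.cos(angles), np.sin(angles)], axis=1)   # shape (3, 2), rows are frame vectors

x = np.array([0.7, -1.3])          # signal to encode
coeffs = F @ x                     # three redundant measurements <x, f_i>

# For this tight frame the frame operator S = F^T F equals (3/2) * I,
# so reconstruction is just a rescaled synthesis:
x_rec = (2.0 / 3.0) * (F.T @ coeffs)
print(np.allclose(x, x_rec))       # True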

Intensional double glueing, biextensional collapse, and the Chu construction:
http://boole.stanford.edu/~dominic/papers/ig/ig.pdf

"The nerve of a category is often used to construct topological versions of moduli spaces. If X is an object of C, its moduli space should somehow encode all objects isomorphic to X and keep track of the various isomorphisms between all of these objects in that category. This can become rather complicated, especially if the objects have many non-identity automorphisms. The nerve provides a combinatorial way of organizing this data. Since simplicial sets have a good homotopy theory, one can ask questions about the meaning of the various homotopy groups πn(N(C)). One hopes that the answers to such questions provide interesting information about the original category C, or about related categories.

The notion of nerve is a direct generalization of the classical notion of classifying space of a discrete group; see below for details."

http://en.wikipedia.org/wiki/Nerve_(category_theory)
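
For a finite poset regarded as a category, the nerve is easy to compute by hand: the nondegenerate n-simplices are exactly the chains with n+1 distinct elements. The small Python sketch below (an illustration I wrote for a toy divisibility poset, not library code) enumerates this combinatorial data.

from itertools import combinations

# Nerve of a finite poset (viewed as a category): the nondegenerate n-simplices
# are the chains x_0 <= x_1 <= ... <= x_n with n+1 distinct elements.
# Toy example: the divisibility order on {1, 2, 3, 6}.

elements = [1, 2, 3, 6]

def leq(a, b):
    return b % a == 0   # a <= b  iff  a divides b

def simplices(dim):
    """All nondegenerate dim-simplices, i.e. chains with dim+1 elements."""
    return [c for c in combinations(elements, dim + 1)
            if all(leq(c[i], c[i + 1]) for i in range(dim))]

for n in range(3):
    print(n, simplices(n))
# 0: vertices (objects), 1: comparable pairs (morphisms), 2: 3-element chains, ...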


A Cellular Nerve for Higher Order Categories:

"Generally, for X an object we think of as a space, a cover of X is some other object Y together with a morphism π:YX, usually an epimorphism demanded to be well behaved in certain way.

The idea is that Y provides a “locally resolved” picture of X in that X and Y are “locally the same”, but that Y is “more flexible” than X.

The archetypical example is given by ordinary covers of a topological space X by open subsets {U_i}: here Y is their disjoint union Y := ⊔_i U_i.

More generally, you might need a cover to be a family of maps (π_i : Y_i → X)_i; if the category has coproducts that get along well with the covers, then you can replace these families with single maps as above."
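
A concrete, finite version of that last remark (my own sketch, with made-up sets): in Set, a family of inclusions U_i → X can be packaged as a single map out of the disjoint union of the U_i, and the family covers X exactly when that single map is surjective.

# Sketch: a cover of a finite set packaged as one map from the disjoint union
# of the pieces. The family covers X iff this map is onto.

X = {1, 2, 3, 4, 5}
family = {"U1": {1, 2, 3}, "U2": {3, 4}, "U3": {4, 5}}

# Disjoint union: tag each element with the name of the piece it comes from.
disjoint_union = {(name, x) for name, U in family.items() for x in U}

def pi(tagged):
    """The single map pi : disjoint union of the U_i -> X, collecting the inclusions."""
    _, x = tagged
    return x

image = {pi(t) for t in disjoint_union}
print(image == X)    # True: the family is a cover of X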


A geometry of information, I: Nerves, posets and differential forms

"The main theme of this workshop (Dagstuhl seminar 04351) is `Spatial Representation: Continuous vs. Discrete'. Spatial representation has two contrasting but interacting aspects (i) representation of spaces' and (ii) representation by spaces. In this paper, we will examine two aspects that are common to both interpretations of the theme, namely nerve constructions and refinement. Representations change, data changes, spaces change. We will examine the possibility of a `differential geometry' of spatial representations of both types, and in the sequel give an algebra of differential forms that has the potential to handle the dynamical aspect of such a geometry. We will discuss briefly a conjectured class of spaces, generalising the Cantor set which would seem ideal as a test-bed for the set of tools we are developing."
http://arxiv.org/abs/cs.AI/0512010

A geometry of information, II: Sorkin models, and biextensional collapses.

"In this second part of our contribution to the workshop, we look in more detail at the Sorkin model, its relationship to constructions in Chu space theory, and then compare it with the Nerve constructions given in the first part."
http://drops.dagstuhl.de/volltexte/2005/127/pdf/04351.PorterTimothy2.Paper.127.pdf

A Spatial View of Information

"Spatial representation has two contrasting but interacting aspects (i) representation of spaces’ and (ii) representation by spaces. In this paper we will examine two aspects that are common to both interpretations of the theme of spatial representation, namely nerve-type constructions and refinement. We consider the induced structures, which some of the attributes of the informational context are sampled.

Key words: Chu spaces, sampling, formal context, nerves, poset models,
biextensional collapse."
http://www.maths.bangor.ac.uk/research/ftp/cathom/06_09.pdf
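
Since biextensional collapse appears in the key words, here is a small Python illustration of the operation on a two-valued Chu space, taken as a boolean matrix: points with identical rows are identified, and states with identical columns are identified. The matrix below is invented for the example; the code is not from the paper.

# Biextensional collapse of a two-valued Chu space (A, r, X), given as a
# boolean matrix r indexed by points (rows) and states (columns).
# Illustration only: duplicate rows and duplicate columns are merged.

points = ["a", "b", "c"]
states = ["x", "y", "z"]
r = {("a", "x"): 1, ("a", "y"): 0, ("a", "z"): 1,
     ("b", "x"): 1, ("b", "y"): 0, ("b", "z"): 1,   # same row as 'a'
     ("c", "x"): 0, ("c", "y"): 1, ("c", "z"): 0}

def row(p):
    return tuple(r[(p, s)] for s in states)

def col(s):
    return tuple(r[(p, s)] for p in points)

collapsed_points = {row(p): p for p in points}   # one point per distinct row
collapsed_states = {col(s): s for s in states}   # one state per distinct column

print(sorted(collapsed_points.values()))   # ['b', 'c']  ('a' and 'b' identified)
print(sorted(collapsed_states.values()))   # ['y', 'z']  ('x' and 'z' identified)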

Information quanta and approximation spaces. I. Non-classical approximation operators (IEEE International Conference on Granular Computing, 2005; Research Group on Knowledge & Communication Models, Roma, Italy):

"In the first part of this paper property systems are investigated in order to define intensional and extensional operators fulfilling adjointness properties. On this basis a set of operators are defined which are able to determine non-classical upper and lower approximations of subsets of objects or properties. But in order to bypass "discontinuity" of such operators we introduce, in the second part, the higher order notion of an "information quantum". This way we can account for attribute systems as well as property systems by transforming them into new structures called "information quantum relational systems" in which adjointness makes operators fulfill continuity and makes it possible to define a class of generalised approximation spaces."
http://ieeexplore.ieee.org/xpl/freeabs_all.jsp?arnumber=1547363
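
For orientation, the classical rough-set approximation operators that the paper's non-classical operators refine can be stated in a few lines. The sketch below (my own toy example, with invented objects and attributes) computes lower and upper approximations from the indiscernibility classes of an attribute system.

# Classical rough-set approximations as a point of reference. Objects are
# grouped into indiscernibility classes by their attribute values.

objects = {
    "o1": ("red", "round"),
    "o2": ("red", "round"),
    "o3": ("red", "square"),
    "o4": ("blue", "square"),
}

def eq_class(o):
    """All objects indiscernible from o (same attribute tuple)."""
    return {p for p, v in objects.items() if v == objects[o]}

def lower(subset):
    """Objects whose whole indiscernibility class lies inside subset."""
    return {o for o in objects if eq_class(o) <= subset}

def upper(subset):
    """Objects whose indiscernibility class meets subset."""
    return {o for o in objects if eq_class(o) & subset}

target = {"o1", "o3"}
print(lower(target))   # {'o3'}: o1 is indiscernible from o2, which lies outside
print(upper(target))   # {'o1', 'o2', 'o3'}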

Default reasoning over domains and concept hierarchies:

"W.C. Rounds and G.-Q. Zhang have proposed to study a form of disjunctive logic programming generalized to algebraic domains [1]. This system allows reasoning with information which is hierarchically structured and forms a (suitable) domain. We extend this framework to include reasoning with default negation, giving rise to a new nonmonotonic reasoning framework on hierarchical knowledge which encompasses answer set programming with extended disjunctive logic programs. We also show that the hierarchically structured knowledge on which programming in this paradigm can be done, arises very naturally from formal concept analysis. Together, we obtain a default reasoning paradigm for conceptual knowledge which is in accordance with mainstream developments in nonmonotonic reasoning."
http://knoesis.wright.edu/faculty/pascal/resources/publications/pdf/ki04.pdf

Formal Concept Analysis and Resolution in Algebraic Domains — Preliminary Report:
http://knoesis.org/faculty/pascal/resources/publications/pdf/wv-03-01.pdf

Closures in Binary Partial Algebras:
http://newton.case.edu/papers/pba.pdf

Bifinite Chu Spaces:
http://arxiv.org/PS_cache/arxiv/pdf/0911/0911.3214v2.pdf

A Categorical View on Algebraic Lattices in Formal Concept Analysis:
http://arxiv.org/PS_cache/cs/pdf/0410/0410065v1.pdf

A categorical view at generalized concept lattices:
http://pdf.dml.cz/bitstream/handle/10338.dmlcz/135771/Kybernetika_43-2007-2_12.pdf

Rough concept lattices and domains:
http://125.71.228.222/wlxt/ncourse/inttopol/web/website/website01/pdf/leipaper2.pdf

Formal Concept Analysis and Resolution in Algebraic Domains:

"We relate two formerly independent areas: Formal concept analysis and logic of domains. We will establish a correspondene between contextual attribute logic on formal contexts resp. concept lattices and a clausal logic on coherent algebraic cpos. We show how to identify the notion of formal concept in the domain theoretic setting. In particular, we show that a special instance of the resolution rule from the domain logic coincides with the concept closure operator from formal concept analysis. The results shed light on the use of contexts and domains for knowledge representation and reasoning purposes."
http://arxiv.org/abs/cs.LO/0301008

Formal Topology, Chu Space and Approximable Concept:
http://ftp.informatik.rwth-aachen.de/Publications/CEUR-WS/Vol-162/paper14.pdf

GROTHENDIECK TOPOLOGIES ON CHU SPACES

"We consider the Grothendieck topologies on low semi-lattices, defined by one family, and the corresponding sheaf cohomology. This is a basis to define and study the left and right cohomologies and the left and right dimensions of the Chu spaces. The construction of Chu spaces allows to characterize a large class of quantities, for example, the dimension of a Noether space or the Krull dimension of a ring, the Lebesgue-type dimensions, as well as to compare them with the cohomology dimensions of the corresponding Chu spaces. We prove existence of spectral sequences of the morphisms of the Chu spaces.

Key words and phrases: Grothendieck topology, sheaf cohomology, Chu
space, cohomological dimension, flabby dimension, Lebesgue-type dimension,
spectral sequence."
http://www.math.nsc.ru/mattrudy/engl/archive/files/ET_11_2/ET112A6.PDF

Conceptual structures: knowledge architectures for smart applications:
http://tinyurl.com/ConceptualStructures

Mediating Secure Information Flow Policies:
http://newton.eecs.cwru.edu/papers/Zhang-mediating-llncs.pdf

Using Situation Lattices to Model and Reason about Context:

"Much recent research has focused on using situations rather than individual pieces of context as a means to trigger adaptive system behaviour. While current research on situations emphasises their representation and composition, they do not provide an approach on how to organise and identify their occurrences efficiently. This paper describes how lattice theory can be utilised to organise situations, which reflects the internal structure of situations such as generalisation and dependence. We claim that situation lattices will prove beneficial in identifying situations, and maintaining the consistency and integrity of situations. They will also help in resolving the uncertainty issues inherent in context and situations by working with Bayesian Networks."
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.109.817
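
One simple way to realise the lattice idea in code (a sketch of my own, not the authors' model): represent each situation by the set of context predicates it requires, read subset as generalisation, and take intersections as meets. The situation names and predicates below are invented.

# Situation lattice sketch: a situation is the set of context predicates it
# requires; situation S generalises T when S's predicates are a subset of T's.

situations = {
    "in_meeting":        {"location=office", "calendar=busy"},
    "presenting":        {"location=office", "calendar=busy", "projector=on"},
    "working_from_home": {"location=home", "calendar=busy"},
}

def generalises(s, t):
    """True if situation s is a generalisation of situation t."""
    return situations[s] <= situations[t]

def common_generalisation(s, t):
    """Meet in the lattice: the predicates shared by both situations."""
    return situations[s] & situations[t]

print(generalises("in_meeting", "presenting"))                  # True
print(common_generalisation("presenting", "working_from_home")) # {'calendar=busy'}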

Bounded hyperset theory and web-like data bases:
