Object theory


Object theory is a theory in mathematical logic concerning objects and the statements that can be made about objects.
In some cases "objects" can be concretely thought of as symbols and strings of symbols, here illustrated by a string of symbols "←←↑↓←→←↓" composed from the 4-symbol alphabet { →, ↑, ←, ↓ }. When they are "known only through the relationships of the system, the system is abstract... what the objects are, in any respect other than how they fit into the structure, is left unspecified." A further specification of the objects results in a model or representation of the abstract system, "i.e. a system of objects which satisfy the relationships of the abstract system and have some further status as well".
A system, in its general sense, is a collection of objects O = { o1, o2, ..., on, ... } and the relationship r, or relationships r1, r2, ..., rn, between the objects.
A model of this system would occur when we assign, for example, the familiar natural numbers 0, 1, 2, 3 to the symbols, i.e. in this manner: → = 0, ↑ = 1, ← = 2, ↓ = 3. Here the symbol ' indicates the "successor function" operating on a collection of only 4 objects, thus 0' = 1, 1' = 2, 2' = 3, 3' = 0.
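The interpretation just described can be sketched in a few lines of Python; the names SYMBOL_TO_NUMBER and successor are illustrative choices, not part of the source, and the modular arithmetic simply mirrors the wrap-around 3' = 0.

    # Sketch of the model: assign the four symbols the numbers 0..3 and read
    # the successor function ' as "add 1, wrapping around on the 4 objects".
    SYMBOL_TO_NUMBER = {"→": 0, "↑": 1, "←": 2, "↓": 3}

    def successor(n):
        # 0' = 1, 1' = 2, 2' = 3, 3' = 0
        return (n + 1) % 4

    assert [SYMBOL_TO_NUMBER[s] for s in "←←↑↓←→←↓"] == [2, 2, 1, 3, 2, 0, 2, 3]
    assert successor(3) == 0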

The genetic versus axiomatic method

The following is an example of the genetic or constructive method of making objects in a system, the other method being the axiomatic or postulational method. Kleene states that a genetic method is intended to "generate" all the objects of the system and thereby "determine the abstract structure of the system completely" and uniquely. If axioms are used rather than a genetic method, such axiom sets are said to be categorical.
Unlike the example above, the following creates an unbounded number of objects. The fact that O is a set, that □ is an element of O, and that ■ is an operation must be specified at the outset; this is done in the language of the metatheory: □ is an object of O; if n is an object of O, then ■n is an object of O; and nothing is an object of O except by these two rules.
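A minimal sketch of the genetic rule, assuming nothing beyond what is stated above (the function name generate_objects is invented here): starting from □, the operation ■ is applied again and again, so the objects □, ■□, ■■□, ... are produced without bound.

    from itertools import islice

    def generate_objects():
        # genetic rule: □ is an object; if n is an object, then ■n is an object
        obj = "□"
        while True:                 # the supply of objects is unbounded
            yield obj
            obj = "■" + obj

    print(list(islice(generate_objects(), 4)))   # ['□', '■□', '■■□', '■■■□']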

Abbreviations

The object ■n□ demonstrates the use of "abbreviation", a way to simplify the denoting of objects, and consequently discussions about them, once they have been created "officially". Done correctly, the definition would proceed as follows: ■n□ is defined to abbreviate the officially created object consisting of n symbols ■ followed by □, so that, for example, ■3□ abbreviates ■■■□.
Kurt Gödel 1931 virtually constructed the entire proof of his incompleteness theorems by use of this tactic, proceeding from his axioms using substitution, concatenation, and deduction by modus ponens to produce a collection of 45 "definitions" from the axioms.
A more familiar tactic is perhaps the design of subroutines that are given names, e.g. in Excel the subroutine "=INT(A1)", which returns to the cell where it is typed the integer part of the number it finds in cell A1.
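In the same spirit as a named subroutine, the abbreviation ■n□ can be sketched as a small function (the name marks is invented here); calling it "expands" the abbreviation into the officially created object.

    def marks(n):
        # ■n□ abbreviates the official object of n symbols ■ followed by □
        return "■" * n + "□"

    assert marks(0) == "□"
    assert marks(3) == "■■■□"      # ■3□ abbreviates ■■■□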

Models

A model of the above example is a left-ended Post–Turing machine tape with its fixed "head" located on the left-end square; the system's relation is equivalent to: "To the left end, tack on a new square □, right-shift the tape, then print ■ on the new square". Another model is the natural numbers as created by the "successor" function. Because the objects in the two systems, e.g. { □, ■□, ■■□, ... } and { 0, 1, 2, ... }, can be put into a 1-1 correspondence, the systems are said to be isomorphic. Yet another isomorphic model is the little sequence of instructions for a counter machine, e.g. "Do the following in sequence: Dig a hole. Into the hole, throw a pebble. Go to step 2."
As long as their objects can be placed in one-to-one correspondence, the models can be considered "equivalent", no matter how their objects are generated.
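A rough sketch of the isomorphism claim (the function names are invented here): the correspondence sends each tape object ■...■□ to the number of marks it carries, and tacking a ■ onto one side corresponds exactly to applying the successor on the other.

    def tape_object(n):
        return "■" * n + "□"              # model 1: marked squares on a tape

    def to_number(obj):
        return obj.count("■")             # the 1-1 correspondence: count the marks

    for n in range(5):
        obj = tape_object(n)
        assert to_number(obj) == n                          # objects correspond one-to-one
        assert to_number("■" + obj) == to_number(obj) + 1   # "tack on ■" matches "successor"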

Tacit assumptions, tacit knowledge

An alert reader may have noticed that writing the symbols □, ■□, ■■□, ■■■□, etc. by concatenating a marked square ■ to an existing string is different from writing the completed symbols one after another on a Turing-machine tape. Another entirely possible scenario would be to generate the symbol-strings one after another on different sections of tape, e.g. after three symbols: ■■■□■■□■□□. The proof that these two possibilities are different is easy: they require different "programs". But in a sense both versions create the same objects; in the second case the objects are preserved on the tape. In the same way, if a person were to write 0, then erase it, write 1 in the same place, then erase it, write 2, erase it, ad infinitum, the person is generating the same objects as if they were writing down 0 1 2 3 ..., one symbol after another, to the right on the paper.
Once the step has been taken to write down the symbols 3 2 1 0 one after another on a piece of paper, or writing ∫∫∫※∫∫※∫※※ in a similar manner, then putting them in 1-1 correspondence with the Turing-tape symbols seems obvious. Digging holes one after the other, starting with a hole at "the origin", then a hole to its left with one pebble in it, then a hole to its left with two pebbles in it, ad infinitum, raises practical questions, but in the abstract it too can be seen to be conducive to the same 1-1 correspondence.
However, nothing in particular in the definition of genetic versus axiomatic methods clears this up—these are issues to be discussed in the metatheory. The mathematician or scientist is to be held responsible for sloppy specifications. Breger cautions that axiomatic methods are susceptible to tacit knowledge, in particular, the sort that involves "know-how of a human being".
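The observation about different "programs" producing the same objects can be made concrete with a small sketch (both function names are invented here): one procedure overwrites a single work area, the other appends to a growing tape, yet the objects produced are the same.

    def overwrite_in_place(k):
        cell, produced = "□", []
        for _ in range(k):
            produced.append(cell)     # record the object, then "erase" it ...
            cell = "■" + cell         # ... and write the next one in its place
        return produced

    def append_to_tape(k):
        return ["■" * n + "□" for n in range(k)]   # every object stays on the tape

    assert overwrite_in_place(4) == append_to_tape(4)   # different programs, same objects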

A formal system

In general, in mathematics a formal system or "formal theory" consists of "objects" arranged in a structure: a set of symbols from which strings of symbols can be formed, rules that specify which strings are well-formed, a set of axioms, i.e. distinguished well-formed strings, and rules of inference by which further strings, the theorems, are derived from the axioms.
A metatheory exists outside the formalized object theory—the meaningless symbols and relations and strings of symbols. The metatheory comments on these meaningless objects using "intuitive" notions and "ordinary language". Like the object theory, the metatheory should be disciplined, perhaps even quasi-formal itself, but in general the interpretations of objects and rules are intuitive rather than formal. Kleene requires that the methods of a metatheory be finite, conceivable, and performable; these methods cannot appeal to the completed infinite. "Proofs of existence shall give, at least implicitly, a method for constructing the object which is being proved to exist."
Kleene summarizes this as follows: "In the full picture there will be three separate and distinct "theories"": the informal theory of which the formal system constitutes a formalization, the formal system or object theory, and the metatheory, in which the object theory is described and studied.
He goes on to say that object theory is not a "theory" in the conventional sense, but rather is "a system of symbols and of objects built from symbols".
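A minimal sketch of this layered picture, using the ■/□ example from earlier (all names here are invented): the strings themselves are the meaningless object theory, while the Python functions that form, generate, and make claims about them play the role of the metatheory expressed in ordinary language.

    SYMBOLS = {"■", "□"}                      # object theory: nothing but symbols

    def well_formed(s):
        # formation rule: zero or more ■ followed by exactly one final □
        return s.endswith("□") and s.count("□") == 1 and set(s) <= SYMBOLS

    AXIOM = "□"                               # a distinguished well-formed string

    def infer(s):
        return "■" + s                        # the single rule of inference

    def theorems(k):
        t, out = AXIOM, []
        for _ in range(k):
            out.append(t)
            t = infer(t)
        return out

    # A metatheoretic claim about the strings, checked by a finite computation:
    assert all(well_formed(t) for t in theorems(5))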

Expansion of the notion of formal system

Well-formed objects

If a collection of objects is to be considered "well-formed", an algorithm must exist to determine, by halting with a "yes" or "no" answer, whether or not any given object is well-formed. This algorithm, in the extreme, might require a Turing machine or Turing-equivalent machine that "parses" the symbol string presented as "data" on its tape; before a universal Turing machine can execute an instruction on its tape, it must parse the symbols to determine the exact nature of the instruction and/or datum encoded there. In simpler cases a finite state machine or a pushdown automaton can do the job. Enderton describes the use of "trees" to determine whether or not a logic formula is well-formed. Alonzo Church 1934 describes the construction of "formulas" written in his λ-calculus by use of a recursive description of how to start a formula and then build on the starting symbol using concatenation and substitution.
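For the simpler case mentioned above, here is a sketch (invented for illustration) of a two-state recognizer that halts with a yes/no answer on whether a string is a well-formed object ■...■□ of the earlier example.

    def is_well_formed(s):
        state = "READING_MARKS"
        for ch in s:
            if state == "READING_MARKS" and ch == "■":
                pass                          # stay: still reading leading marks
            elif state == "READING_MARKS" and ch == "□":
                state = "DONE"                # exactly one terminating square □
            else:
                return False                  # a foreign symbol, or anything after □
        return state == "DONE"

    assert is_well_formed("■■■□")
    assert not is_well_formed("■□■")          # ill-formed: a mark after the final □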
Example: Church specified his λ-calculus as follows; this example shows how an object theory begins with a specification of an object system of symbols and relations. Roughly: a variable standing alone is a well-formed formula; if M and N are well-formed formulas, so is the application {M}(N); and if M is a well-formed formula in which the variable x occurs free, so is the abstraction λx[M].
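A sketch of the recursive idea, with formulas represented as nested tuples rather than parsed strings (the representation and the tags "apply" and "lambda" are invented here, and the requirement that x occur free in M is not enforced in this simplified check): the checker simply mirrors the formation rules, descending the tree in the spirit of the tree method mentioned above.

    def is_formula(t):
        if isinstance(t, str):                          # a variable, e.g. "x"
            return t.isidentifier()
        if isinstance(t, tuple) and len(t) == 3:
            tag, a, b = t
            if tag == "apply":                          # the application {M}(N)
                return is_formula(a) and is_formula(b)
            if tag == "lambda":                         # the abstraction λx[M]
                return isinstance(a, str) and a.isidentifier() and is_formula(b)
        return False

    assert is_formula(("lambda", "x", ("apply", "x", "y")))   # λx[{x}(y)]
    assert not is_formula(("lambda", "x"))                    # missing its body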

Undefined (primitive) objects

Certain objects may be "undefined" or "primitive"; they receive their definition only through the introduction of the axioms.
In the next example, the undefined symbols will be ※, ↀ, and ∫. The axioms will describe their behaviors.

Axioms

Kleene observes that the axioms are made up of two sets of symbols: the undefined or primitive objects and those that are previously known. In the following example it is previously known that O constitutes a set of objects, that ※ is an object in the domain, that ↀ and ∫ are symbols for relations between the objects, that => indicates the "IF THEN" logical operator, that ε is the symbol indicating "is an element of the set O", and that "n" will be used to indicate an arbitrary element of the set of objects O.
After a definition of "string S" (an object that is either the symbol ※ or concatenated symbols ※, ↀ, ∫) and a definition of "well-formed" strings (※, and ↀS or ∫S where S is any string) come the axioms:
1: ↀ※ => ※
2: ↀ∫n => n
So what might be the interpretation of these symbols, definitions, and axioms?
If we define ※ as "0", ∫ as "successor", and ↀ as "predecessor", then ↀ※ => ※ indicates "proper subtraction": the predecessor of 0 is 0. The string "ↀ∫n => n" indicates that if first the successor ∫ is applied to an arbitrary object n and then the predecessor ↀ is applied to ∫n, the original n results.
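Under that interpretation the two axioms can be spot-checked mechanically; the sketch below (names invented) reads ※ as 0, ∫ as successor, and ↀ as predecessor with proper subtraction.

    def succ(n):
        return n + 1                    # ∫: successor

    def pred(n):
        return max(n - 1, 0)            # ↀ: predecessor under proper subtraction

    assert pred(0) == 0                                    # axiom 1:  ↀ※ => ※
    assert all(pred(succ(n)) == n for n in range(100))     # axiom 2:  ↀ∫n => n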
Is this set of axioms "adequate"? The proper answer would be a question: "Adequate to describe what, in particular?" "The axioms determine to which systems, defined from outside the theory, the theory applies." In other words, the axioms may be sufficient for one system but not for another.
In fact, it is easy to see that this axiom set is not a very good one: it is inconsistent.
Observe also that the axiom set does not specify that ∫n ≠ n, or that, excepting the case n = ※, ↀn ≠ n. If we were to include these two axioms we would need to describe the intuitive notions "equals" (symbolized by =) and "not equals" (symbolized by ≠).
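To illustrate the adequacy point, here is a sketch (not from the source) of a degenerate one-object system that also satisfies both axioms, precisely because nothing in them requires ∫n ≠ n:

    STAR = "※"                              # the only object in this model

    def succ_trivial(n):
        return STAR                         # ∫n = n for the single object

    def pred_trivial(n):
        return STAR                         # ↀn = n for the single object

    assert pred_trivial(STAR) == STAR                     # axiom 1:  ↀ※ => ※
    assert pred_trivial(succ_trivial(STAR)) == STAR       # axiom 2:  ↀ∫n => n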