
Negations in axioms #4

Closed · rnd0101 opened this issue Jan 1, 2018 · 19 comments
Labels: discussion, question

@rnd0101 commented Jan 1, 2018

Negative formulations of some axioms place a cognitive burden on the reader, and they make implicit assumptions about the implied "universal set". I guess the intention is to safeguard the machinery by making some obs non-equal, but maybe explicit is better than implicit, even at the cost of adding more is-functions?

@orcmid orcmid added the question label Jan 1, 2018
@orcmid (Owner) commented Jan 1, 2018

Issue Closed

Although negations may be more difficult in some provers and checking techniques, they will continue to be used as a way of establishing broad inequalities, with tolerance for the addition of further constructions in the future.

The resolution statement summarizes the approach taken and the basis for it.


I need to look further into this. Consider this for now.

The supply of individuals is not necessarily exhausted by the obaptheory primitives and the lindies. I know of a couple of extensions that work by adding individuals. I suspect that introduction of types will involve more.

The purpose of

obap.is-lindy(x) ∧ ¬ obap.is-lindy(y) ⇒ x ≠ y

is to establish that there is no non-lindy that is equal to any lindy. This combines with

obap.is-lindy(x) ⇔ x = Ʃs,

being a meta-linguistic device with s not in the domain of discourse. Perhaps that is a sufficient condition?
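
For comparison, in an implementation this effect can come for free. A minimal SML sketch, assuming a datatype representation in the style of the mockup (the constructor names here are invented, not the mockup's actual ones):

    (* Sketch only: obs as an SML datatype.  Values built with distinct
       constructors are automatically unequal, so no lindy can equal a
       primitive individual, a pair, or an enclosure. *)
    datatype ob = Prim  of string     (* obaptheory individuals *)
                | Lindy of string     (* literal individuals, the Ʃs *)
                | Pair  of ob * ob    (* x :: y *)
                | E     of ob         (* the enclosure ob.e(x) *)

    fun isLindy (Lindy _) = true
      | isLindy _         = false

The mathematical theory has no such meta-level guarantee available, so it asserts the inequality explicitly; that is what the negation is doing.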

What are your ideas on a way of affirming this without negation?

Also, is this a challenge for Coq for some reason?

@rnd0101
Copy link
Author

rnd0101 commented Jan 2, 2018

First of all, it's not a challenge for Coq.

Many uses of negation are probably justified when they serve to establish inequality.

There are a couple of cases in obaptheory:

        obap.is-pure-lindy(p) ∧ obap.is-lindy-everywhere(x)
                ⇒ ob.is-pair(p) ⇒ obap.ap(p,x) = p :: x
        ¬( obap.is-pure-lindy(p) ∧ obap.is-lindy-everywhere(x) )
                ⇒ ob.is-pair(p) ⇒ obap.ap(p,x) = obap.ev(p,x,p)

and

        obap.is-lindy(p) ∧ obap.is-lindy-everywhere(x)
                ⇒ obap.apint(p,x) = p :: x
        obap.is-lindy(p) ∧ ¬ obap.is-lindy-everywhere(x)
                ⇒ obap.apint(p,x) = p :: ob.e(x)

plus

        ¬( obap.is-evbinop(e1) ∨ obap.is-evunop(e1) )
                ⇒ obap.ev(p,x,e1::e2) = obap.ap(obap.ev(p,x,e1), obap.ev(p,x,e2))

That last one is an especially good example, because instead of stating the range of the rule directly, it states it as a negation. If the domain of e1 is ever extended, by say obap.is-ternop, then this part will need to be touched.
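
To make the maintenance point concrete, here is a toy SML sketch (all names invented for illustration; this is not the project's code). The negated condition plays the role of a wildcard arm in a case analysis, and the meaning of the wildcard silently shifts whenever a new operator class is added:

    (* Toy illustration: the ¬( is-evbinop ∨ is-evunop ) condition acts
       like the wildcard arm below.  If a Ternop constructor is added,
       the wildcard silently changes what it covers, whereas a positive
       "is-application" predicate would have to be edited explicitly. *)
    datatype head = EvBinop of string
                  | EvUnop  of string
                  | Plain   of string

    fun dispatch (EvBinop s) = "binop " ^ s
      | dispatch (EvUnop s)  = "unop "  ^ s
      | dispatch _           = "apply"    (* the negation case *)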

My grievance may be mostly aesthetic. In a theory, as in code, I value nearness to the modelled domain, proper naming, and so on.

So this is not a hard issue. It is just a place where, IMHO, the theory can be more beautiful.

@rnd0101 (Author) commented Jan 2, 2018

I also do not want to be just critical. I am thinking about what can be done about it.

@orcmid (Owner) commented Jan 2, 2018

@rnd0101: I also do not want to be just critical. I am thinking about what can be done about it.

I appreciate that very much. I view all of your comments as contributions and I am grateful for them. They help in my own thinking. I share your desire that the theory be more beautiful.

It is late and I will return to these questions tomorrow.

@rnd0101 (Author) commented Jan 2, 2018

Sure. My vacation ended, so I have less time...

My quick thought on a remedy is to align the is-functions with types / subtypes (I actually see the existing ones as ad hoc typing). Another way to think of it is to add a notion of interfaces. Basically, rules become "if provides-interface x obap.LINDY, then ...". Usually nobody uses negation on interfaces, only in the "else" clause of a rule, where negation is OK. I even think interfaces do not need types under them... But this needs some thinking.
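
A minimal sketch in SML of the provides-interface idea (everything here is invented for illustration):

    (* Toy only: interfaces as tags, tested positively by the rules;
       negation is confined to the final fallback of a dispatch. *)
    datatype ob = Lindy of string | Pair of ob * ob

    datatype interface = LINDY | PAIRISH

    fun provides (Lindy _, LINDY)   = true
      | provides (Pair _,  PAIRISH) = true
      | provides _                  = false

    (* a rule fires on a positive interface test ... *)
    fun rule (p, x) =
          if provides (p, LINDY) andalso provides (x, LINDY)
          then Pair (p, x)
          else x    (* ... and "not" lives only in this else clause *)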

@orcmid (Owner) commented Jan 3, 2018

I'm working backwards through the comments. It was a rest day for me, so I might not say much tonight. Here are a few responses on the current Question stream.

  • Types and subtypes can be used in an interpretation. There is no prohibition on that, and the SML mockup uses datatype definitions. That does not have to impact the single/simple-type mathematical theory, where the universe of discourse is solely the collection of obs, Ob.

  • I hope to explain lindies better under Question What's the REPL/Ob input/output notation? #3. The conditions that stand out for you (and others) are a hack, and I need to explain the motivation. The hack provides a handy kind of debugging/demonstration of operation. I chose to enshrine it in the mathematical theory so that it happens the same way in every implementation and there are equivalents to the check code used in the SML mockup. There's more to come on that.

  • You are correct that the way I use negation would have to be expanded if, say, a ternary operator is introduced. I will find a way to fix that so that an extension does not require adjustment in more than one place. I trust that will be more explanatory also. I will address that at once. (Update 2018-01-03: reflected in obaptheory.txt 0.0.23.)


I looked up more on Coq and understand that the proof assistant can also be used to produce "verified" code.

The following may already be understood. I want to lay it out so there is certainty about the separation I maintain between levels of abstraction and mathematical theory versus engineered computational operation.

In my work, I am not wedded to the mathematical formulation corresponding directly to an implementation. On the contrary, I want to celebrate the difference, even when there is similar nomenclature and recognition of the correspondence. Implementations need to be satisfactory interpretations of the theory that the mathematical formulation expresses. And there will be many incidental matters, some essential, under which an implementation is still a valid one even though the theory is silent on them. I want to be certain that this is understood about the approach I am taking.

I agree about interfaces. The closest I can use in the SML mockup is abstract data types, and I have done that and use what qualifies as interfaces in SML (the signature declarations, as in OB.sig.sml and OBAP.sig.sml). In the oMiser C/C++ runtime I have in mind, COM interfaces are intended (whether on Windows or Linux platforms).

It is unfortunate that I chose to use a kind of object-method notation in choosing ob.a, ob.b, etc. I did that before I learned SML and found, there, a conveniently similar nomenclature. Even so, I have maintained the practice of explicitly drawing the connection between theoretical entities and their software-interpretation counterparts. (I see some touch-ups are needed to make those alignments a little tighter.)

Again, I thank you. As long as I have been thinking about this, putting it in a form for use by others is a challenge and I welcome all tests for understanding.

@rnd0101 (Author) commented Jan 3, 2018

Yes, I understand that the theories also serve as a kind of formal compliance test, so many negations fit there nicely, e.g. that a pair-constructor result must not be equal to its constituents. Say you license oMiser to customers: in order to be called "oMiser-based", an implementation must pass those tests, i.e. be correct.

BTW, I've recently added more to the checks, and at the moment my lack of Coq knowledge prevents making more of them. But what is interesting: some proofs are nearly trivial, while some require so-called forward-reasoning techniques.

@rnd0101 (Author) commented Jan 3, 2018

I was contemplating the relation between a theory's beauty and its utility. Today's world is so much about design, about everything being "usable", that even programming languages (and, I guess, soon mathematical theories) will have a certain "design" baseline.

So in retrospect I think some (many?) of my comments are more about that than about real problems.

In this vein, I've tried to look at the project from a different angle. There is now a set of axioms which together presumably (as I have not seen a rigorous proof, and am not expert enough to produce one) constitute at least a sound theory of computation, equivalent in power to Turing machines (and other computational models).

In the SML directory I saw a working implementation (actually quite similar to the Coq axiomatization). There are test cases (checks) there as well.

I was thinking of oMiser as a language (not Frugal): a language consisting of graphs (or maybe hypergraphs) drawn from all possible hypergraphs of obs (nodes) and their relations (as the theories describe).

Now to the interesting part. There is always some computational mechanism behind each language and some class of grammars. For example, regular expressions describe the regular languages / grammars (IIRC, Chomsky was the first to produce one possible hierarchy of such grammars). There is also a 1:1 correspondence with (deterministic) finite automata, which are nice mechanisms for deciding whether a string belongs to a regular language or not. Some very simple automata of other classes can (at first glance surprisingly) even be used to simulate Turing machines (e.g. the Game of Life, a cellular automaton).

But there is also a generative aspect of a language / grammar. Grammars may be used in reverse, as mechanisms to generate the "words" of the language.

Now to the point. Have you ever tried to generate "words" of the Miser "language"? That is, given some obs and lindies, automatically combining them in increasingly complex ways, to produce (enumerate) some words, in order of length or maybe of some metric of structural complexity?
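
For illustration, a toy enumerator in SML (invented names; enclosures ob.e are omitted for brevity); it lists all obs with exactly n pairings over a fixed stock of leaves:

    (* Toy sketch: enumerate obs by number of pairings, smallest first. *)
    datatype ob = Leaf of string | Pair of ob * ob

    val leaves = [Leaf "a", Leaf "b"]   (* some individuals / lindies *)

    (* all obs containing exactly n pairs *)
    fun obsOfSize 0 = leaves
      | obsOfSize n =
          List.concat (List.tabulate (n, fn k =>
            List.concat (List.map (fn l =>
                List.map (fn r => Pair (l, r)) (obsOfSize (n - 1 - k)))
              (obsOfSize k))))

obsOfSize 0 yields the two leaves, obsOfSize 1 the four pairs of them, and so on; concatenating obsOfSize 0, 1, 2, ... enumerates every such ob exactly once.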

I've seen work (I can't recall the authors or the circumstances now; it was many years ago) that described this approach for the lambda calculus; IIRC it used de Bruijn indices. BTW, de Bruijn's representation could be interesting in the context of obap theory (maybe something à la de Bruijn for Miser would be a very fruitful idea instead of that total ordering we discussed earlier; side note).

There has also been similar work that mutated algorithms (represented as graphs, namely trees); most of those algorithms were either trivial or complete garbage, but over myriads of experiments truly interesting specimens appeared.

Well, sorry for the lengthy comment. A more practical idea may be to use fuzzing for testing the oMiser engine. Of course, I do not imagine this is new to you, but I wanted to share just in case.

@orcmid (Owner) commented Jan 3, 2018

I was thinking of oMiser as a language (not Frugal): a language consisting of graphs (or maybe hypergraphs) drawn from all possible hypergraphs of obs (nodes) and their relations (as the theories describe).
Now to the interesting part. There is always some computational mechanism behind each language and some class of grammars. For example, regular expressions describe the regular languages / grammars (IIRC, Chomsky was the first to produce one possible hierarchy of such grammars). There is also a 1:1 correspondence with (deterministic) finite automata, which are nice mechanisms for deciding whether a string belongs to a regular language or not. Some very simple automata of other classes can (at first glance surprisingly) even be used to simulate Turing machines (e.g. the Game of Life, a cellular automaton).
But there is also a generative aspect of a language / grammar. Grammars may be used in reverse, as mechanisms to generate the "words" of the language.

The relationship to formal languages and other models of computation is worthy of several topics on their own. Here are a few observations as placeholders.

  • The canonical obs are expressible in a context-free language. This is another way of asserting that the obs are denumerable. This is useful because it puts obs in correspondence with a solvable subset of the finite strings over some alphabet. I don't believe there is a regular language for them, because pairing can't be expressed properly.
  • Similarly, context-free grammars can be given for the canonical ob expressions described in ob.txt (a small illustration follows this list).
  • There are also context-free grammars that include the sugared expressions of obs also defined in ob.txt. These might be as simple as precedence languages or very close to it, so parsing is O(n).
  • The obs correspond to binary trees with individuals at the leaves. An ob.e(x) node simply has only a "left" child. There is a known ordinal generation of such tree shapes which should be extendable to the ob case without too much difficulty.
  • With regard to the operational semantics of eval, I am not certain what is essential. An implementation of software for which the REPL is demonstrated to satisfy the functional relationship eval:ob->ob is an engineering problem. Confirmation of fidelity to the theoretical formulation is an interesting question to entertain. How and whether "proof" assistants enter into it is worthy of discussion.
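
As a small illustration of the first two bullets, here is a sketch in SML (ob.txt carries the authoritative grammar; the constructor names here are invented):

    (* A printer for canonical ob expressions.  Every ob yields a finite
       string, and the set of such strings is a context-free language. *)
    datatype ob = Ind of string       (* individual or lindy *)
                | Pair of ob * ob     (* x :: y *)
                | E of ob             (* the enclosure, ob.e(x) *)

    fun canon (Ind s)       = s
      | canon (Pair (x, y)) = "(" ^ canon x ^ " :: " ^ canon y ^ ")"
      | canon (E x)         = "ob.e(" ^ canon x ^ ")"

Read in reverse, the same three clauses are the productions of a context-free grammar, and the printer's inverse is an elementary recursive-descent parser.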

@rnd0101 (Author) commented Jan 4, 2018

Yes! Now that you mentioned CFGs, I understood what bothered me about the negations that are the subject of this issue: as you pointed out, there are context-free grammars for the ob expressions. The production rules of CFGs do not contain any negations, so axioms with negations can be eliminated or moved to a "metatheory". That is also what I meant by the term "constructive" earlier. I want to try a reformulation when I have free time.

PS. If anything, grammars are a great educational tool. I still remember Pascal books that used nice syntax diagrams.

@orcmid (Owner) commented Jan 4, 2018

The production rules of CFGs do not contain any negations, so axioms with negations can be eliminated or moved to a "metatheory".

That is not true for the applicative-expression semantics established in obaptheory. For example, some individuals are treated as special operators in the evaluation of obs as expressions in obap.ev(p, x, exp). The special operators have to be distinguished from other obs in the leading e1 of an exp of form e1 :: e2. Negation is required to unambiguously distinguish the obs that are not occurrences of special-operator individuals.

That does not have to be reflected in the CFG. To do otherwise is even more complicated than the use of negation in the definition of the universal functions, it seems to me.

I am curious what your experience will be.

@rnd0101 (Author) commented Jan 4, 2018

Well, I've pushed an antlr4 grammar of what I understand as a starting tree structure (a readme can be found in the same directory).

It's a very quick hack, so I most likely missed something. It's not a computation procedure; it's just to get some hands-on experience. For example, I do not have obap.d there, nor the binops/unops (maybe I should?).

The quickest way to see the result of the parsing is to use grun (Java). It tags the ob concepts and represents the tree with brackets: almost enough to get an idea.

Only one conclusion so far: from this angle of observation, the ob/obap theories mix structural and computational properties. I believe the grammar already automatically solves some of the structural problems, but of course it is the computation procedures that should "discover" inconsistencies (a kind of well-formed vs. valid distinction?). Also, there are some implicit assumptions, like names for ob individuals... which may or may not be one-to-one with ob identities.

Plus, I lost track on the obap side: I am not sure what else may benefit from a direct language representation. In particular, I have not checked whether the inverse representation of a calculated result can be done with the same grammar... Some predicate functions were not used (like is-singleton and is-lindy-everywhere); I think they are computational aspects, not structural ones... This grammar describes, so to say, the "brutto". Please correct me if I overlooked something important.

This was an interesting exercise, though I am a bit too tired today to work out the take-aways.

(I forgot to mention: it's my first hands-on experience with antlr, so it may be very suboptimal.)

@rnd0101 (Author) commented Jan 4, 2018

So basically the antlr4 grammar is about "tagging" preliminary structure, producing something like an abstract syntax tree (AST). (I was not interested in the surface representation, but more in the tree.)

And not all is-functions were needed for that, which means the rest are more like predicates on substructures than on nodes. I am not sure whether this delineation is useful in the theories as well.

I mean that is-lindy is about a node of the AST, while is-pure-lindy / is-lindy-everywhere are more like results of an analysis of the subtree (functions of the subtree). The is-singleton predicate is also about a subtree. These are for some reason important for the computation.
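
In SML terms (an invented sketch, not the actual definitions):

    (* is-lindy inspects a single node; is-lindy-everywhere is a
       function of the whole subtree (a fold over it). *)
    datatype ob = Lindy of string | Prim of string | Pair of ob * ob

    fun isLindy (Lindy _) = true
      | isLindy _         = false

    fun lindyEverywhere (Lindy _)     = true
      | lindyEverywhere (Pair (x, y)) = lindyEverywhere x andalso
                                        lindyEverywhere y
      | lindyEverywhere _             = false

Whatever the actual definitions in obaptheory are, the shape is the point: the first is a node test, the second a subtree analysis.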

On terminology: I do not quite understand why is-lindy-everywhere is named that way. Maybe it is because English is not my native language, but the name somehow implies is-lindy-everywhere ⇒ is-lindy, that is, something is a lindy and it is a lindy everywhere, while the intended meaning is most likely lindies-are-everywhere or something like that, because it is not correct to call the structure "a" :: "b" a lindy. Agh, and that is-pure-lindy! So a pair of lindies is a lindy? Or does it mean purely-lindies here? I am a bit confused now. A lindy is an individual, but a pair is not an individual, due to the totality axiom, right? Isn't that a contradiction, or at least unclear terminology?

(if "lindy" was like "water", then "is-pure-water" and "is-water-everywhere" probably would make sense linguistically - we do not bother about glasses and bottles (enclosures) what we point and say it's water, but "lindy" is not matter, or is it intended?)

@rnd0101 (Author) commented Jan 5, 2018

A new idea in the morning: would the theory part benefit from an equivalent reformulation, stratified into:

  1. rough structural part ("well-formed")
  2. constraints-checking part ("valid"), kind of type-safety check
  3. rules of computation

Part 1 will have no negative statements, only straightforward grammatical rules (like the antlr grammar above, but in the usual FOL= form). Part 2 will add all those predicates over structures, prohibit certain situations, etc. Part 3 will be the computation engine, which takes advantage of the "tagging" done in parts 1 and 2.

The benefits are obvious: easier understanding of the theories, streamlined implementation of input languages, and maybe more effective inference due to the unidirectional implications of part 1. Part 1 can also facilitate generating structures. Of course, the theory design would benefit from part 2 being as minimal as possible, with constraints placed in part 1 as much as possible. A toy sketch of the stratification follows.
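
In SML terms (everything invented, nothing taken from obaptheory):

    (* 1. well-formed: anything the datatype (grammar) admits. *)
    datatype ob = Lindy of string | Op of string | Pair of ob * ob

    (* 2. valid: a decidable predicate over well-formed structures,
          e.g. "operators never appear as a right operand". *)
    fun valid (Pair (_, Op _)) = false
      | valid (Pair (x, y))    = valid x andalso valid y
      | valid _                = true

    (* 3. computation: free to assume well-formedness and validity. *)
    exception Invalid
    fun eval x = if valid x then x (* rules of computation elided *)
                 else raise Invalid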

@orcmid (Owner) commented Jan 5, 2018

  1. For me, the well-formed structure part is the nature of obs as expressed in the mathematical structure ‹ob›. The domain of discourse, Ob, is given by obtheory where the only expansion mechanism is identification of further distinct individuals.
  2. For oMiser, it's unclear whether there is any constraints-checking part in any important theoretical sense. The only operational checking might be for the context-free grammar of the expression of obs in a reference notation that has a concrete text syntax usable with oFrugal and useful in writing "[o]Frugalese." It is important to have an unambiguous context-free grammar that includes all of the admissible sugar. The interpretation of acceptable strings as obs is fairly elementary. I suspect the grammar and the interpretation of well-formed strings as expressions of obs are important and useful. Is that what you see at this level?
  3. It is quite deliberate that the proposed universal functions, the related pair obap.ap(p,x) and obap.eval(exp), take Ob as the domain for the arguments p, x, and exp, and that Ob is also the range (a shape sketch follows this list). It is the case that the representation of those functions in obaptheory establishes them to be computable. I am reluctant to call that rules for computation. Realization by computation is a different matter, in my estimation. It needs to be kept separate because of the insight it gives about the distinction between mathematical theorizing and engineering in reality.
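
The shape I mean, as a hedged SML signature sketch (illustrative only; the mockup's actual signatures are in OB.sig.sml and OBAP.sig.sml):

    (* Both universal functions are total on obs:
       Ob is both the domain and the range. *)
    signature OBAP_SHAPE =
    sig
      type ob
      val ap   : ob * ob -> ob    (* obap.ap(p, x) *)
      val eval : ob -> ob         (* obap.eval(exp) *)
    end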

I have been thinking that, for (3), one takes the obaptheory (or any further extension of it) as a mathematical theory with which a computational realization is demonstrated to be a valid interpretation of that theory. I recently heard this distinction concerning engineering expressed very well by the author of Plato and the Nerd. I recommend the 15-minute video extract, starting about 8 minutes into it.

I don't intend that one should stop at obaptheory and an oFrugal that realizes it, using a carefully verified oMiser run-time. This is the launching pad for demonstrating the achievement of higher levels of abstraction represented above the oMiser level. At some point, oMiser is left behind to some degree.

For now, I want to nail down this level 3 at the oFrugal/oMiser level.

@orcmid (Owner) commented Jan 5, 2018

I agree that the predicates concerning lindies, beyond obap.is-lindy(x), don't have names that indicate what this is about. I am looking at this in respect to Question #2, and we can see what works better there.

@rnd0101 (Author) commented Jan 5, 2018

Very interesting video, thanks!

Formal logic is just that: rational. I'd say it's dead; it does not produce any new information that is not already in the system. Dialectics (I've read somewhere that Aristotle valued it more than the "formal logic" of his time, and even considered it dangerous for humanity) is one of the logics of development and creativity, IMHO.

I see what you presented above as a more zoomed-out picture. I have my Eternal Programming goals all along (hopefully that is OK with you), so I am trying to figure out how to deal with the "dead logic" / machinery here first. I have not yet mastered the obap side, but I have already tried to represent that machinery in two ways (Coq and a grammar), and looked into your SML one. Today I was wondering whether writing it all in Prolog would give better insights, so that I can at last run some computations.

As for Plato vs. the Nerd, I am kind of on the side of the latter. Math is art & design for me. Math just "has a habit" of normalizing abstractions so well that it is very rare for one to create something new under the moon (this echoes our earlier discussion of semigroup/monoid). Basically, all the engineering behind computer science is just a way to drive along the roads of "dead" formal logic (maybe it is better to say formal logic became magic technology: it just serves, almost invisibly). In this sense, a low-level theory, once proven sound, can be left behind, and construction can move on to the upper levels with confidence in the floors underneath.

In a way, I do not yet see a clear distinction between engineering and math at these levels. We do not need to cope with established practices and standards. Even hardware is a couple of levels of abstraction below all that. So one worthy additional goal I see is for each level to impose minimal cognitive load on those interested in the upper ones.

Next I want to see how to understand level 3 and the obap computing machinery (or how to get an implementation of it to experiment with). Perhaps it is even possible to (temporarily) drop all those constraints that prevent cyclic structures, as that makes implementation easier.

@rnd0101 (Author) commented Jan 7, 2018

You have wonderful writings, e.g. this one, which explain a lot of the motivations behind Miser to me. My usual approach (which, I apologize, caused a flood of questions) is from top to bottom, from philosophy down to the bits. So when I do not see / cannot find the top (philosophy) part, I go into divergent-thinking / idea-generation mode.

As I stated in another post, acquaintance with the Miser project is learning for me, so I have had almost no opinions on the theory-model-reality axis; instead I have been on a theory-to-theories one, reconciling what I know with what is new here.

@orcmid (Owner) commented Feb 2, 2018

Conclusion: the use of negations in the axioms, with predicates that are all decidable, does not seem to be much of a problem.

There are reasons for not limiting ourselves to what constructivists might prefer, especially if it means functions are only the computable ones at the outset. We need a frame big enough to consider that there are non-computable functions, conceptually, even if they are inaccessible computationally.

This and other dialog here may be useful to know about. I am closing this Issue so that further conversation will be in a place where these topics can be treated in a first-class manner, perhaps beyond Question #7.

@orcmid orcmid closed this as completed Feb 2, 2018
@orcmid orcmid added the discussion Project discussions label Jan 16, 2021