[math-fun] Simplifying IF
Here's a Logic & Programming challenge: Simplify IF expressions.

The general problem is NP-complete, since nested IF expressions can easily encompass Satisfiability problems. It might be even harder, since simplifying an IF expression down to True or False (when applicable) would require a short (or at least feasible) proof of equivalence, while SAT requires only an example.

Nevertheless, it's reasonable to ask for partial solutions. The "simpler-than" ordering includes a lot of beholder input, and also depends on the application at hand.

I'll offer a few "rules" to start the bidding.

If(A,B,C), with A a predicate and B, C of arbitrary types (usually the same), has the value B when A is true, and C when A is false. Other notations include A?B:C, Choose(A,B,C), Boole(A,C,B), with Choose and Boole sometimes interpreted per individual bit.

If(True,B,C) -> B
If(False,B,C) -> C
If(A,B,B) -> B
If(A,B,~B) -> A xor B (B a predicate)
If(A,B,C) <-> If(~A,C,B)
If(A,B,C) <-> If(A, A&B, (~A)&C)
If(A,B,C) <-> B + If(A,0,C-B) (B and C numerical)
If(A,BvC,B&C) <-> Majority(A,B,C) (B and C predicates)
If(A<B,A,B) <-> Min(A,B)
If(A&B,C,D) <-> If(A,If(B,C,D),D)

A similar challenge can be made for Loop expressions. We've adopted a convention for some binary operations, to represent a variable-length version as a kind of summation, using a larger or boldface version of the operator, with upper & lower limits specifying a range, and an expression for argument_k. Would it be useful to change to a generic Loop expression? Most of the simple manipulations for sums and products are applicable to Loops too.

Rich
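For concreteness, here is a minimal sketch of a rewrite-style simplifier for a few of these rules, assuming a tuple representation ('if', test, then, else) in Python; the representation and the choice of rules are illustrative, not part of the original post.

def simplify(e):
    """Apply a few constant-folding rules to nested If-expressions, bottom-up.
    Leaves are True, False, or variable names (strings)."""
    if not (isinstance(e, tuple) and e and e[0] == 'if'):
        return e                       # leaf: constant or variable
    _, a, b, c = e
    a, b, c = simplify(a), simplify(b), simplify(c)
    if a is True:                      # If(True,B,C)  -> B
        return b
    if a is False:                     # If(False,B,C) -> C
        return c
    if b == c:                         # If(A,B,B)     -> B
        return b
    if b is True and c is False:       # If(A,True,False) -> A
        return a
    return ('if', a, b, c)

# Example: If(If(P,True,False), Q, Q) simplifies to Q.
print(simplify(('if', ('if', 'P', True, False), 'Q', 'Q')))   # prints Q

Rules stated as one-way productions fold away directly like this; the two-way <-> equivalences need some cost measure to pick a direction, which is where the beholder comes in.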
The "Boyer Benchmark" in the original Gabriel Lisp benchmarks is an if-simplifying benchmark. It's quite interesting -- I highly recommend looking at it to see what it does. The article below debugs the benchmark code, so it at least does what it is supposed to do -- check that tautologies are tautologies. I don't have a reference on Boyer's tautology algorithm itself. http://home.pipeline.com/~hbaker1/BoyerB.ps.gz At 11:37 AM 3/18/2011, rcs@xmission.com wrote:
Here's a Logic & Programming challenge: Simplify IF expressions. ...
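The connection between If-manipulation and tautology checking can be sketched roughly as follows, on the same tuple representation as above: case-split on the test variable and check that both cases reduce to True. This is only an illustration of the general idea, not Boyer's actual algorithm or the benchmark code.

def subst(e, var, val):
    """Replace every occurrence of variable name `var` with the constant `val`."""
    if e == var:
        return val
    if isinstance(e, tuple) and e and e[0] == 'if':
        return ('if',) + tuple(subst(x, var, val) for x in e[1:])
    return e

def tautology(e):
    """True iff e evaluates to True under every assignment of its variables."""
    if e is True:
        return True
    if e is False:
        return False
    if isinstance(e, str):
        return False                   # a free variable can be False
    _, a, b, c = e
    if a is True:
        return tautology(b)
    if a is False:
        return tautology(c)
    if isinstance(a, tuple):           # nested test: If(If(p,q,r),b,c) <-> If(p,If(q,b,c),If(r,b,c))
        p, q, r = a[1], a[2], a[3]
        return tautology(('if', p, ('if', q, b, c), ('if', r, b, c)))
    return (tautology(subst(e, a, True)) and   # split on the test variable
            tautology(subst(e, a, False)))

# If(A, True, If(A, False, True)) is True whatever A is:
print(tautology(('if', 'A', True, ('if', 'A', False, True))))   # prints True

Converting AND/OR/NOT into If first (as mentioned later in the thread) turns arbitrary propositional formulas into this form.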
I think you got your xor production backwards; should be If(A,~B,B).

I'm not sure what it takes to play. For inspiration, you might look at the first few hundred theorems at Metamath.org. How about If(A, true, false) <-> A?

What would be nice would be to come up with a complete axiom set for If, so that you could prove any tautology from the axioms. Again, Metamath might suggest answers for this.

On Fri, Mar 18, 2011 at 2:37 PM, <rcs@xmission.com> wrote:
Here's a Logic & Programming challenge: Simplify IF expressions. ...
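The correction is easy to confirm by enumerating all assignments; a quick check (an added illustration, not from the thread):

from itertools import product

If = lambda a, b, c: b if a else c

def equivalent(f, g, nvars=2):
    return all(f(*v) == g(*v) for v in product([False, True], repeat=nvars))

print(equivalent(lambda a, b: If(a, not b, b), lambda a, b: a != b))  # True:  If(A,~B,B) is A xor B
print(equivalent(lambda a, b: If(a, b, not b), lambda a, b: a != b))  # False: the original production was backwards
print(equivalent(lambda a, b: If(a, True, False), lambda a, b: a))    # True:  If(A,true,false) <-> A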
Rich, et al:

Actually, one of the hardest parts of IF simplification -- in compilers, at least -- is the fact that various things aren't allowed to be evaluated if the boolean fails. With "and" and "or", you aren't even allowed to evaluate further than necessary to conclude T or F.

This isn't mere pedantry. A good deal of modern computer hardware is devoted to "optimistic/speculative execution", in which _both_ arms of the conditional are evaluated in parallel and are available for use when the boolean part finishes. Any side-effect of the losing arm of the execution is reversed (or simply never stored), and hence thrown away. Speculative execution is coming under some criticism these days because it tends to waste _power_ & _energy_, and hence can increase the heat load that needs to be removed.

Compilers still tend to push the limit when moving stuff out of conditionals & loops. Compilers are willing to compute indices of arrays _in advance_ of checking for the end of the array, so modern compilers may have to allocate arrays with a certain amount of "margin" to keep from causing a page fault or segmentation fault if some elements are accessed beyond the end of the array. IEEE "NaN's" were developed to avoid interrupting floating point units, with the hope that these "numbers" would later be discarded.

I don't know that there is a complete & consistent answer for these "partial function" issues. As hardware continues to get more aggressive, the boundaries for speculation continue to get pushed back.

At 11:37 AM 3/18/2011, rcs@xmission.com wrote:
Here's a Logic & Programming challenge: Simplify IF expressions. ...
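The evaluation-order point in miniature (an added example, not from the thread): a short-circuit "and" may never touch its second argument, while an If written as an ordinary function evaluates both arms before the test is even consulted -- much as speculative hardware does.

If = lambda a, b, c: b if a else c     # ordinary function: arguments already evaluated on entry

xs = [1, 2, 3]
i = 10

print(i < len(xs) and xs[i] > 0)       # False: short-circuit, xs[i] is never evaluated

try:
    print(If(i < len(xs), xs[i] > 0, False))
except IndexError:
    print("eager If evaluated xs[i] anyway and raised IndexError")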
="rcs@xmission.com" <rcs@xmission.com> Simplify IF expressions.
Very interesting ideas!

Of course, as you well know, If(A,B,C) <-> A&B + (~A)&C. So, if you can simplify and/or expressions you can simplify IF expressions (mod, as you put it, the eye of the beholder).

An amusing idea might be the converse: rewriting logic expressions entirely in If (writing 0 for false and 1 for true). So, extending your list:

A&B <-> If(A,B,0) <-> If(A,B,A)
~A <-> If(A,0,1)
hence A+B <-> If(If(A,0,1),If(B,0,1),0,1) (is there something "simpler"?)

And of course you can also make a more compact syntax for "case" branches similar to Lisp's COND by converting the else-ifs into nested Ifs.

Another interesting thing is repeated substitution

0 <-> If(A,0,0) <-> If(A,If(A,0,0),If(A,0,0)) <-> ...

thus allowing 0 to be defined as a sort of fixpoint of the If function.
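These encodings check out by enumeration; a small added verification (writing 0 for false and 1 for true, as above), plus the nested-If expansion of COND:

from itertools import product

If = lambda a, b, c: b if a else c     # 0 is falsy, 1 is truthy

for a, b in product([0, 1], repeat=2):
    assert If(a, b, 0) == (a & b)      # A&B <-> If(A,B,0)
    assert If(a, b, a) == (a & b)      # A&B <-> If(A,B,A)
    assert If(a, 0, 1) == 1 - a        # ~A  <-> If(A,0,1)

# Lisp-style COND as nested Ifs: the first true test wins.
cond2 = lambda p1, e1, p2, e2, default: If(p1, e1, If(p2, e2, default))
print("encodings check out;", cond2(0, "first", 1, "second", "third"))   # encodings check out; second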
Other notations include A?B:C, Choose(A,B,C), Boole(A,C,B),
In a domain-reduction based constraint reasoner I built a few years back, I adopted ?(A,B,C) as a synonym for If(A,B,C).

<aside>
That system represented domains as intervals. The Boolean interval was 0..1, so [0 0] was false, [1 1] was true and [0 1] was maybe. The Boolean operations were then just a subset of the interval arithmetic (although, oddly, "implies", often written in logic as A => B, turns out to be A <= B arithmetically). (Obviously this can also be fuzzy, if you swing that way.)

This may seem academic, but the unification was hugely useful in commercial decision support applications, where the business modelers could just code up their rules and policies as a bunch of composed functions that described the required state, rather than "programming commands" in some icky opaque imperative way.
</aside>
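A sketch of the interval-Boolean idea (the representation, the names, and the hull rule for ?(A,B,C) are guesses for illustration, not Marc's actual system): a value is an interval over {0,1}, and the logical operations fall out of interval arithmetic.

# Interval Booleans: (0, 0) is false, (1, 1) is true, (0, 1) is "maybe".
FALSE, TRUE, MAYBE = (0, 0), (1, 1), (0, 1)

def i_and(a, b):                 # interval product of the endpoints
    return (a[0] * b[0], a[1] * b[1])

def i_or(a, b):                  # endpoint-wise max
    return (max(a[0], b[0]), max(a[1], b[1]))

def i_not(a):                    # 1 - x reverses the endpoints
    return (1 - a[1], 1 - a[0])

def i_implies(a, b):             # A => B, i.e. ~A v B; for crisp values this is just A <= B
    return i_or(i_not(a), b)

def i_if(a, b, c):               # ?(A,B,C): hull of both arms when the test is undetermined
    if a == TRUE:
        return b
    if a == FALSE:
        return c
    return (min(b[0], c[0]), max(b[1], c[1]))

print(i_and(MAYBE, TRUE))        # (0, 1): still undetermined
print(i_implies(MAYBE, TRUE))    # (1, 1): anything implies true
print(i_if(MAYBE, TRUE, FALSE))  # (0, 1): can't commit to either branch yet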
A similar challenge can be made for Loop expressions... using a larger or boldface version of the operator... Would it be useful to change to a generic Loop expression? Most of the simple manipulations...are applicable to Loops too.
Yes, I think this would be an excellent idea to pursue!

Back in the early 70's, influenced by exposure to APL, Macsyma etc, I experimented with notations that tried to meld math and programming, in which I wrote Loop as a capital lambda. Unfortunately I've seen that also used for the more specialized operator that is to inclusive-or what capital sigma is to +.

Earlier still, it always seemed somehow wrong that my geometry textbook was full of case arguments in proofs, but the analogous pattern for "formulas" was missing from algebra.

And later, it seemed something of a kludge to select a member of a set of expressions by summing them all together and then multiplying all but one by zero. (By the way, that zero has to be a "damn strong" zero, since it often needs to annihilate expressions that would otherwise be x/0 etc. See also Henry Baker's comments in this thread on speculative execution.)

Yes, why not have a notation for writing algorithms that was more friendly to algebraic-style manipulations?

Or maybe some strange hybrid of APL and Lisp? So-called "concatenative" programming languages such as Forth, Postscript and Joy also come to mind (especially Joy).
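The "sum everything and multiply all but one by zero" selection trick, and the need for a strong zero, in a small added illustration: with ordinary eager arithmetic every term gets evaluated before its zero coefficient can annihilate it.

# Selecting exprs[k](x) by summing coefficient-weighted terms.
exprs = [lambda x: x + 1, lambda x: 1 / x, lambda x: x * x]   # deferred as thunks

def weak_select(k, x):
    # Ordinary zero: every term is evaluated, then multiplied by 0 or 1.
    return sum((i == k) * e(x) for i, e in enumerate(exprs))

def strong_select(k, x):
    # Emulating a "strong zero": terms with coefficient 0 are never evaluated at all.
    return sum(e(x) for i, e in enumerate(exprs) if i == k)

print(strong_select(0, 0.0))    # 1.0: picks x+1; the 1/x term is never touched
try:
    print(weak_select(0, 0.0))
except ZeroDivisionError:
    print("weak zero: the 1/x term was evaluated anyway and blew up")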
Simpler or: If(A,1,If(B,1,0)).

On Fri, Mar 18, 2011 at 5:46 PM, Marc LeBrun <mlb@well.com> wrote:
="rcs@xmission.com" <rcs@xmission.com> Simplify IF expressions.
Very interesting ideas!
Of course, as you well-know, If(A,B,C) <-> A&B + (~A)&C. So, if you can simplify and/or expressions you can simplify IF expressions (mod, as you put it, the eye of the beholder).
An amusing idea might be the converse: rewriting logic expressions entirely in If (writing 0 for false and 1 for true). So, extending your list:
A&B <-> If(A,B,0) <-> If(A,B,A) ~A <-> If(A,0,1) hence A+B <-> If(If(A,0,1),If(B,0,1),0,1) (is there something "simpler"?)
And of course you can also make a more compact syntax for "case" branches similar to Lisp's COND by expanding converting the else-ifs into nested Ifs.
Another interesting thing is repeated substitution
0 <-> If(A,0,0) <-> If(A,If(A,0,0),If(A,0,0)) <-> ...
Thus allowing 0 to be defined as a sort of fixpoint of the If function.
Other notations include A?B:C, Choose(A,B,C), Boole(A,C,B),
In a domain-reduction based constraint reasoner I built a few years back, I adopted ?(A,B,C) as a synonym for If(A,B,C).
<aside>
That system represented domains as intervals. The Boolean interval was 0..1, so [0 0] was false, [1 1] was true and [0 1] was maybe. The Boolean operations were then just a subset of the interval arithmetic (although, oddly, "implies", often written in logic as A => B, turns out to be A <= B arithmetically). (Obviously this can also be fuzzy, if you swing that way.)
This may seem academic, but the unification was hugely useful in commercial decision support applications, where the business modelers could just code up their rules and policies as a bunch of composed functions that described the required state, rather than "programming commands" in some icky opaque imperative way.
</aside>
A similar challenge can be made for Loop expressions... using a larger or boldface version of the operator... Would it be useful to change to a generic Loop expression? Most of the simple manipulations...are applicable to Loops too.
Yes, I think this would be an excellent idea to pursue!
Back in the early 70's, influenced by exposure to APL, Macsyma etc, I experimented with notations that tried to meld math and programming, in which I wrote Loop as a capital lambda. Unfortunately I've seen that also used for the more specialized operator that is to inclusive-or what capital sigma is to +.
Earlier still, it always seemed somehow wrong that my geometry textbook was full of case arguments in proofs, but the analogous pattern for "formulas" were missing from algebra.
And later, something of a kludge to select a member of a set of expressions by summing them all together and then multiplying all but one by zero. (By the way, that zero has to be a "damn strong" zero, since it often needs to annihilate expressions that would otherwise be x/0 etc. See also Henry Baker's comments in this thread on speculative execution).
Yes, why not have a notation for writing algorithms that was more friendly to algebraic-style manipulations?
Or maybe some strange hybrid of APL and Lisp? So-called "concatenative" programming languages such as Forth, Postscript and Joy also come to mind (especially Joy).
="Marc LeBrun" <mlb@well.com> operator that is to inclusive-or what capital sigma is to +
Doh, should have been "...is to & what..."
="Allan Wechsler" <acwacw@gmail.com> Simpler or: If(A,1,If(B,1,0)).
Allan, thanks! Interesting that the simplest way to write a commuting operator isn't symmetrical!
Converting AND/OR/NOT to IF is precisely what Bob Boyer's theorem prover did to prove tautologies.

At 02:46 PM 3/18/2011, you wrote:
An amusing idea might be the converse: rewriting logic expressions entirely in If (writing 0 for false and 1 for true). So, extending your list:
A&B <-> If(A,B,0) <-> If(A,B,A) ~A <-> If(A,0,1) hence A+B <-> If(If(A,0,1),If(B,0,1),0,1) (is there something "simpler"?)
participants (4)
- Allan Wechsler
- Henry Baker
- Marc LeBrun
- rcs@xmission.com