Adrian and Chris, (01)
AW
> What are the computational complexity consequences?
> Moving from P to NP perhaps? (02)
All rules of inference are at the metalevel because they make statements
about logic. If you're writing a program to prove theorems, you will
translate that metalevel definition to executable code. Depending on
your programming skills, it can be quite efficient. (03)
If you have a problem that can be solved in polynomial time, the
nonmonotonic rules won't go beyond polynomial time. Conversely, if the
problem is NP-complete, it will be NP-complete with any version of logic.
(Using a restricted version of logic can't solve any problem faster --
it just makes some kinds of problems impossible to state.) (04)
CM
> I do believe though that, typically, the addition of nonmonotonic rules
> adds complexity, as they often involve consistency checking and hence
> bring with them at least the complexity of the corresponding consistency
> problem. (I'm no kind of expert on this topic.) (05)
That is a question that has been kicked around in discussions
of nonmonotonic logic. In Ray Reiter's version of default logic,
a typical nonmonotonic rule of inference has the form (06)
If P has been deduced from the current axioms,
and if Q is consistent with the axioms,
then conclude R. (07)
That implies that every execution of a default rule such as this
would require a subproof to check whether not-Q is provable: if that
subproof fails, Q is consistent with the axioms. (08)
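The shape of such a rule can be sketched in a few lines of Python. This is
only an illustration: the prover and the rule names are made up for the
example, not taken from any particular system.

```python
# Sketch of a Reiter-style default rule.  The "prover" is a toy:
# a goal is provable iff it is literally among the axioms.  The point
# is the control structure: applying the rule needs one subproof for P
# and a second subproof (the consistency check) for not-Q.

def proves(axioms, goal):
    """Toy subproof: succeed iff the goal is already an axiom."""
    return goal in axioms

def apply_default(axioms, p, q, r):
    """If P is provable, and not-Q is NOT provable (so Q is consistent),
    conclude R and add it to the axioms."""
    if proves(axioms, p) and not proves(axioms, ("not", q)):
        return axioms | {r}
    return axioms

# The classic example: birds fly by default.
axioms = {"bird(tweety)"}
axioms = apply_default(axioms, "bird(tweety)",
                       "flies(tweety)", "flies(tweety)")
print("flies(tweety)" in axioms)   # True: not-flies(tweety) was not provable
```

Note that the consistency check does exactly as much work as an ordinary
subproof: it attempts to prove not-Q and reports whether the attempt failed.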
This seems rather complex compared to the more common rules like
modus ponens, but it is no more complex than some of the rules
in Gentzen's version of natural deduction, which also require
a subproof. And Gentzen's rules don't increase the complexity
class of a problem. (09)
In practice, even modus ponens will trigger subproofs. Consider,
for example, an implication of the following form: (010)
If P & Q & R, then S. (011)
In order to deduce S, this rule would trigger a subproof
for P, another subproof for Q, and another subproof for R. (012)
If you have to compute a subproof, it doesn't make a difference
in execution time whether your goal is to prove P or to show
that a proof of P fails. In either case, you do the same thing. (013)
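The same point can be shown with a minimal backward-chaining sketch,
assuming a toy fact base and rule table (both invented for the example):
a rule with a conjunctive antecedent launches one recursive subproof per
conjunct, and a failed subproof costs the same search as a successful one.

```python
# Minimal backward chaining over a toy knowledge base.
# The rule table encodes "If P & Q & R, then S".

facts = {"P", "Q", "R"}
rules = {"S": ["P", "Q", "R"]}   # S follows from the conjunction P & Q & R

def prove(goal):
    """Succeed if the goal is a fact, or if every conjunct in the body
    of a rule for it succeeds in its own recursive subproof."""
    if goal in facts:
        return True
    body = rules.get(goal)
    if body is None:
        return False
    return all(prove(subgoal) for subgoal in body)   # one subproof each

print(prove("S"))   # True: subproofs for P, Q, and R all succeed
print(prove("T"))   # False: the search fails, after the same kind of work
```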
AW
> But the implicit baseline here is roughly speaking SQL, or
> HornClause+NAF.
>
> Now replace that with *classical* negation plus a 2nd order clause
> to simulate NAF.
>
> I'm guessing the computational complexity increases. (014)
SQL queries never go beyond polynomial time, even with NAF. In fact,
the upper bound of the exponent in the polynomial is the number of
existential quantifiers in the query. But the time can be reduced
if one or more of the fields have indexes -- simple queries with
indexing can often be evaluated in logarithmic time. (015)
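To see why such queries stay polynomial, consider the naive nested-loop
evaluation of a join. Each joined table reference adds one more loop over
the data, so a query over k table references runs in O(n^k) for tables of
n rows. The table and query below are invented for illustration.

```python
# Naive nested-loop evaluation of a conjunctive SQL-style query:
#   SELECT x, z WHERE parent(x, y) AND parent(y, z)
# Two table references -> two nested loops -> O(n^2) for n rows.
# (An index on the first column would replace the inner loop
# with a lookup, which is where the logarithmic speedup comes from.)

parent = [("ann", "bob"), ("bob", "cal"), ("ann", "dee")]

def grandparents(table):
    return {(x, z)
            for (x, y1) in table        # loop over the first conjunct
            for (y2, z) in table        # loop over the second conjunct
            if y1 == y2}                # the join condition

print(grandparents(parent))   # {('ann', 'cal')}
```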
Horn-clause theorem provers with classical negation are usually slower
than Prolog with NAF -- typically by a factor of about two. I have never
heard of anyone using classical negation to simulate NAF. (016)
Summary: The question about NP-completeness does not depend on
whether your rules require subproofs. If all your subproofs take
polynomial time, the whole proof will take polynomial time. But if
any of them are NP-complete, then it's all NP-complete. (017)
John (018)
_________________________________________________________________
Message Archives: http://ontolog.cim3.net/forum/ontolog-forum/
Config Subscr: http://ontolog.cim3.net/mailman/listinfo/ontolog-forum/
Unsubscribe: mailto:ontolog-forum-leave@xxxxxxxxxxxxxxxx
Shared Files: http://ontolog.cim3.net/file/
Community Wiki: http://ontolog.cim3.net/wiki/
To join: http://ontolog.cim3.net/cgi-bin/wiki.pl?WikiHomePage#nid1J (019)
