Spring 2021

Towards a Theory of Learning Inductive Invariants

Monday, Mar. 29, 2021, 9:30 am – 10:00 am



Yotam Feldman (Tel Aviv University)



SAT-based algorithms for automatically finding inductive invariants, such as McMillan's interpolation-based algorithm and Bradley's IC3/PDR, have revolutionized hardware and software verification, but their complexity is not well understood. I will talk about our attempts to shed light on such algorithms by drawing on ideas from the theory of exact concept learning with queries. Using a black-box algorithmic model, we show that rich SAT queries, as performed in PDR, can be exponentially more efficient than approaches such as ICE learning, which query only for the existence of counterexamples to the inductiveness of candidates. This result also highlights a difference from classical concept learning: learning from counterexamples to induction is harder than classical exact learning from labeled examples, demonstrating that the convergence rate of counterexample-guided inductive synthesis depends on the form of counterexamples. Nevertheless, we leverage concept learning algorithms by Angluin and Bshouty to obtain the first polynomial complexity result for an interpolation-based invariant inference algorithm, efficiently inferring monotone and almost-monotone DNF invariants with access to a SAT solver as an oracle. Joint work with Neil Immerman, Mooly Sagiv, Sharon Shoham, and James R. Wilcox.
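To make the notion of a "counterexample to the inductiveness of a candidate" concrete, here is a minimal illustrative sketch, not taken from the talk: a brute-force check on a hypothetical 3-bit transition system. A candidate invariant fails to be inductive when some state satisfying it can step to a state violating it; real tools would pose this as a SAT query rather than enumerate states.

```python
from itertools import product

def successors(state):
    """Toy transition relation (hypothetical, for illustration only):
    treat (x, y, z) as a 3-bit counter and increment, saturating at 7."""
    x, y, z = state
    n = min(((x << 2) | (y << 1) | z) + 1, 7)
    return [((n >> 2) & 1, (n >> 1) & 1, n & 1)]

def find_cti(candidate):
    """Return a pair (s, s') where candidate(s) holds, s steps to s',
    and candidate(s') fails -- a counterexample to induction (CTI).
    Return None if the candidate is inductive for this transition relation."""
    for s in product((0, 1), repeat=3):
        if candidate(s):
            for s2 in successors(s):
                if not candidate(s2):
                    return (s, s2)
    return None

# Candidate "counter value <= 3" is not inductive: state 3 steps to 4.
cti = find_cti(lambda s: ((s[0] << 2) | (s[1] << 1) | s[2]) <= 3)
# The trivially true candidate is inductive: no CTI exists.
assert find_cti(lambda s: True) is None
```

An ICE-style learner sees only such (s, s') pairs (plus positive and negative examples) and must repair its candidate; the talk's point is that richer SAT queries, as in PDR, can extract exponentially more information per query than this interface allows.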

Slides (PDF, 2.54 MB)