# Program of AVM / COST Meeting in Lugano

At the University of Lugano, co-located with FMCAD 2010.

Like FMCAD, the event takes place in the Aula Magna convention hall of the University of Lugano.

## Monday, October 18, 2010

08:15 **REGISTRATION** in front of the conference hall

09:00 (15min) **Viktor Kuncak**: Welcome and Opening

09:15 (45min) **Sumit Gulwani**: Program Synthesis for Automating Education
There has been a recent surge in research on program synthesis, and it has been applied to a wide variety of scenarios, such as the discovery of new algorithms, easing software development tasks, and end-user programming. However, one of the most revolutionary impacts of program synthesis technology could be on problem solving in the context of automating education. If a computer is smart enough to automatically solve problems in subjects such as mathematics, physics, and chemistry, the impact it can have on education is mind-boggling. A student would have an automated interactive tutor 24 hours a day: the tutor can generate new problems of a measured difficulty level, give hints during problem solving, automatically grade solutions, point out errors, and suggest fixes. In this talk, I will describe the role that logic, programming languages, and program synthesis are set to play in this initiative. I will primarily focus on our recent work on the synthesis of high-school geometry constructions.

10:00 (30min) **BREAK**

10:30 (30min) **Jean-Francois Raskin**: Compositional Algorithms for LTL Synthesis
We provide two compositional algorithms to solve safety games and apply them to obtain compositional algorithms for the LTL synthesis problem. We have implemented these new compositional algorithms, and we demonstrate that they are able to handle full LTL specifications that are orders of magnitude larger than the specifications that can be treated by current state-of-the-art algorithms. (Joint work with Emmanuel Filiot and Naiyong Jin, to be presented at ATVA 2010.)
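
The safety games at the core of this approach can be solved by the classical fixpoint computation; below is a minimal, non-compositional sketch of that baseline (the game encoding and names are illustrative, not taken from the paper):

```python
# Classical fixpoint for safety games: repeatedly remove states from which
# player 0 (the "safety" player) cannot force staying inside `safe`.
def solve_safety_game(states, edges, player, safe):
    """states: iterable; edges: dict state -> list of successors;
    player: dict state -> 0 or 1; safe: set of safe states.
    Returns player 0's winning region (states from which player 0
    can stay in `safe` forever)."""
    win = {s for s in states if s in safe}
    changed = True
    while changed:
        changed = False
        for s in list(win):
            good = [t for t in edges.get(s, []) if t in win]
            # player 0 needs *some* surviving successor; player 1 must have
            # *no* escape, i.e. every successor must survive
            ok = bool(good) if player[s] == 0 else len(good) == len(edges.get(s, []))
            if not ok:
                win.discard(s)
                changed = True
    return win
```

The compositional algorithms of the talk improve on this monolithic computation by exploiting the structure of the specification; the sketch only shows the underlying game-solving step.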

11:00 (30min) **Ashutosh Gupta**: Constraint solving for branching counterexamples
Applying the CEGAR technique to the verification of multi-threaded programs, programs with procedures, and higher-order functional programs produces counterexamples with branching, which are more complex than the sequences of program statements produced as counterexamples when verifying sequential programs. The branching makes currently available refinement techniques, in particular interpolation, inapplicable for computing refinements from such counterexamples. In this talk, I will present a formulation of branching counterexamples as Horn clauses, together with an algorithm to solve the corresponding constraints. (Joint work with Andrey Rybalchenko.)

11:30 (30min) **Florian Lonsing**: Practical Aspects of Dependency Schemes in QBF Solving
Prenex conjunctive normal form (PCNF) is a common format for quantified Boolean formulae (QBF). In practice, search-based QBF solvers typically suffer from the quantifier prefix of PCNFs because it restricts the set of admissible orderings of variable assignments: in general, variables in the prefix have to be assigned from left to right. Dependency schemes are an approach to overcoming this problem. Built on a theoretical framework based on the semantics of QBF, dependency schemes are relations over the set of variables which make it possible to identify independence. Independence between variables relaxes the strict linear prefix ordering, so that a partial ordering can be obtained. This can grant QBF solvers more freedom during the solution process. Specific dependency schemes can be constructed by a syntactic analysis of a PCNF. We implemented this approach in our QBF solver DepQBF, which was placed first in QBFEVAL'10. We will briefly introduce dependency schemes and focus on practical aspects that arise from combining dependency schemes with the DPLL algorithm for QBF, which is part of DepQBF. In particular, we consider the resolution rule for QBF, clause/cube learning, and the detection of unit literals. Due to the general nature of dependency schemes, applications are not limited to DPLL. We therefore also want to address further independent research directions and open questions related to preprocessing QBFs.
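
To illustrate the idea in miniature (this toy is *not* DepQBF's actual standard dependency scheme), one can start from the trivial relation, where a variable depends on every differently quantified variable to its left, and relax it syntactically by keeping only dependencies between variables that share a clause:

```python
# Toy dependency relations over a PCNF. prefix: list of (quantifier, var)
# pairs, left to right; clauses: lists of signed integer literals.
def trivial_deps(prefix):
    """Trivial scheme: v depends on every differently quantified
    variable to its left in the prefix."""
    deps = {}
    for i, (q, v) in enumerate(prefix):
        deps[v] = {u for (p, u) in prefix[:i] if p != q}
    return deps

def shared_clause_deps(prefix, clauses):
    """Crude syntactic relaxation (illustrative only): keep a trivial
    dependency (v, u) only if v and u occur together in some clause."""
    deps = trivial_deps(prefix)
    return {v: {u for u in ds
                if any(v in map(abs, c) and u in map(abs, c) for c in clauses)}
            for v, ds in deps.items()}
```

For the PCNF with prefix forall x1 exists x2 forall x3 and clauses `[1, -2]`, `[3]`, the relaxation drops the dependency of x3 on x2, so x3 may be assigned before x2, exactly the extra freedom a dependency-aware solver exploits.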

12:00 (120min) **LUNCH** (MC discussions may start)

14:00 (100min) **MC Meeting for COST MC Members** *For the rest: informal demos and posters in front of the conference hall.*

15:30 (30min) **BREAK**

16:00 (30min) **Simone Fulvio Rollini**: An Efficient and Flexible Approach to Resolution Proof Reduction
A propositional proof of unsatisfiability is a certificate of the
unsatisfiability of a Boolean formula. Resolution proofs, as generated by
modern SAT solvers, find application in many verification techniques
where the size of the proofs considerably affects efficiency. This paper
presents a new approach to proof reduction, situated among the purely
post-processing methods. The main idea is to reduce the proof size by
eliminating redundant occurrences of pivots along the proof paths.
This is achieved by matching local contexts and rewriting them into simpler
ones. In our approach, rewriting can be easily customized in the way
local contexts are matched, in the number of transformations to be
performed, and in how the rewriting rules are applied. We provide an
experimental evaluation of our technique on a set of SMT and SAT
benchmarks, which shows a considerable reduction in proof size.
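
For reference, the proofs being reduced are built from applications of the resolution rule, which can be sketched as follows (clauses as sets of signed integer literals, with the negation of p written -p; this encoding is a common convention, not the paper's notation):

```python
# Resolution rule over clauses-as-sets of signed integer literals.
def resolve(c1, c2, p):
    """Resolve c1 (containing pivot p) with c2 (containing -p):
    the resolvent is the union of both clauses minus p and -p."""
    assert p in c1 and -p in c2, "pivot must occur positively in c1, negatively in c2"
    return (c1 - {p}) | (c2 - {-p})
```

For example, `resolve({1, 2}, {-1, 3}, 1)` yields `{2, 3}`; a pivot recurring along a proof path is the kind of redundancy the reduction targets.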

16:30 (30min) **Andreas Holzer**: Guided Whitebox Testing
We describe how coverage criteria specified in the test specification language FQL can be translated into sets of finite automata in order to efficiently drive test input generation as well as coverage measurement. The alphabet of these automata ranges over state predicates and sets of edges in the control-flow automaton representing the program under scrutiny. We use these automata to guide the reachability analysis of a software model checker. We implemented our test input generation approach based on the software model checking framework CPAchecker. The configurable program analysis framework implemented in CPAchecker allows us to achieve the precision needed for test input generation in an elegant and efficient way. In particular, we can exploit recently developed efficient encodings of large code blocks. We are also able to determine the coverage achieved by a given test suite. In our experiments, we compare our implementation against existing test input generators. (Joint work with Dirk Beyer, Michael Tautschnig and Helmut Veith.)

17:00 (30min) **Stefano Tonetta**: Formal methods for requirements validation
Requirements engineering is a fundamental phase of the development process of software systems. Mature methodologies guide engineers through the elicitation, analysis, and validation of requirements. Unfortunately, the validation techniques are mostly based on manual, syntactic effort and often ignore the semantics of the requirements. Thus, they provide rather shallow analysis capabilities and may miss subtle flaws resulting from complex interactions among the requirements. We address this problem by supporting the validation of embedded-systems requirements with formal methods. Unlike the verification of designs, for which many formal techniques have been conceived and applied, research on formal methods for requirements validation is not yet mature. The main obstacles are that, first, capturing the semantics of the requirements requires very expressive logical formalisms; second, the correctness of requirements is not formally defined; and third, both the formalization and the validation of the requirements usually demand a strong involvement of domain experts. In a recent series of papers, we proposed a new language, called Othello, and a methodology to formalize and validate the requirements of embedded systems. The language allows specifying constraints on the possible system configurations and their temporal evolutions, and describing the continuous evolution of physical entities in order to capture the assumptions on the environment. Formal verification techniques make it possible to execute the requirements, showing behaviors compatible with the specification. In this talk, I will present the language Othello and the related research directions.

17:30 (30min) **Cesar Sanchez**: Decision Procedures for Concurrent Skiplists
We describe a concurrent skip list data structure and present a decision procedure that allows the
automatic proof of verification conditions in temporal proofs of the data structure.
Our starting point is a previous decision procedure for linked lists, TLL3.

## Tuesday, October 19, 2010

08:30 (30min) **Laurent Doyen**: Energy Parity Games
Energy parity games are infinite two-player turn-based games played on weighted graphs.
The objective of the game combines a (qualitative) parity condition with
the (quantitative) requirement that the sum of the weights (i.e., the level of energy
in the game) must remain positive.
Besides their own interest for the design and synthesis of resource-constrained
omega-regular specifications, energy parity games provide one of the simplest
models of games with combined qualitative and quantitative objectives.
Our main results are as follows:
(a) exponential memory is sufficient and may be necessary for winning strategies
in energy parity games;
(b) the problem of deciding the winner in energy parity games can be solved
in NP \cap coNP; and
(c) we give an algorithm to solve energy parity games by reduction to energy games.
We also show that the problem of deciding the winner in energy parity games is
polynomially equivalent to the problem of deciding the winner in mean-payoff
parity games, which can thus be solved in NP \cap coNP.
As a consequence we also obtain a new conceptually simple algorithm to solve
mean-payoff parity games.
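
For the plain energy games that energy parity games reduce to, the minimal initial credit per state can be computed by a standard value-iteration fixpoint. A small sketch for the nonnegative-energy variant, with an illustrative encoding of my own (credits above `top`, e.g. the sum of absolute negative weights, mean player 0 loses):

```python
# Value iteration for energy games: credit[s] converges to the minimal
# initial energy with which player 0 (minimizer of required credit) keeps
# the accumulated energy nonnegative forever from s.
def energy_game_credits(states, edges, player, top):
    """edges: dict state -> list of (successor, weight) pairs;
    player: dict state -> 0 or 1; top: cutoff above which player 0 loses."""
    credit = {s: 0 for s in states}
    changed = True
    while changed:
        changed = False
        for s in states:
            # credit needed when moving to t along weight w
            vals = [max(0, credit[t] - w) for (t, w) in edges[s]]
            # player 0 picks the cheapest move, player 1 the most expensive
            new = min(top + 1, min(vals) if player[s] == 0 else max(vals))
            if new > credit[s]:
                credit[s] = new
                changed = True
    return credit
```

On the two-state cycle a -(-1)-> b -(+1)-> a, the fixpoint assigns credit 1 to a and 0 to b: starting from a with one unit of energy, the play a, b, a, b, ... never goes negative.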

09:00 (30min) **Alessandro Cimatti**: The NuSMV project: status and roadmap
NuSMV is a widely used symbolic model checker developed at FBK. In recent years, it has been extended with diverse functionalities, including SMT-based verification of timed and hybrid systems, safety and dependability analysis, and parametric verification. These extensions have been used in several real-world applications. In this talk, I will describe the current status of the system and some significant applications, and present the roadmap for future releases.

09:30 (30min) **Pavol Cerny**: Algorithmic Verification of Single-Pass List Processing Programs
We identify a class of programs manipulating lists of data items for
which checking functional equivalence and pre/post verification
conditions is decidable. Lists are modeled as data words,
(unbounded) sequences of data values, tagged with symbols from a finite
set, over a potentially infinite data domain that supports only the
operations of equality and ordering. First, we introduce streaming
data-word transducers that map input data words to output data words
in a single left-to-right pass in linear time. The transducer uses a
finite set of states, a finite set of variables ranging over the data
domain, and a finite set of variables ranging over data words.
At every step, it can make decisions based on the next input symbol,
updating its state, remembering the input data value in its data
variables, and updating data-word variables by concatenating data-word
variables and new symbols formed from data variables, while avoiding
duplication. Second, we establish PSPACE bounds for the problems
of checking functional equivalence of two streaming transducers, and of
checking whether a streaming transducer satisfies pre/post verification
conditions specified by streaming acceptors over input/output
data-words. Third, we identify a class of imperative programs that
manipulate a heap-allocated singly linked list data structure and that is
expressively equivalent to streaming transducers. Such programs
dynamically modify the heap by adding new nodes and changing
next-pointers of heap-nodes, but are restricted in how the next-pointers
can be used for traversal. Finally, we identify an expressively
equivalent fragment of list-processing functional programs with
syntactically restricted recursive calls. Our results lead to algorithms
for checking functional equivalence of two programs, written possibly in
different programming styles, for commonly used routines such as insert,
delete, and reverse. (Joint work with Rajeev Alur.)
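
As a concrete instance, the `reverse` routine mentioned at the end corresponds to a one-state streaming transducer with a single data-word variable updated copylessly. A toy Python rendering (the formal model is symbolic; this just mimics its single left-to-right pass):

```python
# One-state streaming transducer reversing its input data word in a single
# pass. The single data-word variable x is updated copylessly: x := d . x,
# so each old value of x is used exactly once and never duplicated.
def reverse_transducer(word):
    x = []                  # data-word variable, initially the empty word
    for d in word:          # one left-to-right pass over the input
        x = [d] + x         # copyless update: prepend the current data value
    return x                # output expression is just x
```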

10:00 (30min) **BREAK**

10:30 (30min) **Filip Konecny**: Relational Analysis of Non-deterministic Integer Programs
Computing transitive closures of integer relations is the key to finding precise invariants of integer programs. In this talk, we describe an efficient algorithm for computing the transitive closures of difference bounds, octagonal, and finite monoid affine relations. On the theoretical side, this framework provides a common solution to the acceleration problem for all three classes of relations. In practice, according to our experiments, the new method performs up to four orders of magnitude better than previous ones for the first two classes. This algorithm provides the basis for the analysis of non-deterministic integer programs (also known as counter automata or counter systems). We explain how transitive closure computation combines with other methods (quantifier elimination, abstraction, etc.) to provide a framework for the analysis of such systems. These ideas led us to the implementation of the FLATA tool. (Joint work with Marius Bozga, Radu Iosif, and Tomas Vojnar.)
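
A basic building block behind such closures is the relational composition of difference-bounds constraints, which can be computed by tightening a combined constraint graph and projecting out the middle copy of the variables. A small sketch (the matrix encoding is my own, and this shows only one composition step, not the acceleration itself):

```python
INF = float('inf')

def dbm_compose(R1, R2, n):
    """Compose two difference-bounds relations over n variables.
    R1, R2: (2n)x(2n) matrices; entry M[i][j] is an upper bound on
    v_j - v_i, with indices 0..n-1 unprimed and n..2n-1 primed."""
    N = 3 * n                      # three copies: x (input), y (middle), z (output)
    M = [[INF] * N for _ in range(N)]
    for i in range(N):
        M[i][i] = 0
    for i in range(2 * n):
        for j in range(2 * n):
            M[i][j] = min(M[i][j], R1[i][j])                  # R1 on (x, y)
            M[i + n][j + n] = min(M[i + n][j + n], R2[i][j])  # R2 on (y, z)
    for k in range(N):             # Floyd-Warshall tightening
        for i in range(N):
            for j in range(N):
                if M[i][k] + M[k][j] < M[i][j]:
                    M[i][j] = M[i][k] + M[k][j]
    keep = list(range(n)) + list(range(2 * n, 3 * n))
    return [[M[i][j] for j in keep] for i in keep]            # project out y
```

Composing the decrement relation x' - x <= -1, x - x' <= 1 with itself yields x' - x <= -2, x - x' <= 2; acceleration generalizes this by detecting the periodic pattern in the powers instead of computing them one by one.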

11:00 (30min) **Ruzica Piskac**: Reasoning about Collections - Decision Procedures and Applications
Applications in software verification and interactive theorem proving often involve reasoning about sets of objects. Cardinality constraints on such collections also arise in these scenarios. In this talk, we will briefly explain why the satisfiability problem for a logic of sets, multisets (bags), and cardinality constraints is decidable, and we will describe the optimal complexity results. The decision procedure relies on an extension of integer linear arithmetic with a “star” operator, which takes the closure under vector addition of the solution set of a linear arithmetic formula. We show that the satisfiability problem for this extended language remains in NP (and is therefore NP-complete). Our proof uses the semilinear set characterization of solutions of integer linear arithmetic formulas, as well as a generalization of a recent result on sparse solutions of integer linear programming problems. In addition to verification, we also report on the use of these decision procedures in synthesis.
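
The flavor of such reductions can be illustrated on pure sets: a constraint over |A|, |B|, |A ∩ B|, |A ∪ B| is satisfiable iff nonnegative integers for the cardinalities of the Venn regions satisfy the induced linear constraints. In this toy sketch (my own simplification, with brute force standing in for the integer-arithmetic back end):

```python
# Venn-region reduction for two sets: a = |A\B|, b = |B\A|, i = |A∩B|.
# Then |A| = a+i, |B| = b+i, |A∩B| = i, |A∪B| = a+b+i, and satisfiability
# of a cardinality constraint reduces to integer arithmetic over a, b, i.
def sat_card(constraint, bound=6):
    """constraint: function of (|A|, |B|, |A∩B|, |A∪B|) -> bool.
    Brute-force search over small region sizes (stand-in for an ILP solver)."""
    return any(constraint(a + i, b + i, i, a + b + i)
               for a in range(bound)
               for b in range(bound)
               for i in range(bound))
```

For example, |A| = |B| = 2 with |A ∪ B| = 4 is satisfiable (disjoint sets), while |A| = |B| = 2 with |A ∪ B| = 5 is not, since |A ∪ B| = |A| + |B| - |A ∩ B| <= 4.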

11:30 (45min) **Dirk Beyer**: Adjustable-Block Encoding: Towards a Unified Framework for Software Verification
Several successful software model checkers are based on a technique called single-block encoding (SBE),
which computes costly predicate abstractions after every single program operation.
Large-block encoding (LBE) computes abstractions only after a large number of operations,
and it was shown that this significantly improved the verification performance.
In this work, we present adjustable-block encoding,
a unifying framework that can express both previous approaches.
In addition, it provides the flexibility to specify any block size between SBE and LBE,
and also beyond LBE, through the adjustment of several parameters.
Such a unification of different concepts makes it easier to understand
the fundamental properties of the analysis, and makes the differences between the variants more explicit.
Adjustable-block encoding (ABE) is implemented as a configurable program analysis (CPA) in
the verification framework CPAchecker.
We will explain the architecture of that software and demonstrate how to
plug new program analyses into the framework. (Joint work with Philipp Wendler.)
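
The SBE/LBE/ABE spectrum can be caricatured as a single parameter: the block-end predicate deciding when the costly abstraction is computed (a toy rendering of the idea, not CPAchecker's API):

```python
# Walk a path of operations, accumulating a path formula; compute the
# expensive abstraction only when the block-end predicate `blk` fires.
def run_abe(path, blk, abstract):
    formula, abstractions = [], []
    for op in path:
        formula.append(op)
        if blk(op):                       # end of the current block
            abstractions.append(abstract(formula))
            formula = []
    if formula:                           # close the trailing block
        abstractions.append(abstract(formula))
    return abstractions

sbe = lambda op: True                     # abstract after every operation (SBE)
lbe = lambda op: op == 'loop-head'        # abstract only at loop heads (LBE-like)
```

With `sbe`, every operation gets its own abstraction step; with `lbe`, whole straight-line blocks are abstracted at once; any other predicate yields an intermediate block size.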

12:15 (105min) **LUNCH** at Canvetto Luganese, Via R. Simen 14b, 6900 Lugano

14:00 (30min) **Sergio Mover**: Compositional Reachability of Hybrid Systems
Hybrid automata provide a comprehensive framework for the
representation of practical systems with discrete and continuous
dynamics. Their interaction is paradigmatic of partially synchronized
systems, which may run asynchronously or synchronize on common
events. In this talk, I will present a new compositional approach to the
reachability analysis of hybrid systems.
The role of compositionality is to reduce the effort of building a
path in the network of systems, by replacing a monolithic search on
the whole network with a set of searches on suitably approximated
components. The proposed method selects a component in the network under analysis,
and applies, to each of the other components, either an
over-approximation or an under-approximation. Intuitively, the
over-approximations can be seen as abstractions of the environment
visible to the selected component, while the under-approximations
impose constraints resulting from previously found partial solutions.
The approach carries out a backtracking search in the space of network
approximations. I will present a possible instantiation tailored to bounded reachability,
which interacts with a Satisfiability Modulo Theories (SMT) solver. In
order to exploit the incremental interface of the SMT solver, we need
to cope with the two directions of the search: on the one hand, we
increment and backtrack the length of the path; on the other hand, we
refine and backtrack the approximation of the components.
The incrementality is enabled by using constraint-based
approximations and assumptions to activate constraints on demand.

14:30 (30min) **Hugo Herbelin**: A short and constructive proof of Gödel's completeness theorem
We analyze the computational content of (Krivine's constructive formulation of Henkin's proof of) Gödel's completeness theorem and finally obtain a very short (and executable) proof. We give some hints on how one could eventually get rid of the need for an enumeration of the formulas by using “side effects” in the proof. (Joint work with Danko Ilik.)

15:00 (30min) **BREAK**

15:30 (30min) **Predrag Janicic**: DPLL-Based Theorem Prover for Coherent Logic
We will talk about ongoing work on a theorem prover
for coherent logic (CL), built as an extension of a
modern DPLL-based SAT solver. It benefits from
algorithmic and implementation techniques introduced by
SAT solvers, and also from formalized properties of SAT
solvers that ensure its correctness. On the other hand,
the theorem prover benefits from the expressiveness of CL,
which is not available in SMT or resolution-based approaches
(CL allows the use of existential quantifiers). Thanks to
this, readable proofs can often be obtained and
trivially formalized. (Joint work with Mladen Nikolic.)

16:00 (30min) **Alberto Griggio**: A Practical Approach to SMT(LA(Z)) We present a detailed description of a theory solver for Linear Integer
Arithmetic in a lazy SMT context.
Rather than focusing on a single technique that guarantees theoretical
completeness, the solver makes extensive use of layering and heuristics
for combining different techniques in order to achieve good performance
in practice.
The viability of our approach is demonstrated by an empirical evaluation
on a wide range of benchmarks, showing significant performance
improvements over current state-of-the-art solvers.
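
The layering idea in miniature: try cheap, incomplete layers first and fall back to a more expensive, complete one only when the earlier layers return "don't know". The layer bodies below are toy stand-ins (the real solver layers techniques such as rational relaxation and branch-and-bound), but the control flow is the point:

```python
# Cheap-to-expensive layering: each layer returns an answer (a model or
# 'unsat') or None for "don't know"; the first decisive layer wins.
def layered_solve(constraint, layers):
    for layer in layers:
        verdict = layer(constraint)
        if verdict is not None:
            return verdict
    return None

def small_window(c):                  # fast, incomplete layer
    for x in range(-10, 11):
        if c(x):
            return x
    return None                       # don't know: defer to the next layer

def wide_scan(c):                     # slow layer, complete for this toy domain
    for x in range(-10_000, 10_001):
        if c(x):
            return x
    return 'unsat'
```

Most inputs are decided by the cheap layer, so the expensive one rarely runs, which is exactly the practical bet the layered architecture makes.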

**END OF TALKS**

16:45 Stroll from the workshop location along the lake to the San Salvatore funicular

17:30 Funicular ride to the top of San Salvatore mountain (http://montesansalvatore.ch/en/)

17:45 Short hike (walk to the top of the mountain) for those who are interested; the rest can stay at the bar in the restaurant

19:00 Dinner

22:00 Funicular ride going down to Paradiso