Student Talks 2011 (abstracts)
Static analysis for embedded SQL queries
University of Tartu, Inst. of Computer Science, Estonia
In application programs, SQL statements are usually manipulated as plain strings -- we say that SQL is embedded in the host language.
With this approach, errors in the queries (e.g. syntax errors or misspelled names) are usually detected only by testing. We describe a tool which statically analyzes SQL queries embedded into Java programs.
It uses constant propagation analysis, abstract parsing, and semantic checks against an actual database engine. The tool is implemented as a plug-in for the Eclipse IDE.
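The idea behind the analysis can be sketched as follows (in Python rather than Java, with SQLite standing in for the actual database engine; this is our own illustration of the approach, not the tool's implementation, and all names are hypothetical):

```python
import sqlite3

def propagate(parts, env):
    # Constant propagation, much simplified: 'parts' is a list of string
    # literals or variable names, 'env' maps variables known to hold
    # constant strings; the result is the concrete query text.
    return "".join(env.get(p, p) for p in parts)

def check_query(query, schema_sql):
    # Validate the query against a throwaway in-memory database loaded
    # with the application's schema. EXPLAIN makes the engine parse the
    # query and resolve table/column names without executing it.
    conn = sqlite3.connect(":memory:")
    conn.executescript(schema_sql)
    try:
        conn.execute("EXPLAIN " + query)
        return None                      # query is well-formed
    except sqlite3.Error as e:
        return str(e)                    # syntax error or unknown name
    finally:
        conn.close()

schema = "CREATE TABLE users (id INTEGER, name TEXT);"
env = {"table": "users"}
good = propagate(["SELECT name FROM ", "table", " WHERE id = 1"], env)
bad = propagate(["SELECT nam FROM ", "table"], env)   # misspelled column
assert check_query(good, schema) is None
assert check_query(bad, schema) is not None
```

Delegating the semantic check to a real engine catches both syntax errors and misspelled identifiers without having to reimplement the SQL dialect.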
Modes of Convergence for Infinitary Rewriting
University of Copenhagen, Dept. of Computer Science (DIKU), Denmark
In infinitary term rewriting, the usual restriction to reduction sequences of finite length is lifted in favour of a notion of convergence that provides a limit for infinite reduction sequences. This notion of convergence is commonly based on the ultrametric on terms. In our work, we have explored an alternative mode of convergence based on the partial order on partial terms and its induced limit inferior. We show that the model of infinitary term rewriting defined in this way is superior to the metric approach in a number of respects, including its simplicity, its convergence and normalisation properties, and its ability to capture the notion of Böhm trees. Moreover, we show that the partial order model is a conservative extension of the metric model.
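Two standard examples from the infinitary rewriting literature (our illustration, not taken from the abstract) show the difference between the two modes of convergence:

```latex
% Under the rule a -> f(a), the reduction sequence converges in both
% models to the infinite term f^omega:
\[
  a \to f(a) \to f(f(a)) \to \cdots
  \quad\text{with limit}\quad
  f^{\omega} = f(f(f(\ldots)))
\]
% By contrast, under the rules a -> b and b -> a, the alternating
% sequence has no metric limit, while its limit inferior in the partial
% order of partial terms is the completely undefined term:
\[
  a \to b \to a \to b \to \cdots
  \quad\text{with}\quad
  \liminf_{n \to \infty} t_n = \bot
\]
```

The second example illustrates why the partial order model assigns a (partial) limit to sequences on which the metric model simply diverges.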
Based on this approach to infinitary term rewriting, we aim to define modes of convergence for term graph rewriting in order to obtain the first formal treatment of infinitary term graph rewriting. The main goal of this endeavour is to investigate correspondences between infinitary term rewriting and graph rewriting that would make it possible to simulate infinite term reductions by finite term graph reductions. To this end, we explore different metric spaces as well as partial orders as candidates. While we argue in favour of a particular pair of a metric space and a partial order, we also discuss other potential approaches that might yield a more appropriate structure for defining convergence of term graph reductions.
Analysis of certain quantum games
University of Latvia, Dept. of Computer Science, Latvia
Nonlocal games are used to exhibit differences between the classical and the quantum world. In this paper, we study certain quantum games, with a special emphasis on the novel idea (due to Ambainis et al.) of considering the worst-case probability distribution of the inputs instead of the well-studied uniform distribution (i.e. with a judge that always gives the players the worst possible inputs for their fixed strategy). For certain two-player and n-player games, we derive tight bounds on the winning probabilities in both models. A considerable part of our analysis is devoted to results on two-player games whose rules are described by a random matrix. We also give examples of explicit matrices for which the winning probability coincides with that of a random matrix. (Joint work with Andris Ambainis, Kaspars Balodis, Madars Virza.)
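The contrast between the two input models can already be seen on the well-known CHSH game (our illustration; the abstract does not say which games the paper studies). A brute-force enumeration of deterministic classical strategies shows that the uniform-distribution value is 3/4, while under a worst-case judge every deterministic strategy loses on some input:

```python
from itertools import product

def chsh_win(a, b, x, y):
    # CHSH winning condition: a XOR b must equal x AND y
    return (a ^ b) == (x & y)

best_uniform = 0.0   # value under the uniform input distribution
best_worst = 0.0     # value when the judge picks the worst input

# A deterministic strategy is a pair of answer tables:
# Alice answers a0 on input 0 and a1 on input 1; Bob likewise.
for a0, a1, b0, b1 in product((0, 1), repeat=4):
    wins = [chsh_win((a0, a1)[x], (b0, b1)[y], x, y)
            for x, y in product((0, 1), repeat=2)]
    best_uniform = max(best_uniform, sum(wins) / 4)
    best_worst = max(best_worst, min(wins))

assert best_uniform == 0.75   # classical value of CHSH
assert best_worst == 0        # worst-case judge defeats every det. strategy
```

(Randomized strategies change the worst-case picture; the point here is only that the two input models can give very different values.)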
Variable Influences of Bounded Polynomials: Towards Aaronson's Conjecture
University of Latvia, Dept. of Computer Science, Latvia
There are known quantum query algorithms for partial Boolean functions (based, say, on Shor's algorithm) that make exponentially fewer queries than their best classical counterparts. For total functions, however, the gap is at most polynomial. There is a folklore conjecture that for any quantum query algorithm $Q$ and any $\delta$ there exists a classical algorithm that simulates $Q$ on a $1-\delta$ fraction of inputs and uses a number of queries that is polynomial in $1/\delta$ and the number of queries made by $Q$.
In 2009, Aaronson and Ambainis [arXiv:0911.0996v1] suggested an approach to this conjecture using the notion of variable influences. In particular, they prove that the conjecture follows from another conjecture on influences: namely, that if $f$ is a bounded polynomial of degree $d$ and variance 1, then it has a variable whose influence is at least inverse polynomial in $d$.
We present our work towards this conjecture. First, we formulate a weaker conjecture on influences that still implies the conjecture on quantum query algorithms. Next, we argue that the techniques used for proving statements similar to the conjecture on influences fail to prove this new conjecture. In particular, for any $p$, we construct a family of functions that violates a variant of our conjecture with the infinity norm replaced by the $p$-norm.
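The central notion of variable influence can be made concrete on a small example (our illustration, not from the paper): the influence of variable $i$ on a Boolean function is the fraction of inputs on which flipping that variable changes the output.

```python
from itertools import product

def influence(f, n, i):
    # Inf_i(f): fraction of inputs x for which flipping bit i changes f(x)
    flips = 0
    for x in product((0, 1), repeat=n):
        y = list(x)
        y[i] ^= 1
        flips += f(x) != f(tuple(y))
    return flips / 2 ** n

# Majority of three bits: each variable has influence 1/2, since
# flipping x_i matters exactly when the other two bits disagree.
maj3 = lambda x: int(sum(x) >= 2)
assert all(influence(maj3, 3, i) == 0.5 for i in range(3))
```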
Lambek grammars with the unit
Moscow State University, Fac. of Mathematics and Mechanics, Russia
Pentus' theorem states that any language generated by a Lambek grammar is context-free. We present a substitution that reduces the Lambek calculus enriched with the unit constant to the ordinary Lambek calculus, and use this substitution to prove that any language generated by a categorial grammar based on the Lambek calculus with the unit is context-free.
Black-Box Separations and their Adaptability to the Non-Uniform Model
Cybernetica AS, Estonia
Oracle separation methods are used in cryptography to rule out black-box reductions between cryptographic primitives. It suffices to find an oracle relative to which the base primitive exists but there are no secure instances of the constructed primitive. In practice, constructing a fixed oracle with such properties is beyond our current reach for most reductions, because it is difficult to guarantee the existence of secure base primitives.
For example, to show that there exist no black-box reductions from collision-free functions to one-way permutations we have to show that secure one-way permutations exist relative to the oracle.
However, no deterministic constructions for unconditionally secure one-way permutations are known yet. To overcome this gap, randomized oracles are used to create random base primitives that are secure on average. After that, a fixed oracle with the desired properties is extracted from the probability distribution by using non-constructive probabilistic arguments and the countability of the set of adversaries. This oracle extraction argument applies only to uniform reductions, because it uses the countability of the set of all uniform Turing machines. In this work, we study how to adapt oracle separation results to the non-uniform security model. We consider the known separation techniques that are capable of ruling out the so-called fully black-box reductions and a certain strong form of semi black-box reductions (in the non-uniform model) where the construction of the simulator does not depend on the particular instance of the base primitive f. We then show how to go beyond the barrier of strong semi black-box reductions, and prove that this is indeed possible by using random oracles with auxiliary advice. To that end, we prove a conjecture of Unruh (2007) that presampling is a sufficient substitute for advice for any oracle distribution.
Ties Matter: Complexity of Voting Manipulation Revisited
St. Petersburg Department of Steklov Institute of Mathematics, Russia
In their seminal paper "The computational difficulty of manipulating an election" Bartholdi, Tovey and Trick argued that many well-known voting rules, such as Plurality, Borda, Copeland and Maximin are easy to manipulate. An important assumption made in that paper is that the manipulator's goal is to ensure that his preferred candidate is among the candidates with the maximum score, or, equivalently, that ties are broken in favor of the manipulator's preferred candidate. In our work, we examine the role of this assumption in the easiness results of Bartholdi et al.
We show that the algorithm presented in their paper extends to all rules that break ties according to a fixed ordering over the candidates. In contrast, when arbitrary polynomial-time computable tie-breaking rules are allowed, manipulation becomes hard for all rules considered in the paper except Plurality, as well as for a large class of scoring rules and Bucklin.
We then study the complexity of manipulation under randomized tie-breaking rules, where the winner is selected among the top-scoring candidates uniformly at random. For this setting, we show that manipulation is easy for all scoring rules, as well as for Bucklin and, for a natural special class of manipulator's preferences, for Maximin; however, it is hard for Maximin with general preferences as well as for Copeland.
This is joint work with Edith Elkind (NTU, Singapore).
Part of this work has been accepted to AAMAS'11.
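The easiness result being extended here can be made concrete with a sketch of the greedy manipulation algorithm in the spirit of Bartholdi et al., instantiated for Borda under the assumption that ties are broken in favour of the manipulator's candidate (the function and the example profile are our own illustration):

```python
def greedy_borda_manipulation(base_scores, p):
    # base_scores: candidate -> Borda score from the non-manipulators.
    # Returns a ranking for the manipulator that makes p win, assuming
    # ties are broken in favour of p, or None if no such ranking exists.
    m = len(base_scores)
    target = base_scores[p] + (m - 1)    # p is ranked first
    ranking = [p]
    remaining = set(base_scores) - {p}
    for pts in range(m - 2, -1, -1):     # fill positions 2..m, top down
        safe = [c for c in remaining if base_scores[c] + pts <= target]
        if not safe:
            return None                  # greedy fails => no manipulation
        # Any safe candidate works here: a candidate that is safe at this
        # position stays safe at every later (lower-scoring) position.
        c = max(safe, key=lambda c: base_scores[c])
        ranking.append(c)
        remaining.remove(c)
    return ranking

# p can be pushed to (tied) first place ...
assert greedy_borda_manipulation({"p": 2, "a": 3, "b": 1}, "p") == ["p", "a", "b"]
# ... but not when the opponents are too far ahead
assert greedy_borda_manipulation({"p": 0, "a": 5, "b": 5}, "p") is None
```

The hardness results in the abstract show that this simple picture breaks down once the favourable tie-breaking assumption is dropped.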
About Boolean functions with a low polynomial degree
University of Latvia, Inst. of Math. and Inform., Latvia
Quantum query complexity studies quantum algorithms that compute Boolean functions by making queries to the input. It is known that the number of queries such an algorithm needs is related to the degree of the polynomial representing the corresponding Boolean function; this degree is called the degree of the function.
If the degree of a Boolean function is low (compared with the number of its variables), then a quantum query algorithm may need only a small number of queries to evaluate the function. It is therefore important to construct Boolean functions of low polynomial degree and high deterministic complexity.
We consider several such functions with 7 and 8 variables and degree 4, obtained by the author in 2004 using graphs and polynomials. We also discuss some functions found later, results obtained by other authors, and open problems.
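For concreteness, the degree of a small Boolean function can be computed by Möbius inversion over subsets of variables, which recovers the coefficients of the unique multilinear real polynomial agreeing with the function on the Boolean cube (our own illustration, not the author's construction):

```python
from itertools import combinations

def degree(f, n):
    # Degree of the multilinear real polynomial representing f on {0,1}^n.
    # Moebius inversion: c_S = sum over T subset of S of (-1)^{|S|-|T|} f(1_T)
    deg = 0
    for k in range(n + 1):
        for S in combinations(range(n), k):
            c = 0
            for j in range(len(S) + 1):
                for T in combinations(S, j):
                    x = tuple(1 if i in T else 0 for i in range(n))
                    c += (-1) ** (len(S) - len(T)) * f(x)
            if c != 0:
                deg = max(deg, len(S))
    return deg

assert degree(lambda x: x[0] & x[1], 2) == 2   # AND(x,y) = xy
assert degree(lambda x: x[0] ^ x[1], 2) == 2   # XOR(x,y) = x + y - 2xy
assert degree(lambda x: 1 - x[0], 1) == 1      # NOT(x) = 1 - x
```

Functions of interest for query complexity are those where this degree is much smaller than the number of variables while the deterministic complexity remains high.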
Reasoning about correctness of transactional memory
Institute of Cybernetics at TUT, Estonia
Transactional memory, as first described by Herlihy and Moss, provides a convenient concurrency control method by allowing developers to designate blocks of code that should run apparently atomically, without interference from other transactions. We introduce a TM implementation by extending the simple imperative language While with a parallel construct and nested transactions and endow it with a "weak" small-step operational semantics that models optimistic concurrency control. In order to show the correctness of our TM implementation, we build upon the correctness criterion of opacity, which formally captures the intuitive notion that (a) all operations by a successful transaction should appear to happen atomically, (b) aborted transactions are not visible to other transactions, and (c) every transaction observes a consistent state of the system. To this end, we also define a "strong" operational semantics that executes transactions literally atomically (in one small step), hence sequentially. We show that every run in the "weak" model has a corresponding run in the "strong" model and thus satisfies the correctness criterion of opacity. We have formalized the development in the dependently typed programming language Agda.
(Joint work with Tarmo Uustalu.)
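The optimistic discipline that the "weak" semantics models can be sketched as follows (a minimal Python sketch of version-based validation at commit time; this is our own illustration, not the While/Agda development):

```python
class TM:
    # shared state: a store plus a version counter per location
    def __init__(self):
        self.store = {}      # location -> value
        self.version = {}    # location -> version counter

class Txn:
    # an optimistic transaction: buffer writes, log versions of reads
    def __init__(self, tm):
        self.tm, self.reads, self.writes = tm, {}, {}

    def read(self, loc):
        if loc in self.writes:           # read our own buffered write
            return self.writes[loc]
        self.reads[loc] = self.tm.version.get(loc, 0)
        return self.tm.store.get(loc, 0)

    def write(self, loc, val):
        self.writes[loc] = val           # buffered until commit

    def commit(self):
        # validate: every location we read must still have the version
        # we observed; otherwise abort and discard all effects
        for loc, ver in self.reads.items():
            if self.tm.version.get(loc, 0) != ver:
                return False
        for loc, val in self.writes.items():
            self.tm.store[loc] = val
            self.tm.version[loc] = self.tm.version.get(loc, 0) + 1
        return True

tm = TM()
t1, t2 = Txn(tm), Txn(tm)
t1.write("x", t1.read("x") + 1)
t2.write("x", t2.read("x") + 10)
assert t1.commit() is True
assert t2.commit() is False      # t2's read of x is stale, so it aborts
assert tm.store["x"] == 1
```

Because the aborted transaction's buffered writes never reach the store, its effects are invisible to others, matching clauses (a) and (b) of opacity in spirit.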
March 2, 2013 17:06 EET
local organizers, ewscs11(at)cs.ioc.ee
EWSCS'11 page: http://cs.ioc.ee/ewscs/2011/