The idea for this seminar grew amongst members of our theory group, who felt the need for a platform to present the work of our growing group on Proofs, Complexity and Verification. The idea is to have a high-level research seminar where we can dive directly into our research to present and discuss our latest ideas. At the beginning of 2005, this idea turned into the PCV Seminar.
Everybody interested in any of the topics of Proofs, Complexity and Verification is cordially invited to attend and to contribute to the seminar.
The PCV page has moved; it can now be found on the new (temporary) websites of our department.
|1-5 July 2006||CiE 2006|
From Universal Algebra to Lawvere Theories
"Lawvere theories", introduced by F. William Lawvere in his PhD thesis in 1963, are the outcome of three crucial observations concerning universal algebra: an algebra is an interpretation of syntactical entities in the semantical domain of sets; the syntactical entities "signature", "variable", and "term" reflect the formation of cartesian products of sets; and substitution of variables by terms reflects the composition of term functions.
In the talk we show how a many-sorted algebraic signature SIG gives rise to a Lawvere Theory C(SIG) with products where the objects are lists of sort names, the morphisms are (tuples of) terms, and composition is given by substitution. This categorical reformulation turns SIG-algebras into product preserving functors from C(SIG) into the category SET and SIG-homomorphisms into natural transformations between those functors. Moreover, term algebras are reconstructed as hom-functors and the "recursion theorem" (uniqueness of term evaluation) turns out to be an instance of Yoneda's Lemma. This construction can be extended to equational specifications (SIG,EQ) if we consider equivalence classes of (tuples of) terms as morphisms.
We will discuss some benefits (and drawbacks) of this categorical reformulation of universal algebra. And, if time permits, we will sketch some further topics, such as tensor products of theories, 2-categorical Lawvere theories (the categorical counterpart of term rewriting), and sketches (the categorical counterpart of equational specifications).
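The slogan "composition is given by substitution" can be made concrete in a few lines of code. The following Python sketch is our own illustration, not the talk's formal construction: terms over a one-sorted signature, morphisms as tuples of terms, and composition as componentwise substitution (all names here are invented for the example).

```python
# Terms over a one-sorted signature, as nested tuples.
def Var(i):            # variable x_i -- a projection in the Lawvere theory
    return ("var", i)

def Op(name, *args):   # operation symbol applied to argument terms
    return ("op", name, args)

def subst(term, env):
    """Substitute env[i] for variable x_i -- composition of morphisms."""
    tag = term[0]
    if tag == "var":
        return env[term[1]]
    _, name, args = term
    return Op(name, *(subst(a, env) for a in args))

# A morphism n -> m is an m-tuple of terms in n variables; composition
# is componentwise substitution, which is associative:
f = [Op("plus", Var(0), Var(1))]           # 2 -> 1
g = [Var(1), Op("times", Var(0), Var(0))]  # 2 -> 2
h = [Op("neg", Var(0)), Var(0)]            # 1 -> 2

comp = lambda f, g: [subst(t, g) for t in f]
lhs = comp(comp(f, g), h)
rhs = comp(f, comp(g, h))
print(lhs == rhs)  # True -- associativity on this example
```

The variables play the role of projections, and the identity morphism on n is just the list of variables x_0, ..., x_{n-1}.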
|4-7 April 2006||BCTCS 2006|
|04-04-06||Jan Johannsen: An infinite hierarchy in the linear time mu-calculus|
The notion of the next-rank of a formula in the linear time mu-calculus muTL is introduced, and the fragments muTL(r) of formulas of next-rank at most r are studied. It is shown that muTL(1) is equivalent to LTL, and that the hierarchy of fragments muTL(r) is infinite. The proof utilizes a translation of muTL into certain restricted alternating Buechi automata. As a corollary to the construction, it follows that every formula in muTL is equivalent to a formula without nested fixpoints, and thus muTL collapses to its one-variable fragment.
|28-03-06||Paul Taylor (Manchester): Abstract Stone Duality and Real Analysis|
We argue that Dedekind completeness and the Heine-Borel property should be seen as part of the "algebraic" structure of the real line, along with the usual arithmetic operations and relations. Dedekind cuts provide a uniform and natural way of formulating differentiation, integration and limits; both the cuts and these examples also generalise to intervals. Together with the arithmetic order, cuts enjoy proof-theoretic introduction and elimination rules similar to those for lambda abstraction and application. This system completely axiomatises computably continuous functions on the real line.
We show how this calculus (of "single" points) can be translated formally into Interval Analysis, interpreting the arithmetic operations à la Moore, and compactness as optimisation under constraints. Notice that interval computation is the conclusion and not the starting point.
This calculus for the real line is part of a more general recursive axiomatisation of general topology called Abstract Stone Duality.
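As a concrete point of reference, Moore-style interval arithmetic (mentioned above) can be sketched in a few lines. This is a generic illustration of Moore's operations, not code from the Abstract Stone Duality programme, and the function names are our own.

```python
# Intervals are pairs (lo, hi) with lo <= hi.

def iadd(x, y):
    """Moore addition: [a,b] + [c,d] = [a+c, b+d]."""
    (a, b), (c, d) = x, y
    return (a + c, b + d)

def imul(x, y):
    """Moore multiplication: take min and max of the endpoint products."""
    (a, b), (c, d) = x, y
    ps = [a * c, a * d, b * c, b * d]
    return (min(ps), max(ps))

x, y = (-1.0, 2.0), (3.0, 4.0)
print(iadd(x, y))  # (2.0, 6.0)
print(imul(x, y))  # (-4.0, 8.0)
```

The defining property is enclosure: whenever a is in x and b is in y, a+b lies in iadd(x, y) and a*b lies in imul(x, y).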
|30-03-06||Peter Fritzson and Adrian Pop (Linköping, PELAB - the Programming Environment Lab): The RML environment|
Tools for generating efficient language implementations from Structured Operational Semantics, including tools for debugging and teaching specifications.
RML is a meta-language for Structured Operational Semantics, including Natural Semantics. It is strongly typed, polymorphic, deterministic and efficient, and is being used for writing language specifications ranging from small specifications of a few lines to large specifications of around 85 000 lines. It has been used successfully for a range of languages such as Java, Modelica (equation-based), a C subset with pointer arithmetic, MiniML, the lambda calculus, etc. The language originates from research at PELAB on generating efficient compilers from specifications; see also LNCS #1549.
Extensive experience of writing large executable specifications in RML has shown that an efficient meta-language alone is not enough for practical use. There is also a need to debug large specifications, to browse them for understanding, and to teach them using material that combines explanatory text with specifications that can be evaluated interactively. In the talk we will present the RML language and demonstrate tools for compiler/interpreter generation, an RML debugger, ongoing work on an Eclipse plugin for writing specifications in RML, and a WYSIWYG teaching tool for literate programming of specifications in active documents. (See also www.ida.liu.se/labs/pelab/rml)
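To give a flavour of the kind of specification RML targets, here is a toy big-step (natural semantics) evaluator in Python. It is our own illustrative example: the rules in the comments mimic natural-semantics judgements, but this is not RML syntax and uses none of the PELAB tools.

```python
# Judgement: env |- e => v, for a tiny expression language.
def eval_exp(e, env):
    tag = e[0]
    if tag == "const":    # -------------------- env |- const(n) => n
        return e[1]
    if tag == "var":      # -------------------- env |- x => env(x)
        return env[e[1]]
    if tag == "add":      # env |- e1 => v1   env |- e2 => v2
                          # --------------------------------
                          # env |- add(e1, e2) => v1 + v2
        return eval_exp(e[1], env) + eval_exp(e[2], env)
    raise ValueError("unknown expression: %r" % (tag,))

print(eval_exp(("add", ("const", 1), ("var", "x")), {"x": 41}))  # 42
```

Each branch of the function corresponds to one inference rule, which is exactly the correspondence a Natural Semantics compiler generator exploits.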
|23-03-06||Anton Setzer: Inductive-Recursive Definitions (Part II)|
(joint work with Peter Dybjer, Gothenburg)
|16-03-06||Anton Setzer: Inductive-Recursive Definitions (Part I)|
(joint work with Peter Dybjer, Gothenburg)
Inductive-recursive definitions were developed by P. Dybjer as a concept which formalises the principle for introducing sets in Martin-Löf Type Theory. Indexed inductive-recursive definitions extend this principle by the possibility of introducing several sets simultaneously inductive-recursively.
All sets definable in the core of Martin-Löf type theory, as accepted by most researchers in this area, are instances of indexed inductive-recursive definitions. The notion also covers many extensions, for instance Erik Palmgren's superuniverses and higher universes. Many data types used in programming and theorem proving are instances of indexed inductive-recursive definitions, e.g. the accessible part of an ordering, Martin-Löf's computability predicate, which was used in his normalisation proof for one version of Martin-Löf type theory, and Bove and Capretta's formalisation of partial functions in type theory.
In this lecture we will introduce the notions of inductive-recursive definitions and indexed inductive-recursive definitions. We will show how to introduce inductive-recursive definitions using finitely many rules.
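The standard first example of an inductive-recursive definition is a universe: a set U of codes defined simultaneously with its decoding function El, where El is defined by recursion on U while U's introduction rules may mention El. Python cannot express genuine induction-recursion, but the following rough analogy (tagged codes plus a decoding function, with names invented for the example) may help convey the simultaneous definition.

```python
# Codes in the universe U.
NAT  = ("nat",)
BOOL = ("bool",)
def PAIR(a, b): return ("pair", a, b)

def El(code):
    """Decode a code in U to the 'set' (here: a Python type) it denotes."""
    tag = code[0]
    if tag == "nat":
        return int
    if tag == "bool":
        return bool
    if tag == "pair":
        # decoded recursively -- El is defined by recursion on U
        return (El(code[1]), El(code[2]))
    raise ValueError(code)

def member(x, code):
    """Check x : El(code), following the decoded structure."""
    t = El(code)
    if isinstance(t, tuple):
        return (isinstance(x, tuple) and len(x) == 2
                and member(x[0], code[1]) and member(x[1], code[2]))
    return isinstance(x, t)

print(member((3, True), PAIR(NAT, BOOL)))  # True
```

In type theory the decisive extra power (absent from this analogy) is that constructors of U may quantify over El of previously introduced codes, e.g. a code for dependent function spaces.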
|23-02-06||Peter Mosses: Programming Language Semantics - From Scott and Strachey to Semantics Online|
Since the middle of the last century, hundreds of programming languages have been designed and implemented - and new ones are continually emerging. The syntax of a programming language can usually be described quite precisely and efficiently using formal grammars. However, the formal description of its semantics is much more challenging. Language designers, implementers, and programmers commonly regard precise semantic descriptions as impractical and too costly. Research in semantics has allowed us to reason about software and has provided valuable insight into the design of programming languages, but few semantic descriptions of full languages have been published, and hardly any of these are currently available online.
One of the major approaches to formal semantics is denotational semantics, developed by Scott and Strachey in the late 1960s. Why has such a theoretically attractive approach been found impractical for describing full-scale programming languages? Does it help much to use monads in denotational descriptions, or is a more radical change needed? How might efficient online access to a repository of semantic descriptions be provided? Could it ever become as easy to generate efficient compilers and interpreters from semantic descriptions as it already is to generate parsers from grammars? This talk addresses such questions, and gives some grounds for optimism about the development of highly practical, online semantic descriptions.
(This is a preliminary version of a talk to be given as a BCS-FACS Evening Seminar in London next month.)
|16-02-06||Oliver Kullmann and Matthew Henderson: A generative library for generalised SAT solving - Software engineering, algorithms and mathematics|
We give a short introduction to what the OKlibrary can do (for the world and for you, in the future) and where we are now.
|09-02-06||Will Harwood: Ordinal Bounds for Dickson's Lemma|
In a previous seminar it was claimed -- with handwaving but little actual mathematics -- that the ordinal bound on non-dominating sequences of n-tuples is $\omega^n$ (a tighter version of Dickson's Lemma, which simply states that any such sequence is finite). This Thursday's seminar will furnish a proof.
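To fix terminology: a sequence of n-tuples is non-dominating ("bad") if no earlier tuple is componentwise below a later one, and Dickson's Lemma says every such sequence is finite. A small Python checker (our own illustration, not part of the talk) makes the notion concrete:

```python
def dominates(s, t):
    """Componentwise s <= t."""
    return all(a <= b for a, b in zip(s, t))

def is_bad(seq):
    """No earlier tuple is dominated by a later one."""
    return not any(dominates(seq[i], seq[j])
                   for i in range(len(seq))
                   for j in range(i + 1, len(seq)))

# A bad sequence of pairs: each step trades one coordinate off against the other.
print(is_bad([(3, 0), (2, 5), (1, 7), (0, 9)]))  # True
print(is_bad([(1, 2), (0, 5), (2, 3)]))          # False: (1,2) <= (2,3)
```

The ordinal $\omega^n$ then measures how long such bad sequences of n-tuples can be, which is the tighter bound the talk sets out to prove.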
|talks in 2005||coordinated by Markus Michelbrink|
|Markus Michelbrink||A final coalgebra theorem in intensional type theory|
|A Buchholz derivation system for the ordinal analysis of KP+Pi3-reflection|
|Oliver Kullmann||Minimally unsatisfiable clause sets|
|Linear algebra on conflicts in clause sets|
|Toni Setzer||Inductive recursive definitions|
|Monika/Oliver/Uli||Selected topics in Ramsey Theory|
|coordination from May 2006||Markus Michelbrink|
|coordination until April 2006||Monika Seisenberger and Will Harwood|