281005  Norbert Preining (Università di Siena): Many-valued logics - a short introduction and case study
The talk will start with a short introduction to the history and development of many-valued logics, from Aristotle to recent developments. The second part of the talk will - as a case study - discuss the relation between two non-classical logics, the logic of linearly ordered Kripke frames and Gödel logics. The talk is intended for a broad audience; no special knowledge beyond basic mathematics and classical logic is necessary to follow it. 
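For readers unfamiliar with Gödel logics: in the standard real-valued semantics, formulas take truth values in the interval [0,1], conjunction and disjunction are minimum and maximum, and implication is 1 whenever the antecedent's value does not exceed the consequent's. A minimal sketch of these textbook truth functions (not taken from the talk itself):

```python
# Standard [0,1]-valued Goedel logic connectives (textbook semantics).
def g_and(a, b):   # conjunction: minimum
    return min(a, b)

def g_or(a, b):    # disjunction: maximum
    return max(a, b)

def g_impl(a, b):  # implication: 1 if a <= b, otherwise b
    return 1.0 if a <= b else b

def g_not(a):      # negation, defined as a -> 0
    return g_impl(a, 0.0)
```

Note that `g_not(g_not(a))` is 1 for every a > 0, so double-negation elimination fails - one simple way to see that Gödel logics differ from classical logic.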
101105  Roger Hindley: A New Old Typability Algorithm 
031105  Will Harwood: Approximant Collapse, Part II 
281005  Will Harwood: Approximant Collapse
Bisimilarity, a canonical notion of equivalence between processes, is defined coinductively, but may be approached, and even reached, by its inductively defined approximants. We take a simple process algebra, the Basic Parallel Processes, and investigate the level at which an approximation becomes the thing itself. 
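The approximant hierarchy described above can be computed directly on a finite labelled transition system: start from the relation that equates all states and repeatedly refine it until a fixpoint (bisimilarity) is reached. A toy sketch, assuming a hypothetical representation of an LTS as a map from states to sets of (action, successor) pairs:

```python
# Compute the approximants ~0, ~1, ~2, ... of bisimilarity on a finite LTS.
def approximants(states, trans):
    rel = {(p, q) for p in states for q in states}   # ~0 relates everything
    levels = [rel]
    while True:
        def matches(p, q, r):
            # every move of p is matched by an equally-labelled move of q into r
            return all(any(a == b and (p1, q1) in r for (b, q1) in trans[q])
                       for (a, p1) in trans[p])
        nxt = {(p, q) for (p, q) in rel
               if matches(p, q, rel) and matches(q, p, rel)}
        if nxt == rel:
            return levels                            # fixpoint = bisimilarity
        rel = nxt
        levels.append(rel)
```

On the classic pair p = a.b.0 + a.c.0 versus q = a.(b.0 + c.0), the states p and q are still related at level 1 but are separated at level 2; for finite-state systems the hierarchy always stabilises, while the talk's question is at which level this happens for Basic Parallel Processes.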
080705  Workshop on Better-Quasi-orderings (BQOs)
Speakers are

270605  Joint PCV and SSRS seminar: 
300605  Joint PCV and SASS workshop: Presentations from Bergen University, Norway

240605  Monika Seisenberger and Ulrich Berger: Program extraction from proofs, Part II
We explain the logical foundations of optimized program extraction and demonstrate the actual extraction of the normalization program from Tait's proof in the Minlog system. The talk will be self-contained. 
230605  Joint PCV and SSRS seminar: 
200605  Joint PCV and SSRS seminar: This talk was originally intended as a gentle introduction to category theory, occasioned by Jose Fiadeiro's forthcoming visit. However, during my preparations two things became clear to me:

260505  Ulrich Berger: Program extraction from proofs
I present a non-trivial example of an efficient program extracted from a proof. The example concerns strong normalization for the simply typed lambda calculus using Tait's computability predicates. The program machine-extracted from the formal proof turns out to be `normalization by evaluation', which is known as a very elegant and efficient method for computing the normal form of a lambda term. In the talk I will also briefly explain the background of the method of program extraction from proofs and highlight some of the main problems and pitfalls in this area in general, and in this case study in particular. This is joint work with Pierre Letouzey and Helmut Schwichtenberg (LMU Munich). 
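To give a flavour of the technique mentioned in the abstract: normalization by evaluation computes normal forms by interpreting terms into semantic values (Python functions here) and then "reifying" those values back into syntax. A minimal untyped sketch for illustration only - the term encoding and fresh-name scheme are our own assumptions, not the talk's formalization, and it assumes source terms avoid the names x0, x1, ...:

```python
# Terms: ('var', x) | ('lam', x, body) | ('app', f, a)
def eval_(t, env):
    if t[0] == 'var':
        return env[t[1]]
    if t[0] == 'lam':
        _, x, body = t
        return lambda v: eval_(body, {**env, x: v})  # semantic function
    f = eval_(t[1], env)
    a = eval_(t[2], env)
    return f(a) if callable(f) else ('app', f, a)    # neutral application

def reify(v, fresh=0):
    if callable(v):                   # read back a function at a fresh variable
        x = 'x%d' % fresh             # assumes input avoids names x0, x1, ...
        return ('lam', x, reify(v(('var', x)), fresh + 1))
    if v[0] == 'app':                 # neutral application: reify both parts
        return ('app', reify(v[1], fresh), reify(v[2], fresh))
    return v                          # neutral variable

def normalize(t):
    return reify(eval_(t, {}))
```

For example, normalizing the application of the identity combinator to itself yields the identity again, with freshly generated bound-variable names.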
110505  Oliver Kullmann: On a theoretical foundation of heuristics for branching, Part II 
060505  Oliver Kullmann: On a theoretical foundation of heuristics for branching
I consider algorithms which have to solve a certain problem P, and do so by creating new problems P_1, ..., P_k (of the same kind) out of P, solving the P_i, and combining the solutions for P_1, ..., P_k into a solution for P. (This is like the divide-and-conquer strategy, but here the subproblems can have any size, and the typical run will need exponential time.) I assume that the combination step is cheap, so we ignore its cost. What concerns us is the choice of the "branching" (P_1, ..., P_k), for which I assume a list of candidates is given (which can all have different lengths k). The theory I'm envisaging is a theory about this choice. A main stipulation is that of a "distance" d(P, P_i), a positive real number, which measures how much "easier" P_i is than P. One part of the theory is concerned with the choice of d. For my talk (perhaps two talks would be better) I want to assume that d is given. Then each possible branching (P_1, ..., P_k) yields a "branching tuple" (d(P, P_1), ..., d(P, P_k)), a k-dimensional vector of positive real numbers. The next stipulation is that of a "projection" rho, which maps branching tuples to positive real numbers; the branching tuple with "lowest cost" (as measured by rho) will be considered the best. The main question for my talk is the choice of rho. Given for example three branching tuples
I will present an axiomatic framework for projections, from which it follows that "in general" there is exactly one solution, one "generic projection". In the above example, this generic projection says that number 3 is slightly better than number 1, which is much better than number 2. 
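The three example tuples mentioned in the abstract are missing from this record. As an illustration of the framework, Kullmann's published work on branching heuristics uses the tau function as the generic projection: for a branching tuple t = (d_1, ..., d_k), tau(t) is the unique x > 1 with sum_i x^(-d_i) = 1, and a smaller tau value means a better branching. A sketch under that reading (our assumption, not necessarily the talk's exact definition), computing tau by bisection:

```python
# tau(t): the unique x > 1 with sum_i x**(-d_i) == 1, for k >= 2 distances.
def tau(t):
    assert len(t) >= 2 and all(d > 0 for d in t)
    lo, hi = 1.0, 2.0
    while sum(hi ** -d for d in t) > 1:   # grow hi until the root is bracketed
        hi *= 2
    for _ in range(200):                  # bisection on [lo, hi]
        mid = (lo + hi) / 2
        if sum(mid ** -d for d in t) > 1:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

For instance, tau((1, 1)) = 2 (a binary branching with unit distances), and tau((1, 2)) is the golden ratio; larger distances give smaller, i.e. better, tau values.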
150405  Markus Michelbrink: Interfaces as games, programs as strategies, Part II 
110405  Markus Michelbrink: Interfaces as games, programs as strategies
Peter Hancock and Anton Setzer developed the notion of interface to introduce interactive programming into dependent type theory. We generalise their notion and obtain an even simpler definition of interfaces. With this definition the relationship of interfaces to games becomes apparent: from a game-theoretical point of view, interfaces are nothing but special games, and programs are strategies for these games. There is an obvious notion of refinement which coincides exactly with the intuition of what a refinement should do. Interfaces together with the refinement relation form a complete lattice, in the sense that every family of interfaces has a least common refinement, and there is an interface such that all members of the family refine it and it is a refinement of every other interface with this property. A program/strategy on a refinement of an interface gives a strategy on the interface. We can define several operators on interfaces: tensor, par, choice, bang, et cetera. Every notion has a dual notion obtained by interchanging the viewpoints of player and opponent. Lollipop is defined as usual in terms of tensor and negation. We define the notion of a fair strategy on $A \multimap B$: a fair strategy is essentially a pair of a strategy and a proof that this strategy eventually plays in both games. By a slight restriction of the notion of interface we are able to define a composition of fair strategies on $A \multimap B$ and $B \multimap C$; the resulting strategy is again fair. We can define fair strategies on $A \multimap A$ and $A \otimes (A \multimap B) \multimap B$; these strategies are versions of copy-cat strategies. Identifying strategies up to some kind of behavioural equivalence, we expect to obtain a linear category; however, we have not yet checked all the details. All results so far can be achieved in intensional Martin-Löf Type Theory and are verified in the theorem prover Agda. 
There are numerous links to other areas, including refinement calculus, linear logic, game semantics, formal topology, process algebra and so on. We have not yet explored all of these relationships, but intend to do so in the future. 
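The "programs as strategies" idea can be illustrated very loosely outside type theory: a program repeatedly issues commands and reacts to the environment's responses. The following is only a Python analogy (generators as strategies, a function as the opponent), not the authors' Agda formalization; the counter interface and its command names are invented for the example:

```python
# Play a strategy (a generator yielding commands, receiving responses)
# against an environment (a function from commands to responses).
def run(strategy, oracle, steps):
    gen = strategy()
    cmd = next(gen)
    trace = []
    for _ in range(steps):
        resp = oracle(cmd)          # opponent's move
        trace.append((cmd, resp))
        cmd = gen.send(resp)        # player's next move, given the response
    return trace

def counter_oracle():
    """A hypothetical counter environment: 'inc' bumps, 'get' reads."""
    state = {'n': 0}
    def oracle(cmd):
        if cmd == 'inc':
            state['n'] += 1
            return 'ok'
        return state['n']           # cmd == 'get'
    return oracle

def client():
    """A strategy: increment twice, then keep reading the counter."""
    resp = yield 'inc'
    resp = yield 'inc'
    while True:
        resp = yield 'get'
```

Running `run(client, counter_oracle(), 3)` yields the interaction trace of three rounds; a refinement of the interface would restrict which commands or responses may occur, and a strategy on the refinement is in particular a strategy on the original interface.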
050405  Stefano Berardi: Semantics of Backtracking 
18032005  Anton Setzer: Ordinal systems, Part III 
11032005  Anton Setzer: Ordinal systems, Part II 
04032005  Anton Setzer: Ordinal systems
The development of ordinal notation systems is the art of counting as far as possible. Ordinal notation systems can be used to measure the consistency strength of strong theories and to separate them. They allow one to compare theories coming from completely different paradigms, which cannot be interpreted into each other. By showing the well-foundedness of an ordinal notation system one exhausts the strength of a theory, using, sometimes for the first time, the full power available in that theory. Usually ordinal notation systems are presented in a very concrete way. This is a good approach for weak ordinal notation systems, say up to the ordinal $\epsilon_0$ (the strength of Peano Arithmetic) or $\Gamma_0$ (sometimes considered the limit of predicativity). It is not difficult to introduce ordinal notation systems of this strength, and many researchers are able to count that far. The area of ordinals beyond $\Gamma_0$ has, however, been accessible only to very few researchers. We believe that in order to make those stronger ordinal notation systems accessible to a wider audience, it is necessary to develop notations which reveal more of the underlying structure. In this talk we present an approach for the development of ordinal notation systems up to the strength of $ID_1$, the theory of one inductive definition, which is far beyond $\Gamma_0$. Our presentation gives the ordinal notation systems more structure. There will be two orderings: the usual one, and one which corresponds to the order in which elements are introduced. There is as well a function which gives the components of an ordinal notation, and some conditions which need to be fulfilled. Then, using the fact that if a set is well-ordered then the set of pairs over it and the set of descending sequences in it are well-ordered, we will be able to set up a machinery which generates the ordinals of this system in order, and therefore shows, in a hopefully intuitive way, that it is well-ordered. 
In a continuation of this talk, we will count further, up to one recursively Mahlo ordinal, which reaches the limit of currently published predicatively acceptable extensions of Martin-Löf type theory. 
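To make "a very concrete presentation" tangible at the weakest level the abstract mentions: ordinals below $\epsilon_0$ can be written in Cantor normal form as $\omega^{e_1} + \omega^{e_2} + \dots$ with non-increasing exponents, each exponent again such a notation, and the ordering is decidable by recursive lexicographic comparison. A small sketch of this standard system (only for illustration at the $\epsilon_0$ level; the talk's systems reach far beyond it):

```python
# An ordinal below epsilon_0 is a tuple of exponents (e1 >= e2 >= ...),
# each exponent itself such a tuple, denoting omega^e1 + omega^e2 + ...
# () is 0; ((),) is 1; (((),),) is omega.
def less(a, b):
    # lexicographic comparison, comparing exponents recursively
    for x, y in zip(a, b):
        if less(x, y):
            return True
        if less(y, x):
            return False
    return len(a) < len(b)   # a is a proper prefix of b
```

With `zero = ()`, `one = (zero,)` and `w = (one,)` one can check, for example, that 0 < 1 < omega < omega + 1 < omega * 2 < omega^omega under this ordering.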
01032005  Arnold Beckmann: Proofs ARE programs, Part II 
23022005  Arnold Beckmann: Proofs ARE programs
In this talk I will briefly discuss how proofs in certain subtheories of Peano Arithmetic (known as Bounded Arithmetic) can be viewed as programs. One concern will be the complexity of the resulting program, which corresponds to the strength of the theory (we will deal with algorithms/theories related to the complexity class P). 
03022005  Initial meeting
We discussed the format of the seminar and agreed that it should be a high-level research seminar where we can dive directly into our research to present and discuss our latest ideas. We also fixed a provisional title for this seminar: Proofs, Complexity and Verification. Participants were: Anton Setzer, Monika Seisenberger, Markus Roggenbach, Peter Mosses, Faron Moller, Markus Michelbrink, Oliver Kullmann, Magne Haveraaen, Ulrich Berger, Arnold Beckmann 
Markus Michelbrink: Equality in Martin-Löf Type Theory 
coordination  Markus Michelbrink 
webpage  Arnold Beckmann 