There is a collection of (trivial) movement-under-constraint puzzles which are sometimes collectively referred to as "Cabbage and Goat" problems; see, for example, Crossing the River (with a Wolf, a Goat, and a Cabbage). Taking these as motivation, I will recast some major IT challenges in terms of these kinds of problems. The talk will include: the telepathic teleporting goats problem (with and without invisibility); the goat replacement challenge; the goat grooming problem; and the little goat, big yak problem. Each of these represents an example of a major IT challenge, but greatly simplified, allowing greater clarity as to what each problem actually amounts to.
I will also present some patterns of goat interaction which provide an independent model of a more general IT problem.
In the spirit of games, you are encouraged to bring your laptops along to play with each of these challenges - and there will be a small non-pecuniary brassica- or goat-related prize for those who correctly identify the underlying challenge in large IT systems. For those who get the problem wrong there may be a brassica-based forfeit - university regulations permitting.
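For those who would like a head start before bringing their laptops, the original wolf, goat and cabbage puzzle is small enough to solve by exhaustive search. The sketch below is an editorial illustration, not material from the talk: it encodes a state as which bank each of the farmer, wolf, goat and cabbage occupies, and finds a shortest safe plan by breadth-first search.

```python
# The classic crossing puzzle solved as a tiny search problem: a state
# records which bank (0 or 1) each participant is on, a move takes the
# farmer plus at most one item across, and breadth-first search finds a
# shortest plan that never leaves an unsafe pair alone.
from collections import deque

ITEMS = ("farmer", "wolf", "goat", "cabbage")

def safe(state):
    farmer, wolf, goat, cabbage = state
    # The wolf may not be left with the goat, nor the goat with the cabbage.
    if wolf == goat != farmer:
        return False
    if goat == cabbage != farmer:
        return False
    return True

def solve():
    start, goal = (0, 0, 0, 0), (1, 1, 1, 1)
    queue = deque([(start, [])])
    seen = {start}
    while queue:
        state, plan = queue.popleft()
        if state == goal:
            return plan
        farmer = state[0]
        # The farmer crosses alone (i == 0) or with one item on his bank.
        for i, item in enumerate(ITEMS):
            if state[i] != farmer:
                continue
            nxt = list(state)
            nxt[0] = 1 - farmer
            nxt[i] = 1 - farmer
            nxt = tuple(nxt)
            if safe(nxt) and nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, plan + [item]))
    return None

print(solve())  # shortest plan: 7 crossings, starting by taking the goat
```

A plan entry of "farmer" means crossing alone; any shortest solution takes 7 crossings and must begin and end by ferrying the goat.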
Computer displays have come a long way from small CRT monitors to large curved 4K touchscreen panels. Yet the experience of using them has not become much more tangible or realistic: haptic feedback has remained little more than the click of a keypress. In this talk we will see how to embed tangible computing into everyday objects to create future display and haptic technologies for interacting with the digital world. We will see how to manipulate fog, bubbles and levitated beads in mid-air, and bend tabletops and touchpads to visualise and touch 3D surfaces. We will discuss how these techniques might help us improve our experience of interacting with the digital world.
Well-structured pushdown systems are an extension of well-structured transition systems (WSTSs). A WSTS is an infinite-state transition system whose states are well-quasi-ordered; this structure makes several properties decidable, e.g. coverability. We extend WSTSs with a stack, and apply the resulting systems to the decidability of timed pushdown systems.
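As an illustration of the kind of decidability result the talk builds on, the toy sketch below (an editorial example, not the speakers' implementation, and without the stack extension) runs the standard backward coverability algorithm on a two-place vector addition system. Termination of the saturation loop is exactly what the well-quasi-ordering of states guarantees, here via Dickson's lemma on tuples of naturals.

```python
# Backward coverability for a vector addition system (VAS): an
# upward-closed set of markings is represented by its finite basis of
# minimal elements, and we saturate the basis with minimal predecessors.

def leq(a, b):
    """Componentwise order on N^d: a well-quasi-order by Dickson's lemma."""
    return all(x <= y for x, y in zip(a, b))

def coverable(init, transitions, target):
    """Can some run from `init` reach a marking >= `target` componentwise?

    `transitions` are integer vectors; firing delta from marking m
    requires m + delta >= 0 componentwise.
    """
    basis = {tuple(target)}
    changed = True
    while changed:
        changed = False
        for b in list(basis):
            for delta in transitions:
                # Minimal marking that can fire `delta` and land >= b.
                pred = tuple(max(bi - di, 0) for bi, di in zip(b, delta))
                if not any(leq(c, pred) for c in basis):
                    # Keep the basis an antichain: drop subsumed elements.
                    basis = {c for c in basis if not leq(pred, c)}
                    basis.add(pred)
                    changed = True
    # Coverable iff the initial marking lies in the upward-closed set.
    return any(leq(b, init) for b in basis)

# Two places: t1 moves a token from place 0 to place 1,
# t2 consumes a token from place 1 and puts two tokens on place 0.
ts = [(-1, 1), (2, -1)]
print(coverable((1, 0), ts, (3, 0)))         # True: tokens can grow
print(coverable((1, 0), [(-1, 1)], (0, 2)))  # False: one token in total
```

The same skeleton underlies coverability for general WSTSs; only the predecessor computation changes with the model.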
Finding correspondences between two surfaces is a fundamental operation in various applications in computer graphics and related fields. Candidate correspondences can be found by matching local signatures, but as they only consider local geometry, many are globally inconsistent. We provide a novel algorithm to prune a set of candidate correspondences to those most likely to be globally consistent. Our approach can handle articulated surfaces, and ones related by a deformation which is globally non-isometric, provided that the deformation is locally approximately isometric. Our approach uses an efficient diffusion framework, and only requires geodesic distance calculations in small neighbourhoods, unlike many existing techniques which require computation of global geodesic distances. We demonstrate that, for typical examples, our approach provides significant improvements in accuracy, yet also reduces time and memory costs by a factor of several hundred compared to existing pruning techniques. Our method is furthermore insensitive to holes, unlike many other methods.
In this talk I will present my research in computer vision and machine learning, and its two main applications to healthcare and astrophysics. Although these fields sound at first very different, we will see that they often share similar challenges - and solutions. Thus, from modelling the beating heart from MRIs to monitoring the sources of solar wind at the surface of the Sun, or from quantifying the abnormality of a patient's gait to characterising the morphology of galaxies, there is often only a short step.
Researchers must navigate big data: current scientific knowledge includes 50 million published articles. How can a system help a researcher find relevant documents in their field? We introduce SciNet, an interactive search system that anticipates the user's search intents by estimating them from the user's interaction with the interface. The estimated intents are visualized on an intent radar, a radial layout that organizes potential intents as directions in the information space. The system assists users in directing their search by allowing feedback to be targeted at keywords representing the potential intents; users provide feedback by moving the keywords on the intent radar. The system then learns and visualizes improved estimates and corresponding documents. The resulting user models are explicit, open user models curated by the user during interactive information seeking. SciNet has been shown to significantly improve users' task performance and the quality of retrieved information without compromising task execution time. We also show how user models learned in SciNet can help cold-start recommendation in another system, the CoMeT talk management system, via cross-system user model transfer.
Jaakko Peltonen is an Associate Professor of statistics (data analysis) at the School of Information Sciences, University of Tampere, Finland where he leads the Statistical Machine Learning and Exploratory Data Analysis group. He is an associate editor of Neural Processing Letters, an editorial board member of Heliyon, and member of the European Neural Network Society executive committee. He has served in organizing committees of seven international conferences and one international summer school, has served in program committees of 31 international conferences/workshops and has performed referee duties for numerous international journals and conferences. He is an expert in statistical machine learning methods for exploratory data analysis, nonlinear dimensionality reduction for visualization of data, and learning from multiple sources.
The talk will review a number of parallel algorithms and concurrent data structures in the LTSmin toolset for high-performance model checking. The most important underlying data structure is a concurrent hash table, made lockless for maximal scalability. On top of that we implemented parallel Nested Depth-First-Search for LTL model checking, and parallel operations on Binary Decision Diagrams for symbolic model checking. Recently, we investigated parallel decomposition into Strongly Connected Components, to facilitate LTL model checking based on Generalised Buechi and Rabin automata.
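To make the nested depth-first search concrete, here is a minimal sequential sketch; it is an editorial illustration of the textbook algorithm, not the parallel, lockless LTSmin implementation the talk describes. The outer ("blue") DFS explores the state space, and on backtracking from an accepting state a nested ("red") DFS looks for a cycle through it.

```python
# Nested DFS for LTL model checking: report whether a Buechi automaton
# has a reachable accepting cycle, i.e. an accepting lasso.

def nested_dfs(initial, successors, accepting):
    """Return True iff some accepting state lies on a reachable cycle."""
    blue, red = set(), set()

    def red_dfs(s, seed):
        # Inner search: does some path from s close a cycle at `seed`?
        for t in successors(s):
            if t == seed:
                return True
            if t not in red:
                red.add(t)
                if red_dfs(t, seed):
                    return True
        return False

    def blue_dfs(s):
        blue.add(s)
        for t in successors(s):
            if t not in blue and blue_dfs(t):
                return True
        # Post-order: start the nested search from accepting states,
        # reusing the red set across seeds keeps the whole search linear.
        if accepting(s):
            red.add(s)
            if red_dfs(s, s):
                return True
        return False

    return blue_dfs(initial)

# Toy automaton: 0 -> 1 -> 2 -> 1, with state 1 accepting.
graph = {0: [1], 1: [2], 2: [1]}
print(nested_dfs(0, lambda s: graph[s], lambda s: s == 1))  # True
print(nested_dfs(0, lambda s: graph[s], lambda s: s == 0))  # False
```

The parallel variants discussed in the talk share work between threads through the concurrent hash table storing the blue and red colourings.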
Prof van de Pol is interested in model checking, theorem proving and testing techniques for the analysis of safety, dependability and security aspects of software-intensive embedded systems.
His two main lines of research are symbolic methods and high-performance computing:
Symbolic methods: These greatly extend the scope of application of model checking. With his PhD students he developed the first abstraction method for process algebra with data on transitions. Other successes were the application of confluence and static analysis to obtain enormous state space reductions. In the last 5 years, these techniques were extended to probabilistic (Markovian) and timed systems.
High-performance computing: With his group, he investigates new parallel graph algorithms to generate and analyse astronomically large state spaces. This resulted in LTSmin, the world-leading tool in high-performance verification, and led to numerous publications in top conferences (CAV, TACAS, FM, FMCAD, ATVA). He and his group won several first prizes in the RERS and MCC competitions. He played a major role in organising the PDMC series on parallel and distributed model checking.
Together with industrial and scientific partners, he analyses their complex systems. One example is the challenging safety analysis of ERTMS, the new European Railway Traffic Management System (FP7 INESS). He also applied verification to the control software of the Large Hadron Collider (CERN, with TU/e).