Publication


Featured research published by Herbert Wiklicky.


IEEE Computer Security Foundations Symposium | 2002

Approximate non-interference

Alessandra Di Pierro; Chris Hankin; Herbert Wiklicky

We address the problem of characterising the security of a program against unauthorised information flows. Classical approaches are based on non-interference models which depend ultimately on the notion of process equivalence. In these models confidentiality is an absolute property stating the absence of any illegal information flow. We present a model in which the notion of non-interference is approximated in the sense that it allows for some exactly quantified leakage of information. This is characterised via a notion of process similarity which replaces the indistinguishability of processes by a quantitative measure of their behavioural difference. Such a quantity is related to the number of statistical tests needed to distinguish two behaviours. We also present two semantics-based analyses of approximate non-interference and we show that one is a correct abstraction of the other.
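
As a rough sketch of the central definition (the notation here is assumed for illustration, not quoted from the paper): writing ⟦P⟧ for the linear operator the semantics associates with a process P, and ‖·‖ for a suitable operator norm, approximate non-interference replaces exact indistinguishability by an ε-similarity:

```latex
% Illustrative notation: epsilon-similarity in place of exact equivalence.
P \sim_{\varepsilon} Q \;\iff\; \bigl\| \llbracket P \rrbracket - \llbracket Q \rrbracket \bigr\| \le \varepsilon
```

Exact non-interference is then the special case ε = 0, while a small positive ε quantifies an admissible leak.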


Journal of Logic and Computation | 2005

Probabilistic λ-calculus and Quantitative Program Analysis

Alessandra Di Pierro; Chris Hankin; Herbert Wiklicky

We show how the framework of probabilistic abstract interpretation can be applied to statically analyse a probabilistic version of the λ-calculus. The resulting analysis allows for a more speculative use of its outcomes based on the consideration of statistically defined quantities. After introducing a linear operator based semantics for our probabilistic λ-calculus Λp, and reviewing the framework of abstract interpretation and strictness analysis, we demonstrate our technique by constructing a probabilistic (first-order) strictness analysis for Λp.
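
For context, probabilistic abstract interpretation (PAI) replaces the Galois connections of classical abstract interpretation with linear abstraction operators and their Moore-Penrose pseudoinverses. A minimal numerical sketch of that construction (the concrete operator and the abstraction below are invented for illustration):

```python
import numpy as np

# Concrete semantics: a stochastic operator on a 4-element state space
# (rows sum to 1; the entries are invented for this example).
T = np.array([[0.2, 0.8, 0.0, 0.0],
              [0.0, 0.5, 0.5, 0.0],
              [0.0, 0.0, 0.3, 0.7],
              [0.0, 0.0, 0.0, 1.0]])

# Abstraction: collapse states {0, 1} and {2, 3} into two abstract states.
A = np.array([[1.0, 0.0],
              [1.0, 0.0],
              [0.0, 1.0],
              [0.0, 1.0]])

# PAI: the abstract operator is pinv(A) . T . A, where pinv(A) is the
# Moore-Penrose pseudoinverse of the abstraction matrix.
T_abs = np.linalg.pinv(A) @ T @ A
print(T_abs)  # a 2x2 stochastic operator on the abstract space
```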


International Conference on Concurrency Theory | 2003

Quantitative relations and approximate process equivalences

Alessandra Di Pierro; Chris Hankin; Herbert Wiklicky

We introduce a characterisation of probabilistic transition systems (PTS) in terms of linear operators on some suitably defined vector space representing the set of states. Various notions of process equivalences can then be re-formulated as abstract linear operators related to the concrete PTS semantics via a probabilistic abstract interpretation. These process equivalences can be turned into corresponding approximate notions by identifying processes whose abstract operators “differ” by a given quantity, which can be calculated as the norm of the difference operator. We argue that this number can be given a statistical interpretation in terms of the tests needed to distinguish two behaviours.
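
A minimal numerical illustration of the quantity involved (the matrices are invented): represent two PTSs as stochastic matrices and take the norm of their difference as the ε by which the corresponding processes fail to be equivalent.

```python
import numpy as np

# Two PTSs on the same 3-state space, written as stochastic matrices.
P = np.array([[0.5, 0.5, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
Q = np.array([[0.4, 0.6, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# The approximate equivalence identifies P and Q up to the norm of the
# difference operator; the induced infinity norm is used here.
eps = np.linalg.norm(P - Q, ord=np.inf)
print(eps)  # 0.2 -- the two behaviours differ by at most 0.2
```

At ε = 0 this collapses back to the exact equivalence.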


Theoretical Computer Science | 2005

Measuring the confinement of probabilistic systems

Alessandra Di Pierro; Chris Hankin; Herbert Wiklicky

In this paper we lay the semantic basis for a quantitative security analysis of probabilistic systems by introducing notions of approximate confinement based on various process equivalences. We recast the operational semantics classically expressed via probabilistic transition systems (PTS) in terms of linear operators and we present a technique for defining approximate semantics as probabilistic abstract interpretations of the PTS semantics. An operator norm is then used to quantify this approximation. This provides a quantitative measure ε of the indistinguishability of two processes and therefore of their confinement. In this security setting, a statistical interpretation is then given of the quantity ε, relating it to the number of tests needed to breach the security of the system.
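
The flavour of that statistical interpretation can be conveyed with a standard hypothesis-testing bound (a textbook estimate used here for illustration, not a formula quoted from the paper): distinguishing two behaviours whose observable probabilities differ by ε, at a fixed confidence level, requires a number of tests of the order of

```latex
n = O\!\left(\frac{1}{\varepsilon^{2}}\right)
```

so a smaller confinement measure ε forces an attacker to run correspondingly more tests.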


Workshop on Functional and Constraint Logic Programming | 2002

Probabilistic Constraint Handling Rules

Thom W. Frühwirth; Alessandra Di Pierro; Herbert Wiklicky

Classical Constraint Handling Rules (CHR) provide a powerful tool for specifying and implementing constraint solvers and programs. The rules of CHR rewrite constraints (non-deterministically) into simpler ones until they are solved. In this paper we introduce an extension of CHR, namely Probabilistic CHR (PCHR). These allow the probabilistic “weighting” of rules, specifying the probability of their application. In this way we are able to formalise various randomised algorithms, such as Simulated Annealing. The implementation is based on source-to-source transformation (STS). Using a recently developed prototype for STS for CHR, we were able to implement probabilistic CHR concisely, in a few lines of code and in less than one hour.
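
A toy simulation of the rule-weighting idea, in Python rather than CHR (the rules, weights, and store representation are invented for illustration): among the applicable rules, one is chosen with probability proportional to its weight.

```python
import random

# Two PCHR-style rules rewriting a "coin" constraint, each carrying the
# probability of its application (syntax and weights invented).
rules = [
    (0.5, ("coin", "heads")),  # coin <=> heads, with probability 0.5
    (0.5, ("coin", "tails")),  # coin <=> tails, with probability 0.5
]

def step(store):
    applicable = [(w, r) for w, r in rules if r[0] in store]
    if not applicable:
        return store  # no rule applies: the store is solved
    weights = [w for w, _ in applicable]
    _, (lhs, rhs) = random.choices(applicable, weights=weights, k=1)[0]
    return (store - {lhs}) | {rhs}

print(step({"coin"}))  # {'heads'} or {'tails'}, each with probability 1/2
```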


International Conference on Information and Communication Security | 2008

Quantifying Timing Leaks and Cost Optimisation

Alessandra Di Pierro; Chris Hankin; Herbert Wiklicky

We develop a new notion of security against timing attacks where the attacker is able to simultaneously observe the execution time of a program and the probability of the values of low variables. We then show how to measure the security of a program with respect to this notion via a computable estimate of the timing leakage and use this estimate for cost optimisation.
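
An illustrative (not the paper's) way to see the attacker model: a secret-dependent early exit makes execution time observable, and comparing mean timings over repeated runs gives a crude estimate of the leak. All names and timings below are invented.

```python
import statistics
import time

def check(secret, guess):
    # Hypothetical early-exit comparison: the loop runs longer the more
    # leading characters match, so timing reveals the secret prefix.
    for a, b in zip(secret, guess):
        if a != b:
            return False
        time.sleep(0.001)  # stand-in for per-character work
    return True

def mean_time(secret, guess, runs=20):
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        check(secret, guess)
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples)

# A wrong first character returns quickly; a correct prefix takes longer.
# The gap between the two means estimates the timing leak.
print(mean_time("s3cret", "x3cret"))
print(mean_time("s3cret", "s3cret"))
```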


Journal of Functional Programming | 2005

Quantitative static analysis of distributed systems

Alessandra Di Pierro; Chris Hankin; Herbert Wiklicky

We introduce a quantitative approach to the analysis of distributed systems which relies on a linear operator based network semantics. A typical problem in a distributed setting is how information propagates through a network, and a typical qualitative analysis is concerned with establishing whether some information will eventually be transmitted from one node to another node in the network. The quantitative approach we present allows us to obtain additional information such as an estimate of the probability that some data is transmitted within a given interval of time. We formalise situations like this using a probabilistic version of a process calculus which is the core of KLAIM, a language for distributed and mobile computing based on interactions through distributed tuple spaces. The analysis we present exploits techniques based on Probabilistic Abstract Interpretation and is characterised by compositional aspects which greatly simplify the inspection of the interactions between nodes and the detection of information propagation through a computer network.
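
The kind of quantity such an analysis yields can be mimicked with a toy Markov-chain model of a network (states and probabilities invented): iterating the transition operator gives the probability that a message has been delivered within a given number of steps.

```python
import numpy as np

# States: 0 = at source, 1 = in transit, 2 = delivered (absorbing).
T = np.array([[0.3, 0.7, 0.0],
              [0.1, 0.5, 0.4],
              [0.0, 0.0, 1.0]])

dist = np.array([1.0, 0.0, 0.0])  # the message starts at the source
for step in range(1, 6):
    dist = dist @ T
    print(f"P(delivered within {step} steps) = {dist[2]:.3f}")
```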


Mathematical Foundations of Computer Science | 1998

Probabilistic Concurrent Constraint Programming: Towards a Fully Abstract Model

Alessandra Di Pierro; Herbert Wiklicky

This paper presents a Banach-space-based approach towards a denotational semantics of a probabilistic constraint programming language. This language is based on the concurrent constraint programming paradigm, where randomness is introduced by means of a probabilistic choice construct. As a result, we obtain a declarative framework in which randomised algorithms can be expressed and formalised. The denotational model we present is constructed by using functional-analytic techniques. For example, the existence of fixed points is guaranteed by the Brouwer-Schauder Fixed-Point Theorem. A concrete fixed-point construction is also presented which corresponds to a notion of observables capturing the exact results of both finite and infinite computations.
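
In finite dimension the fixed-point construction can be illustrated by plain iteration of a linear operator on distributions (the operator below is invented; Brouwer-Schauder is needed for the general, infinite-dimensional setting the paper treats):

```python
import numpy as np

# An invented stochastic operator on a 2-element space.
T = np.array([[0.9, 0.1],
              [0.4, 0.6]])

d = np.array([1.0, 0.0])  # initial distribution
for _ in range(100):
    d = d @ T  # iterate towards the fixed point d* = d* @ T

print(d)  # close to [0.8, 0.2], the fixed point of T
```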


Electronic Notes in Theoretical Computer Science | 2005

Continuous-Time Probabilistic KLAIM

Alessandra Di Pierro; Chris Hankin; Herbert Wiklicky

The design of languages supporting network programming is a necessary step towards the formalisation of distributed and mobile computing. The existence of an abstract semantic framework constitutes the basis for a formal analysis of such systems. The KLAIM paradigm [5] provides such a semantic framework by introducing basic concepts and primitives addressing the key aspects of the coordination of interacting located processes. We extend this basic paradigm with probabilistic constructs with the aim of introducing a semantic basis for a quantitative analysis of networks. A quantitative analysis allows in general for the consideration of more “realistic” situations. For example, a probabilistic analysis allows for establishing the security of a system up to a given tolerance factor expressing how much the system is actually vulnerable. This is in contrast to a qualitative analysis, which typically might be used to validate the absolute security of a given system. In a distributed environment quantitative analysis is also of great practical use in the consideration of timing issues which involve the asynchronous communications among processes running with different clocks. In a security setting these issues are relevant e.g. for the analysis and prevention of denial of service attacks, which involve the delaying of time-critical operations [9]. In our probabilistic version of KLAIM, which we call pKLAIM, we introduce probabilities in a number of ways. At the local level, we introduce probabilistic parallel and choice operators. In addition we use probabilistic
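
The continuous-time flavour can be illustrated by the usual race between exponentially distributed delays (the rates below are invented; this is not code from the paper): of two enabled actions, the one whose sampled delay is smaller fires first.

```python
import random

# Two competing actions with invented rates: a successful send and a
# failure, racing via exponentially distributed delays.
rate_send, rate_fail = 2.0, 0.5

t_send = random.expovariate(rate_send)
t_fail = random.expovariate(rate_fail)

# The faster action wins the race; in the long run "send" fires first
# with probability rate_send / (rate_send + rate_fail) = 0.8.
print("send" if t_send < t_fail else "fail", min(t_send, t_fail))
```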


International School on Foundations of Security Analysis and Design | 2001

Two Formal Approaches for Approximating Noninterference Properties

Alessandro Aldini; Mario Bravetti; Alessandra Di Pierro; Roberto Gorrieri; Chris Hankin; Herbert Wiklicky

The formalisation of security properties for computer systems raises the problem of overcoming, also in a formal setting, the classical view according to which confidentiality is an absolute property stating the complete absence of any unauthorised disclosure of information. In this paper, we present two formal models in which the notion of noninterference, which is at the basis of a large variety of security properties defined in the recent literature, is approximated. To this aim, the definition of indistinguishability of process behaviour is replaced by a similarity notion, which introduces a quantitative measure ε of the behavioural difference among processes. The first model relies on a programming paradigm called Probabilistic Concurrent Constraint Programming, while the second one is presented in the setting of a probabilistic process algebra. In both models, appropriate notions of distance provide information (the ε) on the security level of the system at hand, in terms of the capability of an external observer of identifying illegal interferences.

Collaboration


Dive into Herbert Wiklicky's collaborations.

Top Co-Authors

Chris Hankin

Imperial College London

Mieke Massink

National Research Council

Erik P. de Vink

Eindhoven University of Technology
