Nic Wilson
University College Cork
Publications
Featured research published by Nic Wilson.
Journal of Artificial Intelligence Research | 2008
Judy Goldsmith; Jérôme Lang; Miroslaw Truszczynski; Nic Wilson
We investigate the computational complexity of testing dominance and consistency in CP-nets. Up until now, the complexity of dominance has been determined only for restricted classes in which the dependency graph of the CP-net is acyclic. However, there are preferences of interest that define cyclic dependency graphs; these are modeled with general CP-nets. We show here that both dominance and consistency testing for general CP-nets are PSPACE-complete. The reductions used in the proofs are from STRIPS planning, and thus establish strong connections between both areas.
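To make the flip-based semantics concrete, here is a minimal sketch (not from the paper) of dominance testing as reachability under improving flips, using a hypothetical two-variable CP-net whose dependency graph is cyclic. The explicit breadth-first search over outcomes is exponential in the number of variables, which is consistent with, though much weaker than, the PSPACE-completeness result.

```python
from collections import deque

# Toy cyclic CP-net (hypothetical, not from the paper): each variable's
# preference over its own values depends on the other's value, so the
# dependency graph is cyclic -- a "general" CP-net in the paper's sense.
CPNET = {
    "A": {"parents": ("B",),
          "cpt": {("b1",): ["a1", "a2"], ("b2",): ["a2", "a1"]}},
    "B": {"parents": ("A",),
          "cpt": {("a1",): ["b1", "b2"], ("a2",): ["b2", "b1"]}},
}

def improving_flips(outcome):
    """Yield the outcomes reachable by improving one variable's value,
    given the preference order induced by its parents' current values."""
    for var, spec in CPNET.items():
        order = spec["cpt"][tuple(outcome[p] for p in spec["parents"])]
        for better in order[:order.index(outcome[var])]:
            flipped = dict(outcome)
            flipped[var] = better
            yield flipped

def dominates(better, worse):
    """Dominance as reachability: search for a sequence of improving
    flips from `worse` to `better`. Explicit search over outcomes is
    exponential in the number of variables."""
    key = lambda o: tuple(sorted(o.items()))
    start, goal = key(worse), key(better)
    seen, queue = {start}, deque([worse])
    while queue:
        current = queue.popleft()
        if key(current) == goal:
            return True
        for nxt in improving_flips(current):
            if key(nxt) not in seen:
                seen.add(key(nxt))
                queue.append(nxt)
    return False

print(dominates({"A": "a1", "B": "b1"}, {"A": "a2", "B": "b1"}))  # True
```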
Journal of Artificial Intelligence Research | 2007
J. Christopher Beck; Nic Wilson
Most classical scheduling formulations assume a fixed and known duration for each activity. In this paper, we weaken this assumption, requiring instead that each duration can be represented by an independent random variable with a known mean and variance. The best solutions are ones which have a high probability of achieving a good makespan. We first create a theoretical framework, formally showing how Monte Carlo simulation can be combined with deterministic scheduling algorithms to solve this problem. We propose an associated deterministic scheduling problem whose solution is proved, under certain conditions, to be a lower bound for the probabilistic problem. We then propose and investigate a number of techniques for solving such problems based on combinations of Monte Carlo simulation, solutions to the associated deterministic problem, and either constraint programming or tabu search. Our empirical results demonstrate that a combination of the use of the associated deterministic problem and Monte Carlo simulation results in algorithms that scale best both in terms of problem size and uncertainty. Further experiments point to the correlation between the quality of the deterministic solution and the quality of the probabilistic solution as a major factor responsible for this success.
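As a rough illustration of the simulation side of the framework, the sketch below estimates the probability that a fixed single-machine sequence meets a deadline. The normal duration distributions and the toy instance are assumptions made for this example; the framework itself requires only a known mean and variance per activity.

```python
import random

# Fixed processing order on a single machine (a deliberately minimal
# stand-in for the job-shop setting of the paper). Durations are
# independent with known mean and variance; sampling them as normal
# variables is an extra assumption made for this example.
ACTIVITIES = [(10.0, 4.0), (7.0, 1.0), (12.0, 9.0)]  # (mean, variance)

def simulate_makespan(activities):
    """One Monte Carlo realisation of the makespan of the fixed sequence."""
    finish = 0.0
    for mean, var in activities:
        finish += max(0.0, random.gauss(mean, var ** 0.5))
    return finish

def prob_makespan_at_most(activities, deadline, trials=100_000):
    """Estimate P(makespan <= deadline) by Monte Carlo simulation."""
    hits = sum(simulate_makespan(activities) <= deadline
               for _ in range(trials))
    return hits / trials

# Total mean 29 and total variance 14, so P(makespan <= 31) is roughly 0.70.
print(prob_makespan_at_most(ACTIVITIES, deadline=31.0))
```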
Archive | 2000
Nic Wilson
The method of reasoning with uncertain information known as Dempster-Shafer theory arose from Glenn Shafer's reinterpretation and development of the work of Arthur Dempster [Dempster, 1967; 1968] in his book A Mathematical Theory of Evidence [Shafer, 1976] and further publications, e.g., [Shafer, 1981; 1990]. More recent variants of Dempster-Shafer theory include the Transferable Belief Model (see, e.g., [Smets, 1988; Smets and Kennes, 1994]) and the Theory of Hints (see, e.g., [Kohlas and Monney, 1995]).
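For readers new to the formalism, the following sketch implements Dempster's rule of combination for mass functions over a small frame; the weather frame and the numbers are purely illustrative.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: intersect focal elements, pool the products of
    their masses, discard mass on the empty set and renormalise."""
    combined, conflict = {}, 0.0
    for (x, wx), (y, wy) in product(m1.items(), m2.items()):
        z = x & y
        if z:
            combined[z] = combined.get(z, 0.0) + wx * wy
        else:
            conflict += wx * wy
    if conflict >= 1.0:
        raise ValueError("total conflict: combination undefined")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Frame of discernment {rain, sun}; focal elements are frozensets.
rain, sun = "rain", "sun"
m1 = {frozenset({rain}): 0.6, frozenset({rain, sun}): 0.4}
m2 = {frozenset({sun}): 0.3, frozenset({rain, sun}): 0.7}
print(dempster_combine(m1, m2))
```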
Artificial Intelligence | 2008
Juerg Kohlas; Nic Wilson
Local computation in join trees or acyclic hypertrees has been shown to be linked to a particular algebraic structure, called valuation algebra. There are many models of this algebraic structure ranging from probability theory to numerical analysis, relational databases and various classical and non-classical logics. It turns out that many interesting models of valuation algebras may be derived from semiring valued mappings. In this paper we study how valuation algebras are induced by semirings and how the structure of the valuation algebra is related to the algebraic structure of the semiring. In particular, c-semirings with idempotent multiplication induce idempotent valuation algebras and therefore permit particularly efficient architectures for local computation. Also important are semirings whose multiplicative semigroup is embedded in a union of groups. They induce valuation algebras with a partially defined division. For these valuation algebras, the well-known architectures for Bayesian networks apply. We also extend the general computational framework to allow derivation of bounds and approximations, for when exact computation is not feasible.
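A minimal sketch of the semiring-to-valuation-algebra construction: valuations are semiring-valued tables, combination is pointwise semiring multiplication on the union of the variable sets, and marginalisation is semiring addition over the eliminated variables. The instance below uses the (max, ×) semiring on [0, 1]; the data structures and toy tables are assumptions made for the example.

```python
from itertools import product

def combine(f, g, times, domains):
    """Valuation-algebra combination: pointwise semiring multiplication
    on the union of the two variable sets."""
    fv, gv = f["vars"], g["vars"]
    all_vars = fv + tuple(v for v in gv if v not in fv)
    table = {}
    for config in product(*(domains[v] for v in all_vars)):
        asg = dict(zip(all_vars, config))
        table[config] = times(f["table"][tuple(asg[v] for v in fv)],
                              g["table"][tuple(asg[v] for v in gv)])
    return {"vars": all_vars, "table": table}

def marginalise(f, keep, plus):
    """Projection onto `keep`: semiring addition over the eliminated
    variables."""
    kept = tuple(v for v in f["vars"] if v in keep)
    table = {}
    for config, val in f["table"].items():
        asg = dict(zip(f["vars"], config))
        key = tuple(asg[v] for v in kept)
        table[key] = plus(table[key], val) if key in table else val
    return {"vars": kept, "table": table}

# The (max, *) semiring on [0, 1]: multiplication is not idempotent,
# but the semiring still induces a valuation algebra for optimisation.
domains = {"X": (0, 1), "Y": (0, 1)}
f = {"vars": ("X",), "table": {(0,): 0.2, (1,): 0.8}}
g = {"vars": ("X", "Y"), "table": {(0, 0): 0.5, (0, 1): 0.9,
                                   (1, 0): 0.4, (1, 1): 0.6}}
h = combine(f, g, lambda a, b: a * b, domains)
print(marginalise(h, ("Y",), max))   # table: {(0,): 0.32, (1,): 0.48}
```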
Uncertainty in Artificial Intelligence | 1991
Nic Wilson
A very computationally efficient Monte-Carlo algorithm for the calculation of Dempster-Shafer belief is described. If Bel is the combination, using Dempster's rule, of belief functions Bel_1, ..., Bel_m, then, for a subset b of the frame Θ, Bel(b) can be calculated in time linear in |Θ| and m (given that the weight of conflict is bounded). The algorithm can also be used to improve the complexity of the Shenoy-Shafer algorithms on Markov trees, and can be generalised to calculate Dempster-Shafer belief over other logics.
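The following is a simplified Monte-Carlo sketch in the spirit of the abstract, not the paper's exact algorithm: draw one focal element per belief function, intersect, reject conflicting (empty) draws, and count how often the intersection is contained in b. Rejection stays cheap only while the weight of conflict is bounded, matching the caveat above.

```python
import random

def sample_focal(mass):
    """Sample a focal element from a mass function {frozenset: weight}."""
    r, acc = random.random(), 0.0
    for focal, weight in mass.items():
        acc += weight
        if r <= acc:
            return focal
    return focal   # guard against floating-point round-off

def mc_belief(masses, b, trials=50_000):
    """Estimate the combined belief Bel(b): one focal element per belief
    function, intersect, reject conflicting samples, count how many
    surviving intersections are contained in b."""
    hits = valid = 0
    while valid < trials:
        intersection = None
        for mass in masses:
            focal = sample_focal(mass)
            intersection = focal if intersection is None else intersection & focal
            if not intersection:
                break
        if not intersection:
            continue           # conflicting sample: rejected
        valid += 1
        hits += intersection <= b
    return hits / trials

rain, sun = "rain", "sun"
m1 = {frozenset({rain}): 0.6, frozenset({rain, sun}): 0.4}
m2 = {frozenset({sun}): 0.3, frozenset({rain, sun}): 0.7}
print(mc_belief([m1, m2], frozenset({rain})))   # about 0.42/0.82 = 0.51
```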
Artificial Intelligence | 2011
Nic Wilson
A simple logic of conditional preferences is defined, with a language that allows the compact representation of certain kinds of conditional preference statements, a semantics and a proof theory. CP-nets and TCP-nets can be mapped into this logic, and the semantics and proof theory generalise those of CP-nets and TCP-nets. The system can also express preferences of a lexicographic kind. The paper derives various sufficient conditions for a set of conditional preferences to be consistent, along with algorithmic techniques for checking such conditions and hence confirming consistency. These techniques can also be used for totally ordering outcomes in a way that is consistent with the set of preferences, and they are further developed to give an approach to the problem of constrained optimisation for conditional preferences.
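As a small illustration of how conditional preferences can induce a total order on outcomes, the sketch below builds a lexicographic ranking from acyclic conditional preference statements. The two-variable instance and the encoding are illustrative assumptions, not the paper's formal language.

```python
from itertools import product

# Hypothetical acyclic conditional preferences: unconditionally a1 > a2,
# and b1 > b2 if A = a1, but b2 > b1 otherwise.
ORDER_VARS = ("A", "B")   # an ordering compatible with the dependencies

def local_order(var, partial):
    """Preference order (best first) on var's values, given the values
    already assigned to the variables before it."""
    if var == "A":
        return ["a1", "a2"]
    return ["b1", "b2"] if partial["A"] == "a1" else ["b2", "b1"]

def lex_key(outcome):
    """Rank vector for an outcome; comparing these vectors
    lexicographically yields a total order on outcomes consistent with
    the conditional preference statements."""
    partial, key = {}, []
    for var in ORDER_VARS:
        key.append(local_order(var, partial).index(outcome[var]))
        partial[var] = outcome[var]
    return tuple(key)

outcomes = [dict(zip(ORDER_VARS, vals))
            for vals in product(["a1", "a2"], ["b1", "b2"])]
for o in sorted(outcomes, key=lex_key):   # best outcome first
    print(o)
```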
Constraints - An International Journal | 2010
Eugene C. Freuder; Robert Heffernan; Richard J. Wallace; Nic Wilson
We describe a simple CSP formalism for handling multi-attribute preference problems with hard constraints, one that combines hard constraints and preferences so the two are easily distinguished conceptually and for purposes of problem solving. Preferences are represented as a lexicographic order over complete assignments based on variable importance and rankings of values in each domain. Feasibility constraints are treated in the usual manner. Since the preference representation is ordinal in character, these problems can be solved with algorithms that do not require evaluations to be represented explicitly. This includes ordinary CSP algorithms, although these cannot stop searching until all solutions have been checked, with the important exception of heuristics that follow the preference order (lexical variable and value ordering). We describe relations between lexicographic CSPs and more general soft constraint formalisms and show how a full lexicographic ordering can be expressed in the latter. We discuss relations with (T)CP-nets, highlighting the advantages of the present formulation, and we discuss the use of lexicographic ordering in multiobjective optimisation. We also consider strengths and limitations of this form of representation with respect to expressiveness and usability. We then show how the simple structure of lexicographic CSPs can support specialised algorithms: a branch and bound algorithm with an implicit cost function, and an iterative algorithm that obtains optimal values for successive variables in the importance ordering, both of which can be combined with appropriate variable ordering heuristics to improve performance. We show experimentally that with these procedures a variety of problems can be solved efficiently, including some for which the basic lexically ordered search is infeasible in practice.
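A minimal sketch of the key algorithmic point above: with lexical variable and value ordering, the first feasible complete assignment found by depth-first search is lexicographically optimal, so the search can stop immediately. The three-variable instance and its constraints are hypothetical.

```python
# Variables listed in decreasing importance; values listed best-first.
VARS = ["X", "Y", "Z"]
PREF = {"X": [1, 2], "Y": [2, 1], "Z": [1, 2]}

def feasible(asg):
    """Hard constraints (toy example): X and Y differ, and Y + Z != 3."""
    if "X" in asg and "Y" in asg and asg["X"] == asg["Y"]:
        return False
    if "Y" in asg and "Z" in asg and asg["Y"] + asg["Z"] == 3:
        return False
    return True

def lex_best(asg=None, i=0):
    """Depth-first search with lexical variable and value ordering: the
    first feasible complete assignment is lexicographically optimal."""
    asg = {} if asg is None else asg
    if i == len(VARS):
        return dict(asg)
    var = VARS[i]
    for val in PREF[var]:            # try values in preference order
        asg[var] = val
        if feasible(asg):
            result = lex_best(asg, i + 1)
            if result is not None:
                return result
        del asg[var]
    return None

print(lex_best())   # {'X': 1, 'Y': 2, 'Z': 2}
```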
Mathematical Models for Handling Partial Knowledge in Artificial Intelligence | 1995
Serafín Moral; Nic Wilson
The best understood and most highly developed theory of uncertainty is Bayesian probability. There is a large literature on its foundations and there are many different justifications of the theory; however, all of these assume that for any proposition a, the beliefs in a and ¬a are strongly tied together. Without compelling justification, this assumption greatly restricts the type of information that can be satisfactorily represented; e.g., it makes it impossible to represent adequately partial information about an unknown chance distribution P, such as 0.6 ≤ P(a) ≤ 0.8. The strict Bayesian requirement that an epistemic state be a single probability function seems unreasonable. A natural extension of the Bayesian theory is thus to allow sets of probability functions, to consider constraints and bounds on these, and to calculate supremum and infimum values of the probabilities of propositions (known as upper and lower probabilities) given the constraints. Early work on this includes Boole [1] and Good [2], and early appearances in the Artificial Intelligence literature include Quinlan [3] and Nilsson [4].
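Computing such upper and lower probabilities over a finite frame reduces to linear programming. The sketch below, with a hypothetical four-world frame, bounds P(a ∧ c) given 0.6 ≤ P(a) ≤ 0.8 and P(c) = 0.5, using scipy.optimize.linprog.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical four-world frame {w1, w2, w3, w4}; propositions are
# indicator vectors: a = {w1, w2}, c = {w2, w3}.
a = np.array([1.0, 1.0, 0.0, 0.0])
c = np.array([0.0, 1.0, 1.0, 0.0])
target = np.array([0.0, 1.0, 0.0, 0.0])   # P(a and c) = p(w2)

# Constraints: 0.6 <= P(a) <= 0.8 (inequalities), probabilities summing
# to 1 and P(c) = 0.5 (equalities), and each p(w) in [0, 1].
A_ub = np.vstack([a, -a])
b_ub = np.array([0.8, -0.6])
A_eq = np.vstack([np.ones(4), c])
b_eq = np.array([1.0, 0.5])

lower = linprog(target, A_ub=A_ub, b_ub=b_ub,
                A_eq=A_eq, b_eq=b_eq, bounds=(0, 1)).fun
upper = -linprog(-target, A_ub=A_ub, b_ub=b_ub,
                 A_eq=A_eq, b_eq=b_eq, bounds=(0, 1)).fun
print(lower, upper)   # lower = 0.1, upper = 0.5
```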
European Conference on Symbolic and Quantitative Approaches to Reasoning and Uncertainty | 1999
Jérôme Mengin; Nic Wilson
The Local Computation Framework has been used to improve the efficiency of computation in various uncertainty formalisms. This paper shows how it can be applied to logical deduction in first-order logic, modal and conditional logics, circumscription and possibilistic logic.
Annals of Mathematics and Artificial Intelligence | 2010
Mirco Gelain; Maria Silvia Pini; Francesca Rossi; Kristen Brent Venable; Nic Wilson
Constraints and quantitative preferences, or costs, are very useful for modelling many real-life problems. However, in many settings, it is difficult to specify precise preference values, and it is much more reasonable to allow for preference intervals. We define several notions of optimal solutions for such problems, providing algorithms to find optimal solutions and also to test whether a solution is optimal. Most of the time these algorithms just require the solution of soft constraint problems, which suggests that it may be possible to handle this form of uncertainty in soft constraints without significantly increasing the computational effort needed to reason with such problems. This is also supported by experimental results. We also identify classes of problems where the same results hold if users are allowed to use multiple disjoint intervals rather than a single one.
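One way such questions reduce to ordinary soft-constraint solving: for additive (weighted) costs, an assignment is optimal in some scenario iff it is optimal in the single scenario most favourable to it, i.e., its own cost entries at their lower bounds and all other entries at their upper bounds. The brute-force sketch below illustrates this endpoint test on a hypothetical two-variable instance; it is not the paper's algorithm.

```python
from itertools import product

# Hypothetical weighted CSP in which each cost is only known to lie in
# an interval (low, high); a "scenario" fixes every cost in its interval.
DOMAINS = {"X": (0, 1), "Y": (0, 1)}
INTERVALS = {
    ("X",): {(0,): (1, 3), (1,): (2, 2)},
    ("Y",): {(0,): (0, 1), (1,): (1, 2)},
    ("X", "Y"): {(0, 0): (0, 2), (0, 1): (1, 1),
                 (1, 0): (3, 4), (1, 1): (0, 0)},
}

def cost_favouring(asg, favoured):
    """Cost of `asg` in the scenario most favourable to `favoured`: the
    cost entries that `favoured` uses sit at their lower bounds, all
    other entries at their upper bounds."""
    total = 0
    for scope, table in INTERVALS.items():
        key = tuple(asg[v] for v in scope)
        lo, hi = table[key]
        shared = key == tuple(favoured[v] for v in scope)
        total += lo if shared else hi
    return total

def possibly_optimal(asg, assignments):
    """For additive costs this endpoint test is exact: `asg` is optimal
    in some scenario iff it is optimal in its own most favourable one."""
    mine = cost_favouring(asg, asg)
    return all(mine <= cost_favouring(other, asg) for other in assignments)

assignments = [dict(zip(DOMAINS, vals))
               for vals in product(*DOMAINS.values())]
for a in assignments:      # here only (X=1, Y=0) is not possibly optimal
    print(a, possibly_optimal(a, assignments))
```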