Dave A. D. Tompkins
University of British Columbia
Publications
Featured research published by Dave A. D. Tompkins.
Principles and Practice of Constraint Programming | 2002
Frank Hutter; Dave A. D. Tompkins; Holger H. Hoos
In this paper, we study the approach of dynamic local search for the SAT problem. We focus on the recent and promising Exponentiated Sub-Gradient (ESG) algorithm, and examine the factors determining the time complexity of its search steps. Based on the insights gained from our analysis, we developed Scaling and Probabilistic Smoothing (SAPS), an efficient SAT algorithm that is conceptually closely related to ESG. We also introduce a reactive version of SAPS (RSAPS) that adaptively tunes one of the algorithm's important parameters. We show that for a broad range of standard benchmark problems for SAT, SAPS and RSAPS achieve significantly better performance than both ESG and the state-of-the-art WalkSAT variant, Novelty+.
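To make the core of SAPS concrete, here is a minimal Python sketch of its clause-weight update, reconstructed from the description above; the parameter values (ALPHA, RHO, P_SMOOTH) and helper names are illustrative, not those of the reference implementation.

```python
import random

ALPHA = 1.3      # multiplicative scaling factor for unsatisfied clauses
RHO = 0.8        # smoothing factor pulling weights toward the mean
P_SMOOTH = 0.05  # probability of performing a smoothing step

def is_satisfied(clause, assignment):
    """A clause is satisfied if any literal agrees with the assignment."""
    return any(assignment[abs(lit)] == (lit > 0) for lit in clause)

def saps_weight_update(clauses, weights, assignment, rng=random):
    """Scale the weights of unsatisfied clauses; occasionally smooth all weights."""
    for i, clause in enumerate(clauses):
        if not is_satisfied(clause, assignment):
            weights[i] *= ALPHA                                  # scaling step
    if rng.random() < P_SMOOTH:
        mean_w = sum(weights) / len(weights)
        for i in range(len(weights)):
            weights[i] = RHO * weights[i] + (1 - RHO) * mean_w   # probabilistic smoothing

# Example: two clauses over variables 1 and 2, in DIMACS-style signed-literal form.
clauses = [(1, 2), (-1, 2)]
weights = [1.0, 1.0]
assignment = {1: True, 2: False}   # leaves (-1, 2) unsatisfied
saps_weight_update(clauses, weights, assignment)
print(weights)                     # the unsatisfied clause's weight grows
```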
Theory and Applications of Satisfiability Testing | 2004
Dave A. D. Tompkins; Holger H. Hoos
In this paper we introduce UBCSAT, a new implementation and experimentation environment for Stochastic Local Search (SLS) algorithms for SAT and MAX-SAT. Based on a novel triggered procedure architecture, UBCSAT provides implementations of numerous well-known and widely used SLS algorithms for SAT and MAX-SAT, including GSAT, WalkSAT, and SAPS; these implementations generally match or exceed the efficiency of the respective original reference implementations. Through numerous reporting and statistical features, including the measurement of run-time distributions, UBCSAT facilitates the advanced empirical analysis of these algorithms. New algorithm variants, SLS algorithms, and reporting features can be added to UBCSAT in a straightforward and efficient way. UBCSAT is implemented in C and runs on numerous platforms and operating systems; it is publicly and freely available at www.satlib.org/ubcsat.
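The triggered procedure architecture can be illustrated with a small sketch: reporting and bookkeeping routines register for named events, and the search loop fires those events. All names below are hypothetical; UBCSAT itself is written in C and differs in the details.

```python
from collections import defaultdict

class TriggerRegistry:
    def __init__(self):
        self._procedures = defaultdict(list)

    def register(self, event, procedure):
        """Attach a procedure to an event; it runs each time the event fires."""
        self._procedures[event].append(procedure)

    def fire(self, event, **context):
        for procedure in self._procedures[event]:
            procedure(**context)

registry = TriggerRegistry()

# A reporting feature subscribes only to the events it needs, so features
# that are switched off never touch the inner search loop.
registry.register("run_end", lambda step, **_: print(f"run finished at step {step}"))

# Skeleton of a search loop firing events at the appropriate points.
registry.fire("run_start", step=0)
for step in range(1, 4):
    registry.fire("pre_flip", step=step)
    # ... flip a variable here ...
    registry.fire("post_flip", step=step)
registry.fire("run_end", step=3)
```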
International Conference on Image Processing | 1999
Dave A. D. Tompkins; Faouzi Kossentini
The emerging JBIG2 standard allows compliant encoders to achieve very high compression rates on bi-level images, especially when images are properly segmented into regions of line-art, halftones and text. We propose a fast method that is very effective at separating text from non-text regions, even when the regions are nonrectangular or have skew. Our method can also detect regions of reverse-coloured text. In most cases, our method increases the compression performance of the encoder. More importantly, our method can improve encoding speeds considerably, often by an order of magnitude.
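One common idea behind this kind of text/non-text separation can be sketched in a few lines: text symbols cluster around a typical glyph size, while halftone and line-art components do not. The sketch below captures only that idea; the paper's actual method (including skew handling and reverse-coloured text detection) is not reproduced, and the 0.3/3.0 thresholds are illustrative, not the paper's values.

```python
from statistics import median

def classify_components(bounding_boxes):
    """Label each (w, h) component bounding box as 'text' or 'non-text'."""
    med_w = median(w for w, h in bounding_boxes)
    med_h = median(h for w, h in bounding_boxes)
    labels = []
    for w, h in bounding_boxes:
        # Components far from the typical glyph size are unlikely to be text.
        if 0.3 * med_w <= w <= 3.0 * med_w and 0.3 * med_h <= h <= 3.0 * med_h:
            labels.append("text")
        else:
            labels.append("non-text")
    return labels

boxes = [(8, 12), (9, 11), (7, 12), (200, 150)]  # three glyphs and a halftone block
print(classify_components(boxes))  # ['text', 'text', 'text', 'non-text']
```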
Theory and Applications of Satisfiability Testing | 2010
Dave A. D. Tompkins; Holger H. Hoos
We introduce a new conceptual model for representing and designing Stochastic Local Search (SLS) algorithms for the propositional satisfiability problem (SAT). Our model can be seen as a generalization of existing variable weighting, scoring and selection schemes; it is based upon the concept of Variable Expressions (VEs), which use properties of variables in dynamic scoring functions. Algorithms in our model are constructed from conceptually separated components: variable filters, scoring functions (VEs), variable selection mechanisms and algorithm controllers. To explore the potential of our model we introduce the Design Architecture for Variable Expressions (DAVE), a software framework that allows users to specify arbitrarily complex algorithms at run-time. Using DAVE, we can easily specify rich design spaces of SLS algorithms and subsequently explore these using an automated algorithm configuration tool. We demonstrate that by following this approach, we can achieve significant improvements over previous state-of-the-art SLS-based SAT solvers on software verification benchmark instances from the literature.
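The VE idea can be sketched as a small expression evaluator over variable properties such as make, break and age (standard SLS notions); the expression syntax and evaluator below are hypothetical simplifications of what DAVE supports.

```python
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def eval_ve(expr, props):
    """Evaluate a VE given as a nested tuple, e.g. ('-', 'make', ('*', 2, 'break'))."""
    if isinstance(expr, tuple):
        op, lhs, rhs = expr
        return OPS[op](eval_ve(lhs, props), eval_ve(rhs, props))
    if isinstance(expr, str):
        return props[expr]       # a variable property looked up at run time
    return expr                  # a numeric constant

# Score variables by make - 2*break (an illustrative weighted score) and pick the best.
ve = ("-", "make", ("*", 2, "break"))
candidates = {
    "x1": {"make": 3, "break": 1, "age": 7},
    "x2": {"make": 2, "break": 0, "age": 2},
}
best = max(candidates, key=lambda v: eval_ve(ve, candidates[v]))
print(best, eval_ve(ve, candidates[best]))  # x2 2
```

Because the expression is plain data, an automated configurator can treat its operators and constants as parameters and search over a rich space of scoring functions, which is the design-space exploration the paper describes.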
Canadian Conference on Artificial Intelligence | 2003
Dave A. D. Tompkins; Holger H. Hoos
In this paper, we study the behaviour of the Scaling and Probabilistic Smoothing (SAPS) dynamic local search algorithm on the unweighted MAX-SAT problem. MAX-SAT is a conceptually simple combinatorial problem of substantial theoretical and practical interest; many application-relevant problems, including scheduling problems or most probable explanation finding in Bayes nets, can be encoded and solved as MAX-SAT. This paper is a natural extension of our previous work, where we introduced SAPS and demonstrated that it is amongst the state-of-the-art local search algorithms for solvable SAT problem instances. We present results showing that SAPS is also very effective at finding optimal solutions for unsatisfiable MAX-SAT instances, and in many cases performs better than state-of-the-art MAX-SAT algorithms, such as the Guided Local Search algorithm by Mills and Tsang [8]. Apart from adjustments to some configuration parameters, we found that SAPS did not require any changes to efficiently solve unweighted MAX-SAT instances. Solving weighted MAX-SAT instances will require a modified SAPS algorithm, and we offer some thoughts on this topic for future research.
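The objective being minimized is straightforward to state in code: the number of unsatisfied clauses in the unweighted case, or their total weight in the weighted case. A small sketch, using the clause and assignment encodings of the earlier sketch and illustrative function names:

```python
def unsat_clauses(clauses, assignment):
    """Yield indices of clauses in which no literal agrees with the assignment."""
    for i, clause in enumerate(clauses):
        if not any(assignment[abs(lit)] == (lit > 0) for lit in clause):
            yield i

def maxsat_cost(clauses, assignment, weights=None):
    """Unweighted cost if weights is None, otherwise the weighted cost."""
    unsat = list(unsat_clauses(clauses, assignment))
    if weights is None:
        return len(unsat)
    return sum(weights[i] for i in unsat)

clauses = [(1,), (-1,), (1, 2)]            # (x1) and (not x1) cannot both hold
assignment = {1: True, 2: False}
print(maxsat_cost(clauses, assignment))                   # 1
print(maxsat_cost(clauses, assignment, [1.0, 5.0, 1.0]))  # 5.0
```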
International Conference on Image Processing | 1999
Magesh Valliappan; Brian L. Evans; Dave A. D. Tompkins; Faouzi Kossentini
The JBIG2 standard supports lossless and lossy coding models for text, halftone, and generic regions in bi-level images. For the JBIG2 lossy halftone compression mode, halftones are descreened before encoding. Previous JBIG2 descreening implementations produce high-quality images for clustered dot halftones at high compression rates but significantly degrade the image quality for stochastic halftones, even at much lower rates. In this paper, we develop (1) a flexible, computationally efficient, JBIG2-compliant method for compressing stochastic halftones that reduces noise, artifacts, and blurring; (2) quality measures for linear and nonlinear distortion in compressed halftones; and (3) rate-distortion tradeoffs for the encoder parameters.
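The basic descreening idea, estimating a grayscale value from a window of bi-level pixels, can be sketched with a simple box filter. Real JBIG2 descreening, and the filter-design improvements developed in the paper for stochastic halftones, are considerably more careful than this stand-in.

```python
def descreen(bitmap, cell=2):
    """Average non-overlapping cell x cell blocks of a 0/1 bitmap into [0, 255]."""
    rows, cols = len(bitmap), len(bitmap[0])
    gray = []
    for r in range(0, rows, cell):
        row = []
        for c in range(0, cols, cell):
            block = [bitmap[i][j]
                     for i in range(r, min(r + cell, rows))
                     for j in range(c, min(c + cell, cols))]
            row.append(round(255 * sum(block) / len(block)))
        gray.append(row)
    return gray

halftone = [[0, 1, 1, 1],
            [1, 0, 1, 1],
            [0, 0, 1, 0]]
print(descreen(halftone))  # [[128, 255], [0, 128]]
```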
Canadian Conference on Artificial Intelligence | 2006
Dave A. D. Tompkins; Holger H. Hoos
Stochastic local search (SLS) methods underlie some of the best-performing algorithms for certain types of SAT instances, from both an empirical and a theoretical point of view. By definition and in practice, random decisions are an essential ingredient of SLS algorithms. In this paper we empirically analyse the role of randomness in these algorithms. We first study the effect of the quality of the underlying random number sequence on the behaviour of well-known algorithms such as Papadimitriou's algorithm and Adaptive Novelty+. Our results indicate that while extremely poor quality random number sequences can have a detrimental effect on the behaviour of these algorithms, there is no evidence that the use of standard pseudo-random number generators is problematic. We also investigate the amount of randomness required to achieve the typical behaviour of these algorithms using derandomisation. Our experimental results indicate that the performance of SLS algorithms for SAT is surprisingly robust with respect to the number of random decisions made by an algorithm.
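The experimental setup can be sketched as an SLS algorithm written against an abstract random source, so that the quality and quantity of randomness can be varied independently of the search logic; here a Papadimitriou-style random walk with a pluggable generator, using illustrative class and function names.

```python
import random

class FixedSequence:
    """A 'derandomised' source replaying a precomputed list of floats in [0, 1)."""
    def __init__(self, values):
        self.values, self.i = values, 0
    def random(self):
        v = self.values[self.i % len(self.values)]
        self.i += 1
        return v

def random_walk_sat(clauses, n_vars, rng, max_steps=1000):
    """Flip a variable from a random unsatisfied clause until none remain."""
    assignment = {v: rng.random() < 0.5 for v in range(1, n_vars + 1)}
    for _ in range(max_steps):
        unsat = [c for c in clauses
                 if not any(assignment[abs(l)] == (l > 0) for l in c)]
        if not unsat:
            return assignment
        clause = unsat[int(rng.random() * len(unsat))]   # pick an unsatisfied clause
        lit = clause[int(rng.random() * len(clause))]    # flip one of its variables
        assignment[abs(lit)] = not assignment[abs(lit)]
    return None

clauses = [(1, 2), (-1, 2), (1, -2)]
print(random_walk_sat(clauses, 2, random))                          # standard PRNG
print(random_walk_sat(clauses, 2, FixedSequence([0.1, 0.6, 0.9])))  # replayed bits
```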
Learning and Intelligent Optimization | 2014
Sam Bayless; Dave A. D. Tompkins; Holger H. Hoos
The propositional satisfiability problem (SAT) is one of the most prominent and widely studied NP-hard problems. The development of SAT solvers, whether it is carried out manually or through the use of automated design tools such as algorithm configurators, depends substantially on the sets of benchmark instances used for performance evaluation. Since the supply of instances from real-world applications of SAT is limited, and artificial instance distributions such as Uniform Random k-SAT are known to have markedly different structure, there has been a long-standing interest in instance generators capable of producing 'realistic' SAT instances that could be used during development as proxies for real-world instances. However, it is not obvious how to assess the quality of the instances produced by any such generator. We propose a new approach for evaluating the usefulness of an arbitrary set of instances as proxies during solver development, and introduce a new metric, Q-score, to quantify this. We apply our approach to several artificially generated and real-world benchmark sets and quantitatively compare their usefulness for developing competitive SAT solvers.
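The shape of such an evaluation can be sketched schematically: configure a solver on a candidate (proxy) set, measure that configuration on the target set, and compare against configuring on the target set directly. The functions below are toy stand-ins for an algorithm configurator and an evaluation harness; the exact Q-score definition is given in the paper and is not reproduced here.

```python
def configure(params_space, instances):
    """Stand-in for an automated configurator: pick the best-performing params."""
    return min(params_space, key=lambda p: performance(p, instances))

def performance(params, instances):
    """Stand-in: mean runtime of a solver with `params` over `instances`."""
    return sum(runtime(params, inst) for inst in instances) / len(instances)

def runtime(params, inst):
    # Toy model: runtime depends on how well the params match the instance type.
    return abs(params - inst)

def proxy_quality(params_space, proxy_set, target_set):
    """Ratio in (0, 1]: 1.0 means the proxy set is a perfect stand-in."""
    via_proxy = performance(configure(params_space, proxy_set), target_set)
    direct = performance(configure(params_space, target_set), target_set)
    return direct / via_proxy

params_space = [0.0, 0.5, 1.0]
print(proxy_quality(params_space, proxy_set=[0.4, 0.6], target_set=[0.9, 1.1]))
```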
Data Compression Conference | 1999
Dave A. D. Tompkins; Faouzi Kossentini
The Joint Bi-Level Expert Group (JBIG), an international study group affiliated with the ISO/IEC and ITU-T, has recently completed a committee draft of the JBIG2 standard for lossy and lossless bi-level image compression. We study design considerations for a purely lossless encoder. First, we outline the JBIG2 bitstream, focusing on the options and parameters available to an encoder. Then, we present numerous lossless encoder design strategies, including lossy to lossless coding approaches. For each strategy, we determine the compression performance, and the execution times for both encoding and decoding. The strategy that achieved the highest compression performance in our experiment used a double dictionary approach, with a residue cleanup. In this strategy, small and unique symbols were coded as a generic region residue. Only repeated symbols or those used as a basis for soft matches were added to a dictionary, with the remaining symbols embedded as refinements in the symbol region segment. The second dictionary was encoded as a refinement-aggregate dictionary, where dictionary symbols were encoded as refinements of symbols from the first dictionary, or previous entries in the second dictionary. With all other bitstream parameters optimized, this strategy can easily achieve an additional 30% compression over simpler symbol dictionary approaches. Next, we continue the experiment with an evaluation of each of the bitstream options and configuration parameters, and their impact on complexity and compression. We also demonstrate the consequences of choosing incorrect parameters. We conclude with a summary of our compression results, and general recommendations for encoder designers.
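The symbol-classification step of the strategy described above can be sketched as follows. Symbols are matched by exact equality here, whereas a real JBIG2 encoder uses pattern-matching scores and also promotes symbols that serve as soft-match bases; the threshold and function names are illustrative.

```python
from collections import Counter

def partition_symbols(symbols, min_pixels=16):
    """Split symbols into dictionary entries, generic residue, and refinements."""
    counts = Counter(symbols)
    dictionary, residue, refinements = [], [], []
    for sym in counts:
        area = sum(sum(row) for row in sym)   # number of black pixels
        if counts[sym] > 1:
            dictionary.append(sym)            # repeated: worth a dictionary entry
        elif area < min_pixels:
            residue.append(sym)               # small and unique: generic residue
        else:
            refinements.append(sym)           # large and unique: refinement coding
    return dictionary, residue, refinements

# Symbols as tuples of 0/1 row tuples so they are hashable.
glyph = ((1, 1), (1, 0), (1, 1))              # appears twice below
speck = ((1,),)                               # a single stray pixel
blob = tuple((1,) * 5 for _ in range(5))      # large one-off component
d, r, f = partition_symbols([glyph, glyph, speck, blob])
print(len(d), len(r), len(f))                 # 1 1 1
```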
Theory and Applications of Satisfiability Testing | 2011
Dave A. D. Tompkins; Adrian Balint; Holger H. Hoos