Lars Nolle
Nottingham Trent University
Publications
Featured research published by Lars Nolle.
Advances in Engineering Software | 2005
Lars Nolle; Ivan Zelinka; Adrian A. Hopgood; Alec Goodyear
In this article, the performance of a self-organizing migration algorithm (SOMA), a new stochastic optimization algorithm, has been compared with simulated annealing (SA) and differential evolution (DE) for an engineering application. This application is the automated deduction of 14 Fourier terms in a radio-frequency (RF) waveform to tune a Langmuir probe. Langmuir probes are diagnostic tools used to determine the ion density and the electron energy distribution in plasma processes. RF plasmas are inherently non-linear, and many harmonics of the driving fundamental can be generated in the plasma. RF components across the ion sheath formed around the probe distort the measurements made. To improve the quality of the measurements, these RF components can be removed by an active-compensation method. In this research, this was achieved by applying an RF signal to the probe tip that matches both the phase and amplitude of the RF signal generated from the plasma. Here, seven harmonics are used to generate the waveform applied to the probe tip. Therefore, 14 mutually interacting parameters (seven phases and seven amplitudes) had to be tuned on-line. In previous work SA and DE were applied successfully to this problem, and hence were chosen to be compared with the performance of SOMA. In this application domain, SOMA was found to outperform SA and DE.
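The migration loop at the heart of SOMA can be sketched as follows. This is a minimal AllToOne variant minimising a stand-in sphere function over 14 parameters rather than the Langmuir-probe objective; the population size, step, path length and PRT values are illustrative assumptions, not the settings used in the paper.

```python
import random

def sphere(x):
    # Stand-in objective: sum of squares, minimum 0 at the origin.
    return sum(v * v for v in x)

def soma_all_to_one(f, dim, pop_size=10, path_length=3.0, step=0.3,
                    prt=0.3, migrations=30, bounds=(-5.0, 5.0), seed=0):
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    cost = [f(ind) for ind in pop]
    for _ in range(migrations):
        leader = min(range(pop_size), key=lambda i: cost[i])
        for i in range(pop_size):
            if i == leader:
                continue
            best_x, best_c = pop[i], cost[i]
            t = step
            while t <= path_length:
                # PRT vector: each dimension moves toward the leader
                # only with probability prt.
                mask = [1.0 if rng.random() < prt else 0.0 for _ in range(dim)]
                cand = [min(hi, max(lo,
                        pop[i][d] + (pop[leader][d] - pop[i][d]) * t * mask[d]))
                        for d in range(dim)]
                c = f(cand)
                if c < best_c:
                    best_x, best_c = cand, c
                t += step
            pop[i], cost[i] = best_x, best_c
    best = min(range(pop_size), key=lambda i: cost[i])
    return pop[best], cost[best]

# 14 parameters, mirroring the seven phases and seven amplitudes tuned on-line.
best_x, best_c = soma_all_to_one(sphere, dim=14)
```

Each individual samples candidate positions along the path toward the current leader and keeps the best one found, so its cost never increases between migrations.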
International Conference on Pattern Recognition | 2010
Chaw Chia; Nasser Sherkat; Lars Nolle
Owing to its effectiveness and ease of implementation, the Sum rule has been widely applied in the biometric research field. Different kinds of matcher information have been used as weighting parameters in the weighted Sum rule. In this work, a new weighting parameter has been devised to reduce the overlap between the genuine and impostor score distributions. It is shown that, among the commonly used kinds of matcher information, the width of the overlap region gives the best generalization performance as a weighting parameter. Furthermore, it is illustrated that the equally weighted Sum rule can generally perform better than the Equal Error Rate and d-prime weighted Sum rules. The publicly available NIST-BSSR1 multimodal biometric and XM2VTS score sets were used for evaluation.
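The weighted Sum rule itself is simple to sketch. The scores and overlap widths below are invented for illustration; only the idea of weighting each matcher by the inverse of its genuine/impostor overlap-region width follows the abstract.

```python
def weighted_sum_fusion(scores, weights):
    """Weighted Sum rule: the fused score is the weighted mean of the
    per-matcher scores (assumed already normalised to [0, 1])."""
    total = sum(weights)
    return sum(s * w for s, w in zip(scores, weights)) / total

def overlap_width_weights(overlaps):
    """Weighting sketch: a matcher whose genuine and impostor score
    distributions overlap over a narrower region is treated as more
    reliable, so each matcher is weighted by 1 / overlap width."""
    return [1.0 / w for w in overlaps]

# Two hypothetical matchers with overlap-region widths 0.4 and 0.1.
weights = overlap_width_weights([0.4, 0.1])
fused = weighted_sum_fusion([0.7, 0.9], weights)
```

The second matcher's narrower overlap gives it the larger weight, pulling the fused score toward its output.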
Knowledge Based Systems | 2007
Lars Nolle
The quality of steel strip produced in a wide strip rolling mill depends heavily on the careful selection of initial ground work roll profiles for each of the mill stands in the finishing train. In the past, these profiles were determined by human experts, based on their knowledge and experience. In this research, SASS, a novel heuristic optimisation algorithm that has only one control parameter, has been used to find the optimum profiles for a simulated rolling mill. The best set of profiles found by SASS clearly outperformed the original set and performed as well as conventional algorithms, without the need to find a suitable set of control parameters.
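A single-control-parameter search of this flavour can be sketched as a hill climber whose step size adapts itself: it widens after improvements and narrows after failures, leaving only the iteration budget for the user to choose. This is a hedged sketch inspired by the description above, not an implementation of SASS itself.

```python
import random

def self_adaptive_search(f, x0, iterations=2000, seed=1):
    """Hill climber with a self-adaptive step size. The step size is
    doubled after a successful trial and shrunk after a failure, so the
    only control parameter left is the iteration budget."""
    rng = random.Random(seed)
    x = list(x0)
    fx = f(x)
    step = 1.0
    for _ in range(iterations):
        cand = [v + rng.gauss(0.0, step) for v in x]
        fc = f(cand)
        if fc < fx:
            x, fx = cand, fc
            step *= 2.0    # success: widen the search
        else:
            step *= 0.95   # failure: narrow the search
        step = min(max(step, 1e-9), 10.0)
    return x, fx

# Stand-in objective in place of the simulated rolling-mill model.
x, fx = self_adaptive_search(lambda v: sum(t * t for t in v), [3.0, -4.0])
```

The success/failure step adaptation plays the role of the hand-tuned step schedules that conventional algorithms would need.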
Knowledge Based Systems | 2008
Lars Nolle
In the oceanic context, the aim of Target Motion Analysis (TMA) is to estimate the state, i.e. the location, bearing and velocity, of a sound-emitting object. These estimates are based on a series of passive measurements of both the angle and the distance between an observer and the source of sound, which is called the target. These measurements are corrupted by noise and false readings, which are perceived as outliers. Usually, sequences of measurements are taken, and statistical methods, like the Least Squares method or the Annealing M-Estimator, are applied to estimate the target's state by minimising the residual in range and bearing over a series of measurements. In this project, an ACO-Estimator, a novel hybrid optimisation algorithm based on Ant Colony Optimisation, has been developed and applied to the TMA problem, and its effectiveness was compared with that of standard estimators. It was shown that the new algorithm outperforms conventional estimators by successfully removing outliers from the measurements.
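The underlying idea of removing outliers before a least-squares fit can be shown with a toy estimator. The ACO-Estimator in the paper selects inliers via Ant Colony Optimisation; the residual-threshold rejection below is only an assumed stand-in, and the straight-line track and data values are invented for illustration.

```python
def fit_line(points):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def robust_fit(points, threshold):
    """Toy robust estimator: fit once, discard points whose residual
    exceeds the threshold, then refit on the remaining inliers."""
    a, b = fit_line(points)
    inliers = [(x, y) for x, y in points if abs(y - (a * x + b)) <= threshold]
    return fit_line(inliers)

# Track y = 2x + 1 observed exactly, plus one gross outlier (a false reading).
data = [(float(x), 2.0 * x + 1.0) for x in range(10)] + [(5.0, 40.0)]
a, b = robust_fit(data, threshold=3.0)
```

The single false reading pulls the plain least-squares fit away from the track, while the refit on inliers recovers it exactly.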
International Conference on Innovative Techniques and Applications of Artificial Intelligence | 2007
Lars Nolle
In the oceanic context, the aim of Target Motion Analysis (TMA) is to estimate the state, i.e. the location, bearing and velocity, of a sound-emitting object. These estimates are based on a series of passive measurements of both the angle and the distance between an observer and the source of sound, which is called the target. These measurements are corrupted by noise and false readings, which are perceived as outliers.
Plasma Sources Science and Technology | 2004
Jafar Al-Kuzee; T Matsuura; Alec Goodyear; Lars Nolle; Adrian A. Hopgood; Philip Picton; N St J Braithwaite
This paper presents several approaches that have been used to control, optimize and characterize a low-pressure (10–300 mTorr) plasma processing system. Methods such as contour following and differential evolution have been used to find contours of DC bias, total ion flux, ion energy flux, quadrupole mass spectrum (QMS) intensity ratios and line intensity ratios of the optical emission spectrum (OES) in argon and nitrogen plasmas. A mapping of a 4 × 4 multi-dimensional parameter space is also presented, in which the relationship between four control parameters (power, pressure, and the mass flow rates of two supplied gases) and four measurement outputs (DC bias, ion flux, QMS ratios and OES line intensity ratios) is determined for a plasma etching process. The use of these methods significantly reduces the time needed to re-configure the processing system and will benefit the transfer of processes between different systems. A similar approach has also been used to quickly find an optimum condition for directional etching of a silicon wafer.
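Differential evolution, one of the search methods named above, can be sketched in its basic DE/rand/1/bin form. The objective here is a stand-in quadratic around a hypothetical (power, pressure) operating point; the bounds, population size, F and CR values are illustrative assumptions, not the settings used on the plasma rig.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=60, seed=2):
    """Minimal DE/rand/1/bin: mutate with a scaled difference of two
    random individuals, crossover with the current one, keep the trial
    if it is no worse."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = []
            for d in range(dim):
                if rng.random() < CR or d == j_rand:
                    v = pop[a][d] + F * (pop[b][d] - pop[c][d])
                else:
                    v = pop[i][d]
                lo, hi = bounds[d]
                trial.append(min(hi, max(lo, v)))  # clamp to the box
            fc = f(trial)
            if fc <= cost[i]:
                pop[i], cost[i] = trial, fc
    best = min(range(pop_size), key=lambda i: cost[i])
    return pop[best], cost[best]

# Stand-in objective: squared distance of (power, pressure) from a
# hypothetical operating point (60 W, 150 mTorr).
x, fx = differential_evolution(
    lambda v: (v[0] - 60.0) ** 2 + (v[1] - 150.0) ** 2,
    bounds=[(0.0, 100.0), (10.0, 300.0)])
```

On the real system, the objective evaluation would be a plasma measurement rather than a closed-form function, which is why reducing the number of evaluations matters.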
Engineering Optimization | 2007
Lars Nolle; Gerald Schaefer
Often in engineering systems, full-colour images have to be displayed on limited hardware, for example on mobile devices or embedded systems that can only handle a limited number of colours. Therefore, an image is converted into an indexed map whose indices point to specific colours in a fixed-size colour map generated for that image. The choice of an optimal colour map, or palette, is therefore crucial, as it directly determines the quality of the resulting image. Typically, standard quantization algorithms are used to create colour maps. Whereas these algorithms employ domain-specific knowledge, in this work a variant of simulated annealing (SA) was employed as a standard black-box optimization algorithm for colour map generation. The main advantage of black-box optimization algorithms is that they do not require any domain-specific knowledge, yet are able to provide a near-optimal solution. The effectiveness of the approach is evaluated by comparing its performance with that of several specialized colour quantization algorithms. The results obtained show that, even without any domain-specific knowledge, the SA-based algorithm is able to outperform standard quantization algorithms. To further improve performance, the SA technique was combined with a standard k-means clustering technique. This hybrid quantization algorithm is shown to outperform all the other algorithms and hence to provide images of superior quality.
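The k-means refinement stage of such a hybrid can be sketched on a toy pixel set. In the paper, k-means refines a palette found by SA; here, as an assumption, the centroids simply start from random pixels, and the tiny synthetic "image" is invented for illustration.

```python
import random

def kmeans_palette(pixels, k=4, iterations=20, seed=3):
    """K-means clustering of RGB pixels: the resulting centroids form
    the colour map (palette)."""
    rng = random.Random(seed)
    centres = [list(p) for p in rng.sample(pixels, k)]
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in pixels:
            nearest = min(range(k), key=lambda i: sum(
                (p[d] - centres[i][d]) ** 2 for d in range(3)))
            clusters[nearest].append(p)
        for i, cl in enumerate(clusters):
            if cl:  # keep an empty cluster's centre unchanged
                centres[i] = [sum(p[d] for p in cl) / len(cl) for d in range(3)]
    return centres

def quantize(pixel, palette):
    """Index of the nearest palette colour for a pixel."""
    return min(range(len(palette)), key=lambda i: sum(
        (pixel[d] - palette[i][d]) ** 2 for d in range(3)))

# Toy 'image': pixels scattered around red, green, blue and near-white.
base = [(250, 10, 10), (10, 250, 10), (10, 10, 250), (245, 245, 245)]
pixels = [tuple(min(255, max(0, c + o)) for c in b)
          for b in base for o in (-5, 0, 5)]
palette = kmeans_palette(pixels, k=4)
idx = quantize((255, 0, 0), palette)
```

The quantized image then stores only `idx`-style palette indices per pixel, which is exactly what the limited hardware displays.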
Systems, Man and Cybernetics | 2006
Tarek A. El-Mihoub; Lars Nolle; Gerald Schaefer; Tomoharu Nakashima; Adrian A. Hopgood
Color palettes are inherent to color-quantized images and represent the range of possible colors in such images. When converting full true-color images to palettized counterparts, the color palette should be chosen so as to minimize the resulting distortion compared to the original. In this paper, we show that, in contrast to previous approaches to color quantization, which rely on either heuristics or clustering techniques, a generic optimization algorithm such as a self-adaptive hybrid genetic algorithm can be employed to generate a palette of high quality. Experiments on a set of standard test images using a novel self-adaptive hybrid genetic algorithm show that this approach is capable of outperforming several conventional color quantization algorithms and provides superior image quality.
Expert Systems | 2014
Tariq Tashan; Tony Allen; Lars Nolle
This paper presents a multi-level speaker verification system that uses 64 discrete Fourier transform (DFT) spectrum components as input feature vectors. A speech activity detection technique is used as a pre-processing stage to identify vowel phoneme boundaries within a speech sample. A modified self-organising map (SOM) is then used to filter the speech data, using cluster information extracted from three vowels for a claimed speaker. This SOM filtering stage also provides coarse speaker verification. Finally, a second speaker verification level of three multi-layer perceptron networks classifies the filtered frames provided by the SOMs. These multi-layer perceptrons work as fine-grained vowel-based speaker verifiers. The proposed verification algorithm achieves a performance of 94.54% when evaluated using 50 speakers from the Centre for Spoken Language Understanding speaker verification database. In addition, it is shown that the novel DFT spectrum-based linear correlation pre-processing technique presented here provides the system with greater robustness against changes in speech volume levels than an equivalent energy frame analysis.
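A 64-component DFT spectrum feature vector of the kind described can be sketched directly from the definition of the discrete Fourier transform. The frame length, sampling rate, and peak normalisation below are assumptions for illustration, not the paper's exact front-end.

```python
import cmath
import math

def dft_spectrum_features(frame, n_components=64):
    """Magnitudes of the first n_components DFT bins of a frame,
    normalised to unit maximum (normalisation is an assumption)."""
    n = len(frame)
    mags = []
    for k in range(n_components):
        s = sum(frame[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        mags.append(abs(s))
    peak = max(mags) or 1.0
    return [m / peak for m in mags]

# Synthetic 'vowel' frame: a pure tone at 437.5 Hz sampled at 8 kHz over
# 128 samples, so its energy falls exactly on DFT bin 7.
fs, f0, n = 8000, 437.5, 128
frame = [math.sin(2 * math.pi * f0 * t / fs) for t in range(n)]
feats = dft_spectrum_features(frame)
```

Because the features are normalised per frame, a quieter rendition of the same sound yields the same vector, which is the kind of volume robustness the abstract refers to.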
International Conference on Innovative Techniques and Applications of Artificial Intelligence | 2012
Giovanna Martínez-Arellano; Lars Nolle; John A. Bland
Numerical weather prediction models can produce wind speed forecasts at a very high spatial resolution. However, running these models at that level of precision is time- and resource-consuming. In this paper, the integration of the Weather Research and Forecasting – Advanced Research WRF (WRF-ARW) mesoscale model with four different downscaling approaches is presented. Three of the proposed methods are mathematically based approaches that require a predefined model. The fourth approach, based on genetic programming (GP), implicitly finds the optimal model with which to downscale WRF forecasts, so no previous assumptions about the model need to be made. WRF-ARW forecasts and observations at three different sites in the state of Illinois in the USA are analysed before and after applying the downscaling techniques. The results show that GP is able to successfully downscale the wind speed predictions, significantly reducing the inherent error of the numerical models.