Aistis Raudys
Vilnius University
Publications
Featured research published by Aistis Raudys.
intelligent information systems | 2007
Zidrina Pabarskaite; Aistis Raudys
This paper presents a comprehensive survey of web log/usage mining based on over 100 research papers, and is the first survey dedicated exclusively to web log/usage mining. The paper identifies several web log mining sub-topics, including specific ones such as data cleaning and user and session identification. Each sub-topic is explained, its strengths and weaknesses are discussed, and possible solutions are presented. The paper also describes examples of web log mining and lists several major web log mining software packages.
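The survey lists session identification among the sub-topics. A common heuristic there is timeout-based sessionization: a gap between two requests longer than a threshold (often 30 minutes) starts a new session. A minimal sketch of that heuristic (the function name and timeout value are illustrative, not from the survey):

```python
from datetime import datetime, timedelta

def sessionize(events, timeout_minutes=30):
    """Group one user's request timestamps into sessions: a gap longer
    than the timeout starts a new session (the classic heuristic; the
    survey discusses several identification methods)."""
    events = sorted(events)
    sessions, current = [], [events[0]]
    for t in events[1:]:
        if t - current[-1] > timedelta(minutes=timeout_minutes):
            sessions.append(current)   # close the previous session
            current = [t]              # start a fresh one
        else:
            current.append(t)
    sessions.append(current)
    return sessions

hits = [datetime(2007, 1, 1, 10, 0), datetime(2007, 1, 1, 10, 5),
        datetime(2007, 1, 1, 12, 0)]
print(len(sessionize(hits)))  # 2 — the two-hour gap splits the sessions
```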
IEEE Transactions on Pattern Analysis and Machine Intelligence | 2010
Sarunas Raudys; Aistis Raudys
A novel loss function for training a net of K single-layer perceptrons (KSLPs) is suggested, into which a pairwise misclassification cost matrix can be incorporated directly. The complexity of the network remains the same, and computing the gradients of the loss function requires no additional calculations. Minimizing the loss requires fewer training epochs. The efficacy of cost-sensitive methods depends on the cost matrix, the overlap of the pattern classes, and the sample sizes. Experiments with real-world pattern recognition (PR) tasks show that the novel loss function usually outperforms three benchmark methods.
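The paper's exact loss is not reproduced here, but one standard way to fold a pairwise misclassification cost matrix directly into a loss is to minimise the expected cost under the network's softmax outputs. A sketch under that assumption (the toy matrix and scores are hypothetical):

```python
import numpy as np

def cost_sensitive_loss(scores, labels, cost):
    """Average expected misclassification cost under softmax outputs.

    scores : (n, K) raw outputs of K single-layer perceptrons
    labels : (n,)   true class indices
    cost   : (K, K) cost[i, j] = cost of deciding j when the truth is i
    """
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    p = e / e.sum(axis=1, keepdims=True)          # softmax probabilities
    return (cost[labels] * p).sum(axis=1).mean()  # expected cost per sample

# toy example: confusing class 0 with class 2 is penalised heavily
cost = np.array([[0., 1., 5.],
                 [1., 0., 1.],
                 [1., 1., 0.]])
scores = np.array([[2.0, 0.5, 0.1],
                   [0.2, 1.5, 0.3]])
labels = np.array([0, 1])
print(round(cost_sensitive_loss(scores, labels, cost), 3))  # ≈ 0.536
```

Because the cost enters only as fixed per-class weights, the gradient has the same shape and cost as an ordinary softmax loss, matching the abstract's claim that no additional calculations are needed.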
ieee conference on computational intelligence for financial engineering economics | 2012
Sarunas Raudys; Aistis Raudys
To improve the portfolio management process, we suggest using the profit histories of automated trading strategies instead of actual assets. Such histories can be generated by simulating hundreds of automated trading strategies (robots). We developed a three-level decision-making system for finding the portfolio weights. At the first level, virtual robots trade the assets; at the second level, virtual profit-fusion agents calculate weighted sums of the profit series created by the first-level robots; at the third level, we rank the fusion agents, select a set of the best ones, and construct the final portfolio. Experiments with real financial data from 2004–2011 confirm the usefulness of the novel approach.
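The three levels above can be sketched with simulated data. This is only a minimal illustration of the structure — simulated robot profits, random fusion weights, and a Sharpe-ratio ranking stand in for the paper's actual components:

```python
import numpy as np

rng = np.random.default_rng(0)
# level 1: 20 robots' daily profit series over 250 days (simulated here)
robot_profits = rng.normal(0.01, 0.1, size=(250, 20))

# level 2: 100 fusion agents, each a weighted sum of the robots' profits
n_agents = 100
w = rng.dirichlet(np.ones(20), size=n_agents)  # each row sums to 1
agent_profits = robot_profits @ w.T            # (250, n_agents)

# level 3: rank agents by Sharpe ratio and keep the best few
sharpe = agent_profits.mean(axis=0) / agent_profits.std(axis=0)
best = np.argsort(sharpe)[-5:]                 # top-5 agents
portfolio_weights = w[best].mean(axis=0)       # equal-weight the winners
print(portfolio_weights.shape)                 # (20,)
```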
Technological and Economic Development of Economy | 2014
Šarūnas Raudys; Aistis Raudys; Židrina Pabarškaitė
To understand large-scale portfolio construction tasks, we analyse sustainable economy problems by splitting large tasks into smaller ones and offer an evolutionary feed-forward system-based approach. The theoretical justification for our solution rests on multivariate statistical analysis of multidimensional investment tasks, in particular on the relations between data size, algorithm complexity and portfolio efficacy. To ease the dimensionality/sample-size problem, a larger task is broken down into smaller parts by means of item similarity (clustering), and the similar sub-problems are given to smaller groups to solve. The groups, however, vary in many respects. Pseudo-randomly formed groups compose a large number of modules of feed-forward decision-making systems. An evolution mechanism forms collections of the best modules for each short time period. Final solutions are carried forward to the global scale, where a collection of the best modules is chosen using a multiclass cost-sensitive perceptron. Co...
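The decomposition step — splitting a large asset universe into similarity clusters and solving each small sub-problem separately — can be sketched as follows. The crude seed-based clustering and equal-weight sub-portfolios are assumptions for illustration, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(1)
returns = rng.normal(0, 0.01, size=(500, 40))   # 500 days, 40 assets (simulated)

# split the large task: group assets by return correlation
corr = np.corrcoef(returns.T)
dist = 1.0 - corr                               # correlation distance
k = 4
seeds = rng.choice(40, size=k, replace=False)   # crude medoid-style seeds
labels = np.argmin(dist[:, seeds], axis=1)      # each asset joins its nearest seed

# solve a small sub-portfolio per cluster, then combine at the global scale
weights = np.zeros(40)
for c in range(k):
    members = np.where(labels == c)[0]
    weights[members] = 1.0 / (k * len(members))  # clusters equally weighted
print(round(weights.sum(), 6))                   # 1.0 — a full portfolio
```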
international conference on image analysis and processing | 2005
Aistis Raudys
We investigate accuracy, neural network complexity and the sample-size problem in multilayer perceptron (MLP) based (neuro-linear) feature extraction. For feature extraction we use the weighted sums calculated in the hidden units of an MLP-based classifier. The extracted features are used for data visualisation in 2-D and 3-D spaces and for interactive formation of the pattern classes. We show analytically how the complexity of the feature extraction algorithm depends on the number of hidden units. The sample size–complexity relations investigated in this paper show that the reliability of neuro-linear feature extraction can become extremely low if the number of new features is too high. Visual interactive inspection of the data projection may help an investigator to look differently at the forecasting problem of financial time series.
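The extraction step itself is simple: the 2-D features are the weighted sums (pre-activations) computed in a two-unit hidden layer. A minimal sketch, assuming a hypothetical already-trained weight matrix (here random, since training is out of scope):

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 10))        # 100 samples, 10 original features

# hypothetical trained MLP hidden layer with 2 units: its weighted sums
# serve as the extracted 2-D features for visualisation
W_hidden = rng.normal(size=(10, 2))
b_hidden = np.zeros(2)

features_2d = X @ W_hidden + b_hidden  # pre-activation weighted sums
print(features_2d.shape)               # (100, 2) — ready for a scatter plot
```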
international conference on information and software technologies | 2013
Aistis Raudys; Vaidotas Lenčiauskas; Edmundas Malčius
Moving averages have long been used for smoothing financial data, and they are among the first indicators in technical analysis trading. Many traders have argued that one moving average is better than another, and as a result a great many moving averages have been created. In this empirical study we review the 19 most popular moving averages, create a taxonomy and compare them using the two most important factors: smoothness and lag. Smoothness indicates how much the indicator changes (its angle), and lag indicates how far the moving average trails behind the current price. The aim is to have values as smooth as possible, to avoid erroneous trades, and with minimal lag, to increase trend-detection speed. This large-scale empirical study, performed on 1850 real-world time series including daily data for stocks, ETFs, Forex and futures, demonstrates that the best smoothness/lag ratio is achieved by the Exponential Hull Moving Average (with price correction) and the Triple Exponential Moving Average (without correction).
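The two factors can be made concrete with simple proxies: smoothness as the mean absolute step-to-step change of the indicator, and lag as the mean absolute gap between the indicator and the current price. These proxy definitions are illustrative; the paper's exact measures may differ:

```python
import numpy as np

def sma(x, n):
    """Simple moving average (trailing window of n values)."""
    return np.convolve(x, np.ones(n) / n, mode="valid")

def ema(x, n):
    """Exponential moving average with the conventional alpha = 2/(n+1)."""
    alpha = 2.0 / (n + 1)
    out = np.empty_like(x)
    out[0] = x[0]
    for i in range(1, len(x)):
        out[i] = alpha * x[i] + (1 - alpha) * out[i - 1]
    return out

def smoothness(ma):
    return np.abs(np.diff(ma)).mean()            # smaller = smoother

def lag(price, ma):
    return np.abs(price[-len(ma):] - ma).mean()  # crude lag proxy

rng = np.random.default_rng(3)
price = np.cumsum(rng.normal(0, 1, 500)) + 100   # simulated daily prices
s, e = sma(price, 20), ema(price, 20)
print(smoothness(s) < smoothness(price))  # True: averaging damps the changes
```

The trade-off the study quantifies is visible here: widening the window improves smoothness but worsens the lag proxy.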
intelligent systems design and applications | 2011
Sarunas Raudys; Aistis Raudys
To use human factors together with financial ones in the portfolio management task, we analyse lengthy series of successes and losses of numerous automated high-frequency trading systems that buy and sell assets. We found that, despite the sparse, bimodal, non-Gaussian time series, modern Markowitz solutions can be applied to weigh the contributions of diverse trading strategies. The training history should be rather short in situations where technological, social, financial, economic and political conditions change swiftly. The algorithm for finding the Markowitz portfolio coefficients can be improved by careful application of regularization and matrix structurization methods.
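One simple form of the covariance regularisation the text mentions is ridge-style shrinkage toward a scaled identity before inverting, which stabilises minimum-variance weights on short histories. A sketch under that assumption (the shrinkage form and parameter are illustrative, not the paper's method):

```python
import numpy as np

def min_variance_weights(returns, shrink=0.1):
    """Minimum-variance Markowitz weights with covariance shrinkage
    toward a scaled identity (one common regularisation choice)."""
    cov = np.cov(returns.T)
    k = len(cov)
    cov = (1 - shrink) * cov + shrink * (np.trace(cov) / k) * np.eye(k)
    inv = np.linalg.inv(cov)          # safe after regularisation
    w = inv @ np.ones(k)              # unnormalised min-variance solution
    return w / w.sum()                # weights sum to 1

rng = np.random.default_rng(4)
profits = rng.normal(0.001, 0.02, size=(120, 8))  # short history, 8 strategies
w = min_variance_weights(profits)
print(round(w.sum(), 6))  # 1.0
```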
biomedical engineering and informatics | 2012
Aistis Raudys; Zidrina Pabarskaite
Markowitz's mean-variance portfolio optimisation is not suitable for a large number of assets because of the unacceptably slow quadratic optimisation procedure involved. This is particularly important in systematic/algorithmic/automated trading applications, where automated trading systems are used instead of assets. We propose a much faster heuristic approach that scales linearly rather than quadratically as the Markowitz method does. Moreover, our proposed approach, Comgen, is on average better than the Markowitz approach when applied to unseen data. Additionally, Comgen always finds a solution, whereas the Markowitz procedure occasionally fails because the covariance matrix is not always positive semidefinite. In an empirical study over a history of about 2000 days, we demonstrate the benefits of this novel approach using about 3200 time series produced by automatic trading systems. We perform a three-year walk-forward analysis and show that in most of the 12 × 3 = 36 out-of-sample months, the novel approach produces a better Sharpe ratio than the Markowitz approach, while at the same time being a thousand times faster (and 2400 times faster if the number of assets is 4000).
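The failure mode mentioned above is easy to reproduce: with ~3200 series over ~2000 days, the assets outnumber the observations, so the sample covariance matrix is rank-deficient (PSD in exact arithmetic but singular, and numerically it can even test indefinite), which breaks the quadratic programme. A small check, with the dimensions scaled down for illustration:

```python
import numpy as np

def is_psd(cov, tol=1e-8):
    """Check positive semidefiniteness via the smallest eigenvalue —
    the condition the text notes the Markowitz procedure relies on."""
    return np.linalg.eigvalsh(cov).min() >= -tol

rng = np.random.default_rng(5)
returns = rng.normal(size=(50, 100))     # fewer observations than assets
cov = np.cov(returns.T)                  # sample covariance, rank-deficient

print(is_psd(cov))                          # True: still PSD here...
print(np.linalg.matrix_rank(cov) < 100)     # True: ...but singular, so the
                                            # quadratic solver cannot invert it
```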
Lecture Notes in Computer Science | 1998
Aistis Raudys
A new nonparametric feature mapping technique for pattern classification is proposed and compared experimentally with principal component and Sammon's mapping methods. We use the mapped training-set vectors for an active weights initialization of the multilayer perceptron classifier in a two-variate mapped space. Simulations have shown the usefulness of the proposed weights initialization method for designing perceptrons when highly nonlinear decision boundaries are needed.
ieee conference on computational intelligence for financial engineering economics | 2014
Aistis Raudys
Moving averages (MAs) are widely used in finance by trend followers. Negative-weight (corrective) moving averages weight old history negatively and attach more weight to recent history in order to achieve a better fit. After analysing such methods, we propose an optimal weighting scheme for smoothing stock price data: for a given smoothness level, we minimise the fitting error. Unlike existing methods, which have predefined weights, the optimal weights are optimised over a set of stocks to achieve the best smoothness/fit ratio. An empirical evaluation of around 2000 real-world stocks from the NASDAQ and NYSE exchanges demonstrates that the novel moving average is better than the other moving averages in 90% of cases. Additional improvements can be made, especially for longer periods. We also discovered that negative weights have quite a small influence on the overall performance of moving averages: the optimised moving-average weights contain only 0% to 12% negative weights.
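The paper's exact optimisation is not reproduced here, but one simple formulation of "minimise fitting error at a given smoothness" is penalised least squares: fit the MA output to the current price while penalising the roughness (second differences) of the weight vector. Everything below — the penalty form, the parameter `lam`, the normalisation — is a hypothetical sketch of that idea:

```python
import numpy as np

def optimal_ma_weights(price, n=20, lam=5.0):
    """Least-squares MA weights trading fit against smoothness.
    Minimises ||Xw - y||^2 + lam * ||Dw||^2, where X holds lagged
    prices, y the current price, and D is the second-difference
    (roughness) operator on the weight vector."""
    X = np.column_stack([price[n - 1 - k:len(price) - k] for k in range(n)])
    y = price[n - 1:]
    D = np.diff(np.eye(n), 2, axis=0)          # second-difference operator
    w = np.linalg.solve(X.T @ X + lam * D.T @ D, X.T @ y)
    return w / w.sum()                          # normalise to unit gain

rng = np.random.default_rng(6)
price = np.cumsum(rng.normal(0, 1, 1000)) + 100  # simulated daily prices
w = optimal_ma_weights(price)
print(len(w), round(w.sum(), 6))  # 20 weights, summing to 1
```

With `lam = 0` the exact-fit solution puts all weight on the current price (zero lag, zero smoothing); raising `lam` spreads the weights, trading fit for smoothness, and nothing prevents some of them from going negative, as in the corrective averages above.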