Pakize Taylan
Dicle University
Publication
Featured research published by Pakize Taylan.
Inverse Problems in Science and Engineering | 2012
Gerhard-Wilhelm Weber; İnci Batmaz; Gülser Köksal; Pakize Taylan; Fatma Yerlikaya-Özkurt
Regression analysis is a widely used statistical method for modelling relationships between variables. Multivariate adaptive regression splines (MARS) are especially useful for high-dimensional problems and for fitting nonlinear multivariate functions. A special advantage of MARS lies in its ability to estimate the contributions of the basis functions so that both additive and interactive effects of the predictors are allowed to determine the response variable. The MARS method consists of two parts: a forward and a backward algorithm. Through these algorithms, it seeks to achieve two objectives: a good fit to the data and, at the same time, a simple model. In this article, we formulate a penalized residual sum of squares for MARS as a Tikhonov regularization problem and treat it with continuous optimization techniques, in particular, the framework of conic quadratic programming. We call this new approach CMARS and consider it an important complementary, model-based alternative to the backward stepwise algorithm. The performance of CMARS is evaluated on data sets with different features, and the results are discussed.
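For illustration, a hedged sketch of the formulation described in this abstract: with the MARS basis functions collected in a matrix B, coefficient vector α, and a matrix L discretizing the curvature of the basis functions (all symbols illustrative, not taken from the paper), the penalized residual sum of squares can be viewed as a Tikhonov problem and recast as a conic quadratic program,
\[
\mathrm{PRSS}(\alpha) = \lVert y - B\alpha \rVert_2^{2} + \lambda \lVert L\alpha \rVert_2^{2}
\quad\leadsto\quad
\min_{t,\alpha}\; t \quad \text{s.t.}\quad \lVert y - B\alpha \rVert_2 \le t,\;\; \lVert L\alpha \rVert_2 \le \sqrt{M},
\]
where M bounds the regularization term; each constraint is a second-order cone, which is what makes interior point methods applicable.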
Optimization | 2007
Pakize Taylan; Gerhard-Wilhelm Weber; A. Beck
Generalized additive models belong to modern techniques from statistical learning and are applicable in many areas of prediction, e.g., financial mathematics, computational biology, medicine, chemistry and environmental protection. In these models, the expectation of the response is linked to the predictors via a link function. They are fitted through the local scoring algorithm, using a scatterplot smoother as building block, as proposed by Hastie and Tibshirani (1987). In this article, we first give a short introduction and review. Then, we present a mathematical modeling by splines based on a new clustering approach for the input data x, their density, and the variation of the output y. We contribute to regression with generalized additive models by bounding (penalizing) the second-order terms (curvature) of the splines, leading to a more robust approximation. Previously, in [23], we proposed a refining modification and investigation of the backfitting algorithm applied to additive models. Because of drawbacks of the modified backfitting algorithm, we now solve this problem using continuous optimization techniques, which we expect to become an important complementary technology and alternative to the concept of the modified backfitting algorithm. In particular, we model and treat the constrained residual sum of squares within the elegant framework of conic quadratic programming. This study was carried out as part of Pakize Taylan's postdoc at METU in the DOSAP program.
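As a hedged illustration of the curvature penalty mentioned above (notation assumed, not taken from the article), the additive model fit can be written as a penalized residual sum of squares with one smoothness term per predictor,
\[
\min_{\beta_0, f_1,\dots,f_p}\; \sum_{i=1}^{N}\Bigl(y_i - \beta_0 - \sum_{j=1}^{p} f_j(x_{ij})\Bigr)^{2} \;+\; \sum_{j=1}^{p} \mu_j \int \bigl(f_j''(t)\bigr)^{2}\,dt,
\]
where each f_j is represented by splines and the parameters μ_j ≥ 0 bound the second-order terms; once the spline coefficients are collected in vectors, the penalty terms become quadratic forms, which is what permits the conic quadratic reformulation.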
Discrete Applied Mathematics | 2009
Gerhard-Wilhelm Weber; Ömür Uğur; Pakize Taylan; Aysun Tezel
An emerging research area in computational biology and biotechnology is devoted to the mathematical modeling and prediction of gene-expression patterns; to fully understand its foundations requires a mathematical study. This paper surveys and mathematically expands recent advances in modeling and prediction by rigorously introducing the environment and aspects of errors and uncertainty into the genetic context within the framework of matrix and interval arithmetic. Given the data from DNA microarray experiments and environmental measurements, we extract nonlinear ordinary differential equations which contain parameters that are to be determined. This is done by a generalized Chebyshev approximation and generalized semi-infinite optimization. Then, time-discretized dynamical systems are studied. By a combinatorial algorithm which constructs and follows polyhedra sequences, the region of parametric stability is detected. Finally, we analyze the topological landscape of gene-environment networks in terms of structural stability. This pioneering work is practically motivated and theoretically elaborated; it is directed towards contributing to applications concerning better health care, progress in medicine, better education and healthier living conditions.
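A hedged sketch of the type of dynamics referred to here (the concrete form used in the paper may differ): with the expression and environment levels collected in a vector E, the extracted ordinary differential equations and their time discretization can be written as
\[
\dot{E}(t) = M\bigl(E(t)\bigr)\,E(t), \qquad E^{(k+1)} = \bigl(I + h_k\, M(E^{(k)})\bigr)\,E^{(k)},
\]
where M(E) is a matrix of (possibly nonlinear) interaction terms whose unknown parameters are estimated, e.g., by Chebyshev approximation, and h_k are the step lengths of the discretization; the stability of the resulting iteration matrices over a parameter region is what the polyhedra-following algorithm examines.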
Optimization | 2008
Gerhard-Wilhelm Weber; Aysun Tezel; Pakize Taylan; Alper Soyler; Mehmet Cetin
This article contributes to a further introduction of continuous optimization into the field of computational biology, one of the most challenging and emerging areas of science, in addition to the foundations presented and the state of the art displayed in [C.A. Floudas and P.M. Pardalos, eds., Optimization in Computational Chemistry and Molecular Biology: Local and Global Approaches, Kluwer Academic Publishers, Boston, 2000]. Based on a summary of earlier works by the coauthors and their colleagues, it refines the model on gene-environment patterns by a problem from generalized semi-infinite programming (GSIP), and characterizes the condition of its structural stability. Furthermore, our paper tries to detect, understand and overcome structural frontiers of our methods applied to the recently introduced gene-environment networks. Computational biology is interdisciplinary, but it also looks for its mathematical foundations. From data obtained by DNA microarray experiments, nonlinear ordinary differential equations are extracted by the optimization of least-squares errors; then we derive corresponding time-discretized dynamical systems. Using a combinatorial algorithm with polyhedra sequences, we can detect the regions of parametric stability, contributing to a test of the goodness of fit of the model. To represent and interpret the dynamics, certain matrices, genetic networks and, more generally, gene-environment networks serve. Here, we consider n genes in possible dependence with m special environmental factors and a cumulative one. These networks are the subject of discrete mathematical questions, but they are very large structures, so we need to simplify them. This is undertaken by a careful optimization with constraints, aiming at a balanced connectedness, incorporating any type of a priori knowledge or request, and done carefully enough to be robust against disturbance by the environment. In this way, we take into account attacks on the network, knockout phenomena and catastrophes, but also changes in lifestyle and effects of education, as far as they can be approximately quantified. We characterize the structural stability of the GSIP problem against perturbations such as changes between data series or those due to outliers. We give explanations on the numerics and the use of splines. This study is an attempt to demonstrate some beauty and applicability of continuous optimization, which might one day support health care, food engineering, biomedicine and biotechnology, including elements of bioenergy and biomaterials. †Dedicated to Prof. Dr H. Th. Jongen on the occasion of his 60th birthday in admiration of his scientific work.
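For orientation, the generic shape of a generalized semi-infinite program, of the kind the gene-environment model is refined to above (generic notation, not the paper's specific instance), is
\[
\min_{x}\; f(x) \quad \text{s.t.}\quad g(x,y) \le 0 \;\;\text{for all } y \in Y(x),
\]
where, unlike in ordinary semi-infinite programming, the (typically infinite) index set Y(x) itself depends on the decision variable x; structural stability then asks whether the feasible set and the topology of the lower level sets persist under small perturbations of the defining functions.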
POWER CONTROL AND OPTIMIZATION: Proceedings of the 3rd Global Conference on Power Control and Optimization | 2010
Pakize Taylan; Gerhard-Wilhelm Weber; Lian Liu
Generalized linear models are widely used statistical techniques. As an extension, generalized partial linear models utilize semiparametric methods and augment the usual parametric terms with a single nonparametric component of a continuous covariate. In this paper, after a short introduction, we present our model in the generalized additive context, with a focus on penalized maximum likelihood and on the penalized iteratively reweighted least squares (P-IRLS) problem based on B-splines, which are attractive for nonparametric components. Then, we approach solving the P-IRLS problem using continuous optimization techniques. They have become an important complementary technology and alternative to the penalty methods, with the flexibility to choose the penalty parameter adaptively. In particular, we model and treat the constrained P-IRLS problem within the elegant framework of conic quadratic programming. This paper is of a more theoretical nature and serves as a preparation for real-world applications in the future.
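A hedged sketch of the penalized likelihood behind P-IRLS for a generalized partial linear model (symbols are illustrative): with parametric part Xβ and nonparametric part f(t) = Σ_k θ_k B_k(t) expanded in B-splines, one maximizes
\[
\ell(\beta,\theta) \;-\; \tfrac{1}{2}\,\lambda \int \bigl(f''(t)\bigr)^{2}\,dt
\;=\; \ell(\beta,\theta) \;-\; \tfrac{1}{2}\,\lambda\, \theta^{\top} K \theta,
\]
where ℓ is the log-likelihood of the generalized linear model and K is the penalty matrix of the B-spline basis; each Fisher-scoring step then reduces to a penalized weighted least-squares problem on a working response, which is the P-IRLS problem treated in the paper.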
Organizacija | 2008
Pakize Taylan; Gerhard-Wilhelm Weber
Organization in Finance Prepared by Stochastic Differential Equations with Additive and Nonlinear Models and Continuous Optimization
A central element in the organization of financial means by a person, a company or a societal group consists in the constitution, analysis and optimization of portfolios. This requires the time-dependent modeling of processes. Like many processes in nature, technology and the economy, financial processes suffer from stochastic fluctuations. Therefore, we consider stochastic differential equations (Kloeden, Platen and Schurz, 1994), since in reality, especially in the financial sector, many processes are affected by noise. As a drawback, these equations are hard to represent by a computer and hard to resolve. In our paper, we express them in a simplified manner of approximation by both a discretization and additive models based on splines. Our parameter estimation refers to the linearly involved spline coefficients, as prepared in (Taylan and Weber, 2007), and the partially nonlinearly involved probabilistic parameters. We construct a penalized residual sum of squares for this model and face the occurring nonlinearities with Gauss-Newton's and Levenberg-Marquardt's methods for determining the iteration step. We also investigate when the related minimization program can be written as a Tikhonov regularization problem (sometimes called ridge regression), and we treat it using continuous optimization techniques. In particular, we prepare access to the elegant framework of conic quadratic programming. These convex optimization problems are very well structured, herewith resembling linear programs and, hence, permitting the use of interior point methods (Nesterov and Nemirovskii, 1993).
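A hedged illustration of the discretization-plus-splines idea (the paper's exact scheme may differ): for a process X governed by a stochastic differential equation, an Euler-Maruyama step and an additive spline approximation of the drift read
\[
dX_t = a(X_t,t)\,dt + b(X_t,t)\,dW_t, \qquad
X_{k+1} \approx X_k + a(X_k,t_k)\,h_k + b(X_k,t_k)\,\Delta W_k, \qquad
a(x,t) \approx \sum_{j} f_j(x,t),
\]
with each f_j expanded in splines whose coefficients enter linearly, while the probabilistic parameters enter nonlinearly; this split is what motivates combining Tikhonov regularization with Gauss-Newton and Levenberg-Marquardt steps as described above.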
Computers & Mathematics With Applications | 2010
Pakize Taylan; Gerhard-Wilhelm Weber; Lian Liu; Fatma Yerlikaya-Özkurt
Generalized linear models are widely used statistical techniques. As an extension, generalized partial linear models utilize semiparametric methods and augment the usual parametric terms with a single nonparametric component of a continuous covariate. In this paper, after a short introduction, we present our model in the generalized additive context, with a focus on the penalized maximum likelihood and the penalized iteratively reweighted least squares (P-IRLS) problem based on B-splines, which are attractive for nonparametric components. Then, we approach solving the P-IRLS problem using continuous optimization techniques. They have come to constitute an important complementary approach, an alternative to the penalty methods, with the flexibility to choose the penalty parameter adaptively. In particular, we model and treat the constrained P-IRLS problem using the elegant framework of conic quadratic programming. The method is illustrated with a small numerical example.
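The paper's small numerical example is not reproduced here, but the following minimal sketch shows how a single constrained P-IRLS step of the kind described above can be posed as a conic quadratic program. All data, dimensions, matrices and the bound M are made up for illustration, and cvxpy is used purely as a convenient CQP front end, not as the solver used in the paper.
```python
import numpy as np
import cvxpy as cp

# Toy stand-ins: X = parametric design, B = evaluated B-spline basis,
# L = second-difference matrix acting as a curvature penalty, z = working
# response and w = weights from one IRLS step (all hypothetical).
rng = np.random.default_rng(0)
n, p, k = 50, 2, 10
X = rng.normal(size=(n, p))
B = rng.normal(size=(n, k))
L = np.diff(np.eye(k), n=2, axis=0)
z = rng.normal(size=n)
w = np.ones(n)

beta, theta, t = cp.Variable(p), cp.Variable(k), cp.Variable()
M = 10.0  # assumed bound on the smoothness term (a tuning parameter)

# Weighted residual of the working least-squares problem.
resid = cp.multiply(np.sqrt(w), z - X @ beta - B @ theta)

# Conic quadratic form: minimize t subject to two second-order cone constraints.
prob = cp.Problem(cp.Minimize(t),
                  [cp.norm(resid, 2) <= t,
                   cp.norm(L @ theta, 2) <= np.sqrt(M)])
prob.solve()
print("objective:", prob.value)
```
In a full P-IRLS loop, z and w would be recomputed from the current fit and link function at each iteration, and the bound M (or an equivalent penalty parameter) could be chosen adaptively, as the abstract indicates.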
Archive | 2011
Gerhard-Wilhelm Weber; Pakize Taylan; Z.-K. Görgülü; H. Abd. Rahman; A. Bahar
Financial processes, like processes in nature, are subject to stochastic fluctuations. Stochastic differential equations turn out to be an advantageous representation of such noisy, real-world problems, and together with their identification they play an important role in the financial sector, but also in physics and biotechnology. These equations, however, are often hard to represent and to resolve. Thus we express them in a simplified manner of approximation by discretization and additive models based on splines. This defines a trilevel problem consisting of an optimization problem and a representation problem (portfolio optimization), and a parameter estimation (Weber et al., Financial Regression and Organization, in: Special Issue on Optimization in Finance, DCDIS-B, 2010). Two types of parameter dependency, linear and nonlinear, are considered: for the first, we construct a penalized residual sum of squares and investigate the related Tikhonov regularization problem. In the nonlinear case, Gauss–Newton's method and Levenberg–Marquardt's method are employed to determine the iteration steps. Both cases are treated using continuous optimization techniques within the elegant framework of conic quadratic programming. These convex problems are well structured, hence allowing the use of efficient interior point methods. Furthermore, we present nonparametric and related methods, and give an introduction to research currently done in our groups, ending with a conclusion.
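For the nonlinear stage, a hedged reminder of the two iteration rules named above (generic notation): with residual vector r(θ) and Jacobian J = ∂r/∂θ, the Gauss–Newton and Levenberg–Marquardt updates are
\[
\theta^{(k+1)} = \theta^{(k)} - \bigl(J^{\top}J\bigr)^{-1} J^{\top} r\bigl(\theta^{(k)}\bigr),
\qquad
\theta^{(k+1)} = \theta^{(k)} - \bigl(J^{\top}J + \mu_k I\bigr)^{-1} J^{\top} r\bigl(\theta^{(k)}\bigr),
\]
where the damping parameter μ_k > 0 controls the step length; the damped normal equations have the same structure as a Tikhonov-regularized linear subproblem, which is how the linear and nonlinear cases meet within the conic quadratic framework.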
intelligent data analysis | 2014
Pakize Taylan; Fatma Yerlikaya-Özkurt; Gerhard-Wilhelm Weber
In statistical research, regression models based on data play a central role; one of these models is the linear regression model. However, this model may give misleading results when the data contain outliers. The outliers in linear regression can be handled in two stages: by using the Mean Shift Outlier Model (MSOM) and by providing a new solution for this model. First, we construct a Tikhonov regularization problem for the MSOM. Then, we treat this problem using convex optimization techniques, specifically conic quadratic programming, permitting the use of interior point methods. We present numerical examples, which reveal very good results, and we conclude with an outlook on future studies.
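A hedged sketch of the mean-shift idea (notation assumed, not taken from the paper): the MSOM augments the linear model with a shift vector γ, with one component per observation that is expected to be nonzero only at outliers, and the regularized problem penalizes that shift,
\[
y = X\beta + \gamma + \varepsilon, \qquad
\min_{\beta,\gamma}\; \lVert y - X\beta - \gamma \rVert_2^{2} + \lambda \lVert \gamma \rVert_2^{2},
\]
which is a Tikhonov problem in (β, γ) and, like the other problems in this list, admits an equivalent conic quadratic program solvable by interior point methods.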
Archive | 2019
Pakize Taylan; Gerhard-Wilhelm Weber
Multivariate adaptive regression spline (MARS) denotes a modern methodology from statistical learning which is important in both classification and regression. It is very useful for high-dimensional problems and shows great promise for fitting nonlinear multivariate functions, thanks to its ability to estimate the contributions of the basis functions so that both the additive and the interactive effects of the predictors are allowed to determine the response variable. The MARS algorithm for estimating the model function consists of two sub-algorithms. In our paper, we propose not to use the second algorithm. Instead, we construct a penalized residual sum of squares (PRSS) for MARS as a higher-order Tikhonov regularization problem, also known as ridge regression, which shrinks coefficients and makes them more stable. However, ridge regression cannot perform variable selection in the model and, hence, does not give an easily interpretable model (especially if the number of variables p is large). For this reason, we replace the Tikhonov penalty with the generalized Lasso penalty when solving the PRSS problem, gaining an advantage for feature selection. We treat this problem using continuous optimization techniques, which we consider to become an important complementary technology and model-based alternative to the concept of the backward stepwise algorithm. In particular, we apply the elegant framework of conic quadratic programming (CQP), and we call the solution CG-Lasso. Here, we gain from an area of convex optimization whose programs are very well structured, herewith resembling linear programming and, hence, permitting the use of powerful interior point methods (IPMs).
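In the same hedged notation as in the CMARS sketch above (B = basis matrix, α = coefficients, D = a structured penalty matrix; all illustrative), the switch from the Tikhonov penalty to the generalized Lasso penalty described here reads
\[
\min_{\alpha}\; \lVert y - B\alpha \rVert_2^{2} + \lambda \lVert D\alpha \rVert_2^{2}
\quad\longrightarrow\quad
\min_{\alpha}\; \lVert y - B\alpha \rVert_2^{2} + \lambda \lVert D\alpha \rVert_1;
\]
the ℓ1 penalty drives entire components of Dα to zero and thereby performs the variable selection that ridge-type shrinkage cannot, and after the standard reformulation of the ℓ1 term with auxiliary variables the problem again fits the conic quadratic programming framework.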