Bahadır Yüzbaşı
İnönü University
Publications
Featured research published by Bahadır Yüzbaşı.
Journal of Statistical Computation and Simulation | 2016
Bahadır Yüzbaşı; S. Ejaz Ahmed
ABSTRACT In this paper, we consider estimation techniques based on ridge regression when the design matrix appears to be ill-conditioned in the partially linear model using kernel smoothing. Furthermore, we consider that the coefficient vector can be partitioned as β = (β₁ᵀ, β₂ᵀ)ᵀ, where β₁ is the coefficient vector for main effects and β₂ is the vector for 'nuisance' effects. We are essentially interested in the estimation of β₁ when it is reasonable to suspect that β₂ is close to zero. We suggest ridge pretest, ridge shrinkage and ridge positive shrinkage estimators for the above semi-parametric model, and compare their performance with some penalty estimators. In particular, the suitability of estimating the nonparametric component based on the kernel smoothing basis function is also explored. A Monte Carlo simulation study is used to compare the relative efficiency of the proposed estimators, and a real data example is presented to illustrate the usefulness of the suggested methods. Moreover, the asymptotic properties of the proposed estimators are obtained.
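The ridge component underlying these estimators is the classical one; a minimal numpy sketch of the generic ridge solution (XᵀX + kI)⁻¹Xᵀy on a nearly collinear design is below. This is illustrative only, not the authors' kernel-smoothed semiparametric implementation; the function name and penalty value are hypothetical.

```python
import numpy as np

def ridge_estimate(X, y, k):
    """Generic ridge estimator (X'X + kI)^{-1} X'y.
    The penalty k > 0 stabilizes an ill-conditioned X'X."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

# Example: two nearly collinear predictors, where plain OLS is unstable
rng = np.random.default_rng(0)
x1 = rng.normal(size=100)
X = np.column_stack([x1, x1 + 1e-4 * rng.normal(size=100)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=100)
beta_ridge = ridge_estimate(X, y, k=1.0)
```

The pretest and shrinkage estimators in the paper then move between this full ridge fit and a restricted fit in which the suspected-zero coefficients are dropped.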
international journal of management science and engineering management | 2016
S. Ejaz Ahmed; Bahadır Yüzbaşı
We present efficient estimation and prediction strategies for the classical multiple regression model when the dimensions of the parameters are larger than the number of observations. These strategies are motivated by penalty estimation and Stein-type estimation procedures. More specifically, we consider the estimation of regression parameters in sparse linear models when some of the predictors may have a very weak influence on the response of interest. In a high-dimensional situation, a number of variable selection techniques exist. However, they yield different subset models and may have different numbers of predictors. Generally speaking, the least absolute shrinkage and selection operator (Lasso) approach produces an over-fitted model compared with its competitors, namely the smoothly clipped absolute deviation (SCAD) method and adaptive Lasso (aLasso). Thus, prediction based only on a submodel selected by such methods will be subject to selection bias. In order to minimize the inherited bias, we suggest combining two models to improve the estimation and prediction performance. In the context of two competing models, where one model includes more predictors than the other as a result of relatively aggressive variable selection, we investigate the relative performance of Stein-type shrinkage and penalty estimators. The shrinkage estimator significantly improves the prediction performance of submodels selected by existing Lasso-type variable selection methods. A Monte Carlo simulation study is carried out using the relative mean squared error (RMSE) criterion to appraise the performance of the listed estimators. The proposed strategy is applied to the analysis of several real high-dimensional data sets.
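The combining idea admits a generic textbook sketch: shrink the larger (overfitted) submodel estimator toward the smaller (underfitted) one by a factor driven by a distance statistic between the two fits. The sketch below is a hedged illustration under that textbook form; the function and argument names are hypothetical and do not reproduce the paper's exact statistics.

```python
import numpy as np

def stein_shrinkage(beta_over, beta_under, test_stat, p2):
    """Stein-type shrinkage estimator: pull the overfitted-model
    estimator toward the underfitted one. test_stat is a distance
    statistic between the two fits; p2 is the number of predictors
    the smaller model drops (requires p2 > 2)."""
    shrink = 1.0 - (p2 - 2) / test_stat
    return beta_under + shrink * (beta_over - beta_under)

def positive_part_shrinkage(beta_over, beta_under, test_stat, p2):
    """Positive-part variant: the shrinkage factor is truncated at
    zero, so the estimator never overshoots past the submodel fit."""
    shrink = max(1.0 - (p2 - 2) / test_stat, 0.0)
    return beta_under + shrink * (beta_over - beta_under)
```

When the distance statistic is large (the dropped predictors matter), the factor approaches one and the overfitted estimator dominates; when it is small, the estimator borrows heavily from the parsimonious submodel.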
Archive | 2015
Bahadır Yüzbaşı; S. Ejaz Ahmed
In this paper, we suggest shrinkage ridge regression estimators for the multiple linear regression model and compare their performance with some penalty estimators, namely the lasso, adaptive lasso and SCAD. Monte Carlo studies are conducted to compare the estimators, and a real data example is presented to illustrate the usefulness of the suggested methods.
international conference on management science and engineering | 2018
Yasin Asar; S. Ejaz Ahmed; Bahadır Yüzbaşı
When there is an excess amount of zeros and over-dispersion in the dependent variable, zero-inflated Poisson regression is usually used to model the data. In many situations, researchers encounter near linear dependencies among the explanatory variables, which leads to the collinearity problem. Therefore, we propose to use a Liu-type estimator to overcome this problem. We compare our method to the well-known ridge estimator via a Monte Carlo simulation study and real data examples. According to the results, our method is a better alternative in the presence of collinearity.
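As background, the classical Liu estimator for the linear model conveys the idea the paper adapts to zero-inflated Poisson regression: blend the least-squares fit with a ridge-like stabilization through a parameter 0 < d < 1. The numpy sketch below shows only this linear-model form, not the proposed zero-inflated Poisson estimator.

```python
import numpy as np

def liu_estimate(X, y, d):
    """Classical Liu estimator for the linear model:
    (X'X + I)^{-1} (X'X + d*I) beta_OLS, with 0 < d < 1.
    At d = 1 it reduces to ordinary least squares; smaller d
    shrinks the estimate more, countering collinearity."""
    p = X.shape[1]
    G = X.T @ X
    beta_ols = np.linalg.solve(G, X.T @ y)
    return np.linalg.solve(G + np.eye(p), (G + d * np.eye(p)) @ beta_ols)

# Example on a small well-conditioned design
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, -1.0, 0.5]) + rng.normal(size=50)
beta_liu = liu_estimate(X, y, d=0.5)
```

A Liu-type estimator for a generalized linear model replaces X'X with the weighted information matrix from the iteratively reweighted fit, which is the setting of the paper.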
Archive | 2017
Syed Ejaz Ahmed; Bahadır Yüzbaşı
We consider efficient prediction in sparse high-dimensional data. In high-dimensional settings where d ≫ n, many penalized regularization strategies are suggested for simultaneous variable selection and estimation. However, different strategies yield different submodels with dᵢ < n, where dᵢ represents the number of predictors included in the ith submodel. Some procedures may select a submodel with a larger number of predictors than others. Due to the trade-off between model complexity and model prediction accuracy, the statistical inference of model selection becomes extremely important and challenging in high-dimensional data analysis. For this reason we suggest shrinkage and pretest strategies to improve the prediction performance of two selected submodels. Such a pretest and shrinkage strategy is constructed by shrinking an overfitted model estimator in the direction of an underfitted model estimator. The numerical studies indicate that our post-selection pretest and shrinkage strategies improve the prediction performance of the selected submodels.
Journal for The Study of Religions and Ideologies | 2015
Serkan Benk; Robert W. McGee; Bahadır Yüzbaşı
Economic Modelling | 2013
Selim Kayhan; Tayfur Bayat; Bahadır Yüzbaşı
Statistical Papers | 2017
Bahadır Yüzbaşı; S. Ejaz Ahmed; Dursun Aydin
Thermal Science | 2017
Bahadır Yüzbaşı; Yasin Asar; M Samil Sik; Ahmet Demiralp
Religion | 2016
Serkan Benk; Tamer Budak; Bahadır Yüzbaşı; Raihana Mohdali