Jacob Zahavi
Tel Aviv University
Publications
Featured research published by Jacob Zahavi.
Journal of Direct Marketing | 1997
Jacob Zahavi; Nissan Levin
Database marketing uses the power of data and information technology in the pursuit of personalized marketing of products and services to consumers, based on their preferences and needs. We explore the feasibility of using neural computing as a means of targeting audiences for promotion through the mail, from among a list of customers in a database, either as an alternative or as a supplement to discrete-choice logistic regression models. Detailed numerical examples involving realistic data are used throughout to support the analysis and demonstrate the results. It is shown that, at least for the data used in this study, the fit achieved by both methods is approximately the same, but the process of configuring and setting up a neural network for a database marketing application is not straightforward and may require extensive experimentation and computer resources. The results are therefore not encouraging for the neural net approach.
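A minimal sketch of the kind of comparison the abstract describes: a discrete-choice logistic regression versus a small feed-forward neural network for predicting mail-response probability. The data, features, and model settings below are illustrative assumptions, not the study's actual setup.

```python
# Illustrative only: logistic regression vs. a small neural network for binary
# mail-response targeting on synthetic customer data (not the study's data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 5000
# Hypothetical predictors in the spirit of RFM-style customer variables.
X = rng.normal(size=(n, 3))
logit = -2.0 + 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.3 * X[:, 2]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))          # 1 = responded to the mailing

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

lr = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
nn = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0).fit(X_tr, y_tr)

# Compare holdout discrimination; on roughly logistic data the two fits come out
# close, echoing the paper's finding that the neural net adds little here.
print("logit AUC:", roc_auc_score(y_te, lr.predict_proba(X_te)[:, 1]))
print("NN    AUC:", roc_auc_score(y_te, nn.predict_proba(X_te)[:, 1]))
```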
Journal of Direct Marketing | 1997
Jacob Zahavi; Nissan Levin
Applying a neural network (NN) to the targeting and prediction problems in target marketing poses some unique problems and difficulties unparalleled in other business applications of neural computations. We discuss several of these issues in this article, as applied to solo mailings, offer remedies to some, and discuss possible solutions to others. A numerical example, using NN backpropagation models and involving realistic data, is used to exemplify some of the resulting issues.
Journal of Interactive Marketing | 1998
Nissan Levin; Jacob Zahavi
Abstract We evaluate the performance of several predictive models for analyzing continuous response vis-à-vis the performance of a discrete-choice logistic regression model. The models were evaluated based on three measures: profitability analysis, goodness-of-fit criteria, and prediction accuracy. The evaluation was conducted on a real application involving a home equity loan campaign in the banking industry. The implications of the results for decision making are also discussed.
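A hedged illustration of the three evaluation measures named in the abstract, computed on hypothetical holdout data rather than the bank's actual campaign; the cost figure and simulated responses are assumptions.

```python
# Illustrative evaluation of a continuous-response model on synthetic holdout data:
# profitability of the implied mailing decision, goodness of fit, and accuracy.
import numpy as np

rng = np.random.default_rng(1)
n = 2000
y_true = np.maximum(rng.normal(200.0, 80.0, n), 0.0)   # actual profit per customer (assumed)
y_pred = y_true + rng.normal(0.0, 60.0, n)             # model's predicted profit (assumed)
mail_cost = 50.0                                        # assumed cost of one solicitation

# 1) Profitability: solicit only customers whose predicted profit exceeds the cost.
mail = y_pred > mail_cost
profit = (y_true[mail] - mail_cost).sum()

# 2) Goodness of fit: R^2 of predicted vs. actual response.
ss_res = ((y_true - y_pred) ** 2).sum()
ss_tot = ((y_true - y_true.mean()) ** 2).sum()
r2 = 1.0 - ss_res / ss_tot

# 3) Prediction accuracy: mean absolute error on the holdout set.
mae = np.abs(y_true - y_pred).mean()

print(f"campaign profit: {profit:,.0f}  R^2: {r2:.3f}  MAE: {mae:.1f}")
```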
The Bell Journal of Economics | 1977
Joseph Vardi; Jacob Zahavi; Benjamin Avi-Itzhak
In contrast to traditional methods, which impose capacity costs on peak customers only, it is shown that, depending upon the design criterion employed in planning the capacity expansion of the power system, off-peak marginal cost prices should also be imputed with some marginal capacity costs. Taking the loss-of-load probability (LOLP) design criterion as an example, we establish this conclusion formally and suggest an algorithm to accurately apportion marginal capacity costs to the various periods. The algorithm is then extended to power systems planned to meet a given loss-of-energy probability (LOEP) design target. Provisions to incorporate random deviations of customers' demand and maintenance requirements into the calculation process are also suggested.
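A minimal sketch of the LOLP concept underlying the abstract: LOLP in a period is the probability that available generating capacity falls short of that period's load. The unit data, loads, and especially the final allocation rule (splitting the marginal capacity cost in proportion to each period's LOLP contribution) are illustrative assumptions only, not the paper's algorithm.

```python
# Minimal LOLP sketch, not the paper's apportionment algorithm.
import itertools
import numpy as np

units = [(100.0, 0.05), (150.0, 0.08), (200.0, 0.10)]   # (capacity MW, forced-outage rate), assumed
period_loads = {"peak": 380.0, "shoulder": 300.0, "off_peak": 220.0}   # assumed loads

def lolp(load):
    """Probability that total available capacity is below the period load."""
    p = 0.0
    for states in itertools.product([0, 1], repeat=len(units)):        # 1 = unit available
        prob = np.prod([(1 - q) if s else q for (_, q), s in zip(units, states)])
        cap = sum(c for (c, _), s in zip(units, states) if s)
        if cap < load:
            p += prob
    return p

lolps = {k: lolp(v) for k, v in period_loads.items()}
total = sum(lolps.values())
marginal_capacity_cost = 60.0                            # $/kW-yr, assumed
shares = {k: marginal_capacity_cost * v / total for k, v in lolps.items()}
print("period LOLPs:", lolps)
print("illustrative capacity-cost shares by period:", shares)
```

Note that even the off-peak period carries a nonzero LOLP here, which is the intuition behind imputing some capacity cost to off-peak prices.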
Operations Research | 1983
Nissan Levin; Asher Tishler; Jacob Zahavi
We derive conditions under which the time-step, or the myopic, approach to generation capacity planning in the power industry yields solutions identical to the solution obtained by an equivalent dynamic model that views the capacity expansion program simultaneously over time. The conditions are derived for thermal power systems for which the capacity expansion program is formulated using a convex, nonlinear mathematical programming model.
Energy Economics | 1981
David Feiler; Jacob Zahavi
Abstract Marginal pricing of electricity calls for the determination of marginal energy costs over time or, equivalently, at any load level. This paper extends an earlier study, which calculated the expected demand-related marginal energy costs, to allow units to be loaded for generation in blocks. A computationally efficient procedure is developed to compute the probability that, at any specified load level, a given block will be the last block loaded. This then provides a straightforward method for calculating the expected marginal energy costs over time. Both complete forced outages and partial outages of units are treated stochastically.
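The quantity the paper computes analytically can be illustrated by brute force: simulate forced outages, load blocks in merit order, and record which block ends up marginal at a given load. The Monte Carlo sketch below is only an illustration of that quantity, not the paper's computationally efficient procedure, and every number in it is assumed.

```python
# Monte Carlo illustration of P(block j is the last block loaded) and the expected
# marginal energy cost; the paper derives these analytically. All data are assumed.
import numpy as np

rng = np.random.default_rng(2)
# (block capacity MW, marginal fuel cost $/MWh, forced-outage rate of the parent unit)
blocks = [(120.0, 20.0, 0.05), (80.0, 25.0, 0.05), (100.0, 35.0, 0.08), (90.0, 50.0, 0.10)]
load = 230.0
n_sim = 200_000

marginal_cost = np.zeros(n_sim)
last_block = np.full(n_sim, -1)
for i in range(n_sim):
    served = 0.0
    for j, (cap, cost, q) in enumerate(blocks):          # merit order: cheapest first
        if rng.random() < q:
            continue                                     # block forced out this draw
        served += cap
        if served >= load:
            last_block[i] = j                            # this block is marginal
            marginal_cost[i] = cost
            break

p_last = [(last_block == j).mean() for j in range(len(blocks))]
print("P(block j is the last block loaded):", np.round(p_last, 3))
print("expected marginal energy cost ($/MWh):", marginal_cost[last_block >= 0].mean())
```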
Journal of Direct Marketing | 1996
Nissan Levin; Jacob Zahavi
Abstract Various methods for calculating the regression-to-the-mean (RTM) effect in segmentation analysis, based on the results of a test mailing, are compared, distinguishing between the cases of no prior, non-parametric, and parametric knowledge of the distribution of segment response rates across the list. The advantages and disadvantages of each method and their implications for decision making are discussed.
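One common parametric way to correct test-mailing segment rates for regression to the mean is empirical-Bayes shrinkage toward the overall list rate under a beta-binomial model. The sketch below shows that idea on hypothetical figures; it is offered as an illustration of the general technique, not necessarily one of the specific estimators compared in the paper.

```python
# Empirical-Bayes (beta-binomial) shrinkage of segment test response rates toward
# the list mean, as one parametric RTM correction. All figures are hypothetical.
import numpy as np

mailed    = np.array([400, 400, 400, 400, 400])          # test pieces per segment
responses = np.array([36, 24, 18, 12, 6])                # test responses per segment
raw_rates = responses / mailed

# Method-of-moments fit of a Beta(a, b) prior across segments.
m, v = raw_rates.mean(), raw_rates.var()
common = m * (1 - m) / v - 1
a, b = m * common, (1 - m) * common

# Posterior-mean (shrunken) response rate per segment: extreme segments are pulled
# toward the list mean, which is exactly the RTM adjustment in rollout forecasts.
shrunk = (responses + a) / (mailed + a + b)
print("raw rates:   ", np.round(raw_rates, 4))
print("shrunk rates:", np.round(shrunk, 4))
```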
IEEE Transactions on Power Systems | 1989
Michael Ganor; Jacob Zahavi
Unit reliability measures are evaluated for power plants facing partial outages and variable demand. The problem is formulated as an embedded Markov chain, and several reliability criteria are developed for a three-state unit with generally distributed demand periods and exponentially distributed operation and repair times. The model developed is applied to a sample unit using realistic data to estimate the demand, no-demand, up-time, and down-time distributions.
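A simplified sketch of the three-state unit idea: the version below is fully Markovian (all transition times exponential), unlike the paper's embedded-chain formulation with generally distributed demand periods, and all transition rates are assumed for illustration.

```python
# Simplified three-state unit (0 = full capacity, 1 = derated, 2 = forced out) as a
# continuous-time Markov chain; an illustration, not the paper's semi-Markov model.
import numpy as np

Q = np.array([
    [-0.012,  0.010,  0.002],    # up -> derated, up -> down (rates per hour, assumed)
    [ 0.040, -0.060,  0.020],    # derated -> up, derated -> down
    [ 0.050,  0.000, -0.050],    # down -> up (repair)
])

# Steady-state distribution pi solves pi Q = 0 with the probabilities summing to 1.
A = np.vstack([Q.T, np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print("steady-state probabilities (up, derated, down):", np.round(pi, 4))
print("equivalent availability (half credit for derating):", round(pi[0] + 0.5 * pi[1], 4))
```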
Energy Economics | 1987
Nissan Levin; Jacob Zahavi
Abstract In this paper we explicitly account for demand uncertainty in partial equilibrium models of the electricity sector, extending previous work in this area to allow for diverse technologies and temporal fluctuations of demand. Two approaches are discussed. In the first, prices are set ex ante, prior to the resolution of demand uncertainty, to maximize the expected social benefits; in the second, prices are set ex post, after demand uncertainty is resolved, to clear the market rather than rationing or leaving excess capacity idle.
The Data Mining and Knowledge Discovery Handbook | 2005
Nissan Levin; Jacob Zahavi
The winner of the KDD Cup in 1997 and 1998, GainSmarts is one of the leading data mining software packages. GainSmarts encompasses the entire range of the KDD process, including data import, exploratory data analysis, sampling, feature selection, modeling, knowledge evaluation, scoring, decision making, and reporting. GainSmarts is best known for its feature selection process, which employs a rule-based expert system to automatically select the most influential predictors from a much larger set of potential predictors. The modeling suite is particularly rich and includes a variety of predictive models (binary, multinomial, continuous, and even survival analysis), as well as clustering and collaborative filtering models. The output reports are presented in both tabular and visual forms; some are also available in Excel format, allowing the user to manipulate the results and conduct sensitivity analyses. Economic criteria are embedded in the decision-making stage to drive decisions. GainSmarts was developed in SAS, with the CPU-intensive routines written in C. It is a multilingual system currently available in English, Japanese, and German. While developed with a marketing slant, GainSmarts' generic nature makes it applicable to a variety of applications in diverse industries.