
Publication


Featured research published by Irad Ben-Gal.


Technometrics | 2003

Context-Based Statistical Process Control

Irad Ben-Gal; Gail Morag; Armin Shmilovici

Most statistical process control (SPC) methods are not suitable for monitoring nonlinear and state-dependent processes. This article introduces the context-based SPC (CSPC) methodology for state-dependent data generated by a finite-memory source. The key idea of the CSPC is to monitor the statistical attributes of a process by comparing two context trees at any monitoring period of time. The first is a reference tree that represents the “in control” reference behavior of the process; the second is a monitored tree, generated periodically from a sample of sequenced observations, that represents the behavior of the process at that period. The Kullback–Leibler (KL) statistic is used to measure the relative “distance” between these two trees, and an analytic distribution of this statistic is derived. Monitoring the KL statistic indicates whether there has been any significant change in the process that requires intervention. An example of buffer-level monitoring in a production system demonstrates the viability of the new method with respect to conventional methods.
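The core comparison behind CSPC can be illustrated with a simplified sketch: estimate empirical symbol distributions from a reference ("in control") sample and a monitored sample, then compute the KL statistic between them. This is only a stand-in for the full method, which compares context trees over variable-length contexts; the sequences below are hypothetical.

```python
import math
from collections import Counter

def empirical_dist(seq):
    """Empirical symbol distribution of a sequence."""
    n = len(seq)
    return {s: c / n for s, c in Counter(seq).items()}

def kl_divergence(p, q, eps=1e-9):
    """Kullback-Leibler divergence D(p || q) between two discrete distributions."""
    return sum(pi * math.log((pi + eps) / (q.get(s, 0.0) + eps))
               for s, pi in p.items())

# Hypothetical reference vs. monitored samples of a two-symbol process
reference = "aabab" * 40           # stable pattern
monitored = "abbbb" * 40           # shifted pattern
p = empirical_dist(reference)
q = empirical_dist(monitored)
stat = kl_divergence(q, p)         # a large value signals a process change
```

In the actual CSPC method the monitored statistic is compared against control limits derived from the analytic distribution of the KL statistic, rather than eyeballed.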


IIE Transactions | 2002

The ergonomic design of workstations using virtual manufacturing and response surface methodology

Irad Ben-Gal; Joseph Bukchin

The increasing use of computerized tools for virtual manufacturing in workstation design has two main advantages over traditional methods: first, it enables the designer to examine a large number of design solutions; and second, simulation of the work task may be performed in order to obtain the values of various performance measures. In this paper a new structural methodology for workstation design is presented. Factorial experiments and the response surface methodology are integrated in order to reduce the number of examined design solutions and obtain an estimate for the best design configuration with respect to multi-objective requirements.


IIE Transactions | 2006

Designing experiments for robust-optimization problems: the V_s-optimality criterion

Hilla Ginsburg; Irad Ben-Gal

We suggest an experimentation strategy for the robust design of empirically fitted models. The suggested approach is used to design experiments that minimize the variance of the optimal robust solution. The new design-of-experiment optimality criterion, termed V_s-optimal, prioritizes the estimation of a model's coefficients such that the variance of the optimal solution is minimized by the performed experiments. It is discussed how the proposed criterion is related to known optimality criteria. We present an analytical formulation of the suggested approach for linear models and a numerical procedure for higher-order or nonpolynomial models. In comparison with conventional robust-design methods, our approach provides more information on the robust solution by numerically generating its multidimensional distribution. Moreover, in a case study, the proposed approach results in a better robust solution than these standard methods.


IEEE Transactions on Reliability | 2005

On the use of data compression measures to analyze robust designs

Irad Ben-Gal

In this paper, we suggest a potential use of data compression measures, such as entropy and Huffman coding, to assess the effects of noise factors on the reliability of tested systems. In particular, we extend the Taguchi method for robust design by computing the entropy of the percent-contribution values of the noise factors. The new measures are computed as early as the parameter-design stage and, together with the traditional S/N ratios, enable the specification of a robust design. Assuming that (some of) the noise factors should be neutralized, the entropy of a design reflects the potential effort that will be required in the tolerance-design stage to reach a more reliable system. Using a small example, we illustrate the contribution of the new measure, which might alter the designer's decision in comparison with the traditional Taguchi method and ultimately yield a system with a lower quality loss. Assuming that the percent-contribution values can reflect the probability of a noise factor to trigger a disturbance in the system response, a series of probabilistic algorithms can be applied to the robust design problem. We focus on the Huffman coding algorithm and show how to implement it such that the designer obtains the minimal expected number of tests needed to find the disturbing noise factor. The entropy measure, in this case, provides the lower bound on the algorithm's performance.
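The two measures can be sketched side by side: the entropy of the percent-contribution values gives a lower bound on the expected number of tests, and a Huffman tree over those values gives an achievable testing order. The contribution values below are hypothetical illustration numbers, not data from the paper.

```python
import heapq
import math

def entropy_bits(probs):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_expected_tests(probs):
    """Expected number of tests to isolate the disturbing noise factor when
    tests follow a Huffman tree built over the factors' probabilities.
    Uses the identity: expected depth = sum of all merged-node probabilities."""
    heap = list(probs)
    heapq.heapify(heap)
    expected = 0.0
    while len(heap) > 1:
        p1 = heapq.heappop(heap)
        p2 = heapq.heappop(heap)
        expected += p1 + p2          # each merge adds one level of tree depth
        heapq.heappush(heap, p1 + p2)
    return expected

# Hypothetical percent-contribution values of four noise factors
contrib = [0.50, 0.25, 0.15, 0.10]
h = entropy_bits(contrib)            # lower bound on the expected number of tests
e = huffman_expected_tests(contrib)  # achievable with Huffman-ordered testing
```

The classical source-coding bound h <= e < h + 1 holds here, matching the abstract's claim that entropy lower-bounds the algorithm's performance.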


IIE Transactions | 2005

Economic optimization of off-line inspection in a process subject to failure and recovery

Alexander Finkelshtein; Yale T. Herer; Tzvi Raz; Irad Ben-Gal

In certain types of processes, verification of the quality of the output units is possible only after the entire batch has been processed. We develop a model that prescribes which units should be inspected and how the units that were not inspected should be disposed of, in order to minimize the expected sum of inspection costs and disposition error costs, for processes that are subject to random failure and recovery. The model is based on a dynamic programming algorithm that has a low computational complexity. The study also includes a sensitivity analysis under a variety of cost and probability scenarios, supplemented by an analysis of the smallest batch that requires inspection, the expected number of inspections, and the performance of an easy-to-implement heuristic.


IIE Transactions | 2008

Robust eco-design: A new application for air quality engineering

Irad Ben-Gal; Roni Katz; Yossi Bukchin

The method of robust design has long been used for the design of systems that are insensitive to noises. In this paper it is demonstrated how this approach can be used to obtain a robust eco-design (ecological design). In a case study, robust design principles are applied to the design of a factory smokestack, using the Gaussian Plume Model (GPM). The GPM is a well-known model for describing pollutant dispersal from a point source, subject to various atmospheric conditions. In this research, the mean-square-error (MSE) of the accumulated and the maximum pollution values around a given target are defined as the performance measures and used to adjust the design parameters. Both analytical and numerical approaches are used to evaluate the MSE measures over the design space. It is demonstrated how to use the non-linearity in the GPM to reach a low MSE value that produces a cheaper design configuration. The differences between the manufacturer viewpoint and the environmentalist viewpoint with respect to the considered eco-design problem are discussed and analyzed.
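The GPM evaluation can be sketched as follows. The ground-level concentration formula is the standard Gaussian plume expression, but the linear dispersion coefficients, emission rate, stack heights, and wind-speed scenarios are hypothetical placeholders rather than values from the case study.

```python
import math

def ground_level_concentration(x, y, Q, u, H, a=0.08, b=0.06):
    """Ground-level Gaussian plume concentration at downwind distance x and
    crosswind offset y, for emission rate Q, wind speed u, and effective
    stack height H. sigma_y and sigma_z grow linearly with x here, as a
    simple stand-in for stability-class dispersion curves."""
    sigma_y, sigma_z = a * x, b * x
    return (Q / (math.pi * u * sigma_y * sigma_z)
            * math.exp(-y ** 2 / (2 * sigma_y ** 2))
            * math.exp(-H ** 2 / (2 * sigma_z ** 2)))

def mse_around_target(H, target, wind_speeds, x=500.0, Q=100.0):
    """Mean-square-error of the concentration around a target value, with
    wind speed treated as the noise factor (a hypothetical noise model)."""
    values = [ground_level_concentration(x, 0.0, Q, u, H) for u in wind_speeds]
    return sum((v - target) ** 2 for v in values) / len(values)

winds = [2.0, 4.0, 6.0, 8.0]          # hypothetical wind-speed scenarios
mse_low = mse_around_target(H=40.0, target=0.0, wind_speeds=winds)
mse_high = mse_around_target(H=80.0, target=0.0, wind_speeds=winds)
```

Sweeping such an MSE measure over candidate design parameters (stack height, location) is the kind of design-space evaluation the abstract describes, though the paper also exploits the non-linearity of the GPM analytically.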


IIE Transactions | 2002

Sequential DOE via dynamic programming

Irad Ben-Gal; Michael C. Caramanis

The paper considers a sequential Design Of Experiments (DOE) scheme. Our objective is to maximize both information and economic measures over a feasible set of experiments. Optimal DOE strategies are developed by introducing information criteria based on measures adopted from information theory. The evolution of acquired information along various stages of experimentation is analyzed for linear models with a Gaussian noise term. We show that for particular cases, although the amount of information is unbounded, the desired rate of acquiring information decreases with the number of experiments. This observation implies that at a certain point in time it is no longer efficient to continue experimenting. Accordingly, we investigate methods of stochastic dynamic programming under imperfect state information as appropriate means to obtain optimal experimentation policies. We propose cost-to-go functions that model the trade-off between the cost of additional experiments and the benefit of incremental information. We formulate a general stochastic dynamic programming framework for design of experiments and illustrate it by analytic and numerical implementation examples.
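The diminishing-rate observation can be illustrated for a scalar coefficient of a linear Gaussian model: cumulative information about the coefficient grows without bound, yet each additional experiment contributes less than the previous one. The prior and noise variances below are arbitrary illustration values, not quantities from the paper.

```python
import math

def cumulative_information(n, prior_var=1.0, noise_var=1.0):
    """Information (nats) gained about a scalar coefficient after n
    unit-design experiments: half the log ratio of prior to posterior
    variance, under a Gaussian prior and Gaussian observation noise."""
    posterior_var = 1.0 / (1.0 / prior_var + n / noise_var)
    return 0.5 * math.log(prior_var / posterior_var)

gains = [cumulative_information(n) for n in range(1, 7)]
increments = [b - a for a, b in zip([0.0] + gains[:-1], gains)]
# gains keep growing (unbounded, like 0.5*log(1+n)),
# but increments shrink: each extra run buys less information
```

This shrinking increment is exactly the trade-off a cost-to-go function can capture: once the marginal information gain falls below the marginal experiment cost, stopping is optimal.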


Quality Technology and Quantitative Management | 2014

Efficient Construction of Decision Trees by the Dual Information Distance Method

Irad Ben-Gal; Alexandra Dana; Niv Shkolnik; Gonen Singer

The construction of efficient decision and classification trees is a fundamental task in Big Data analytics that is known to be NP-hard. Accordingly, many greedy heuristics have been suggested for the construction of decision trees, but were found to result in local-optimum solutions. In this work we present the dual information distance (DID) method for efficient construction of decision trees that is computationally attractive, yet relatively robust to noise. The DID heuristic selects features by considering both their immediate contribution to the classification and their future potential effects. It represents the construction of classification trees as finding the shortest paths over a graph of partitions that are defined by the selected features. The DID method takes into account both the orthogonality between the selected partitions and the reduction of uncertainty on the class partition given the selected attributes. We show that the DID method often outperforms popular classifiers in terms of average depth and classification accuracy.
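One component of such a split score, the reduction of class uncertainty given a candidate attribute, can be sketched as a plain information-gain computation. This omits the DID method's orthogonality term and its shortest-path search, and the toy data are hypothetical.

```python
import math
from collections import Counter

def class_entropy(labels):
    """Shannon entropy (bits) of the class partition."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, feature):
    """Reduction in class uncertainty from partitioning the data on one
    feature: the uncertainty-reduction component of a split score."""
    n = len(rows)
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[feature], []).append(y)
    conditional = sum(len(g) / n * class_entropy(g) for g in groups.values())
    return class_entropy(labels) - conditional

# Toy data: feature 0 determines the class, feature 1 is uninformative
rows = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 0, 1, 1]
```

A purely greedy tree builder would rank features by this gain alone; the DID method augments it with a look-ahead distance between partitions to avoid some of the local optima the abstract mentions.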


Quality Technology and Quantitative Management | 2012

Efficient Bayesian Network Learning for System Optimization in Reliability Engineering

A. Gruber; Irad Ben-Gal

We present a new Bayesian network model that learns the behavior of an unknown system from real data and can be used for reliability engineering and optimization processes in industrial systems. The suggested approach relies on quantitative criteria for addressing the trade-off between the complexity of a learned model and its prediction accuracy. These criteria are based on measures from information theory, as they predetermine both the accuracy and the complexity of the model. We illustrate the proposed method with a classical example of system reliability engineering. Using computer experiments, we show how, in targeted Bayesian network learning, a tremendous reduction in model complexity can be accomplished while maintaining most of the essential information for optimizing the system.


IEEE Transactions on Learning Technologies | 2011

Evaluation of Telerobotic Interface Components for Teaching Robot Operation

Ofir H. Goldstain; Irad Ben-Gal; Yossi Bukchin

Remote learning has been a steadily growing field over the last two decades. The development of the Internet, as well as the increase in PCs' capabilities and bandwidth capacity, has made remote learning through the Internet a convenient learning preference, leading to a variety of new interfaces and methods. In this work, we consider a remote learning interface developed in a Computer Integrated Manufacturing (CIM) laboratory and evaluate the contribution of different interface components to the overall performance and learning ability of end users. The evaluated components are the control method of the robotic arm and the use of a three-dimensional simulation tool before and during the execution of a robotic task. An experiment is designed and executed, comparing alternative interface designs for remote learning of robotic operation. A teleoperation task was given to 120 engineering students over five semesters. The number of steps required to complete the task, the number of errors during the execution, and the improvement rate during the execution were measured and analyzed. The results provide guidelines for a better design of an interface for remote learning of robotic operation. The main contribution of this paper is the introduction of a new teaching tool for laboratories and the supplied guidelines for an efficient design of such tools.

Collaboration


Dive into Irad Ben-Gal's collaboration.

Top Co-Authors
Armin Shmilovici

Ben-Gurion University of the Negev
