Karla Hoffman
George Mason University
Publications
Featured research published by Karla Hoffman.
Operations Research | 1987
Carl M. Harris; Karla Hoffman; Patsy B. Saunders
The Internal Revenue Service (IRS) toll-free, nationwide telephone system provides prompt tax-information assistance. In 1986, the IRS processed 37.8 million calls from taxpayers at 32 answering sites. This paper documents a critical review of the IRS approach to allocating its staff and equipment. We built a simulation-based model to test various allocation policies for deploying IRS resources. The simulation study included detailed sensitivity analysis on key network variables, and showed the feasibility of modeling a typical IRS location as a multiserver loss/delay queue with retrial and reneging. The second phase of this effort therefore centered on developing a prototype probabilistic model for determining the most effective way of providing service at reasonable levels and at minimum cost. The resulting model allows the IRS to determine from tables the best configuration of people and telephone lines for any expected levels of incoming traffic. In addition, we provided flow balance analyses of the underlying feedback queues that permit the IRS to separate their caller streams into fresh and repeat callers, and thus to estimate actual demand for service.
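The staffing tables described above rest on loss-system queueing analysis. A minimal sketch of that idea is the classic Erlang-B recursion for an M/M/c/c loss system; this is a simplification of the paper's model, which also handles delay, retrial, and reneging, and the function names here are illustrative:

```python
def erlang_b(servers: int, offered_load: float) -> float:
    """Blocking probability for an M/M/c/c loss system,
    computed with the numerically stable Erlang-B recursion."""
    b = 1.0
    for c in range(1, servers + 1):
        b = (offered_load * b) / (c + offered_load * b)
    return b


def min_servers(offered_load: float, target_blocking: float) -> int:
    """Smallest server count whose blocking probability meets the target,
    i.e., one entry of a staffing table like those the paper describes."""
    c = 1
    while erlang_b(c, offered_load) > target_blocking:
        c += 1
    return c
```

Tabulating `min_servers` over a grid of offered loads yields exactly the kind of lookup table the abstract mentions, albeit for a pure loss system rather than the full loss/delay model with retrials.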
Journal of Computational and Applied Mathematics | 2000
Karla Hoffman
Our ability to solve large, important combinatorial optimization problems has improved dramatically in the past decade. The availability of reliable software, extremely fast and inexpensive hardware, and high-level languages that make the modeling of complex problems much faster has led to a much greater demand for optimization tools. This paper highlights the major breakthroughs and then describes some very exciting future opportunities. Previously, large research projects required major data collection efforts, expensive mainframes and substantial analyst manpower. Now, we can solve much larger problems on personal computers, much of the necessary data is routinely collected and tools exist to speed up both the modeling and the post-optimality analysis. With the information-technology revolution currently taking place, we now have the opportunity to have our tools embedded into supply-chain systems that determine production and distribution schedules, process-design and location-allocation decisions. These tools can be used industry-wide with only minor modifications being made by each user.
Operations Research | 2008
Martin Durbin; Karla Hoffman
We report on the application of operations research to a very complex scheduling and dispatching problem. Scheduling and dispatching are never easy, but the scheduling of concrete deliveries is particularly difficult for several reasons: (1) concrete is an extremely perishable product---it can solidify in the truck if offloading is delayed by a few hours; (2) customer orders are extremely unpredictable and volatile---orders are often canceled or drastically changed at the last minute; (3) the concrete company overbooks by as much as 20% to compensate for customer unpredictability; (4) many orders require synchronized deliveries by multiple trucks; (5) when a truck arrives at a customer site, the customer may not be ready for the delivery, or a storm may negate the ability to use the concrete; and (6) most of the travel takes place in highly congested urban areas, making travel times highly variable. To assist the dispatchers, schedulers, and order takers at this company, we designed and implemented a decision-support tool consisting of both planning and execution tools. The modules determine whether new orders should be accepted, when drivers should arrive for work, the real-time assignment of drivers to delivery loads, the dispatching of these drivers to customers and back to plants, and the scheduling of the truck loadings at the plants. For the real-time dispatching and order-taking decisions, optimization models are solved to within 1% of optimality every five minutes throughout the day. This nearly continuous reoptimization of the entire system allows quick reactions to changes. The modeling foundation is a time-space network with integer side constraints. We describe each of the models and explain how we handle imperfect data. We also detail how we overcome a variety of implementation issues. 
The success of this project can be measured, most importantly, by the fact that the tool is being ported by the parent company, Florida Rock, to each of its other ready-mix concrete companies. Second, the corporation is sufficiently convinced of its importance that they have begun promoting this methodology as a “best practice” at the World of Concrete and ConAgg industry conventions.
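The modeling foundation named above is a time-space network: nodes are (location, time) pairs and arcs represent waiting or traveling. A minimal sketch of routing over such a time-expanded graph, using Dijkstra's algorithm, is shown below; the network, locations, and costs are illustrative, and the paper's actual model adds integer side constraints that a plain shortest path does not capture:

```python
import heapq


def cheapest_path(arcs, source, target):
    """Dijkstra over a time-expanded graph.

    `arcs` maps a (location, time) node to a list of
    (next_node, cost) pairs; returns the minimum cost from
    source to target, or None if the target is unreachable."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == target:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        for nxt, cost in arcs.get(node, []):
            nd = d + cost
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(heap, (nd, nxt))
    return None


# Hypothetical tiny network: a truck can wait at the plant for free,
# then travel to the customer site; later departure is cheaper here.
example_arcs = {
    ("plant", 0): [(("plant", 1), 0.0), (("site", 2), 5.0)],
    ("plant", 1): [(("site", 3), 3.0)],
}
```

In the full system, each truck's feasible movements form such arcs, and an integer program over the whole network assigns trucks to loads; reoptimizing that model every few minutes is what lets the dispatch react to cancellations and delays.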
European Journal of Operational Research | 1986
Karla Hoffman; Carl M. Harris
As part of a continuing study of the usage of its Taxpayer Service Telephone Network, the U.S. Internal Revenue Service wished to determine more accurate methods for demand measurement. It has long been recognized that the total number of calls coming into such a busy telephone system overestimates the actual number of distinct callers. The Service had previously estimated its real demand by adding one-third (1/3) of both the number of blocked or overflow calls and the number of abandonments to the total actually answered. The thrust of this current study then was to develop an accurate statistical method for providing a more objective formula for this true demand, which turns out to be equivalent to estimating the probability of retrial by blocked and abandoned callers. The major result which has come from this effort is that the average daily retrial percentage taken across location and time of year seems to be moderately stable about a mean value of 69%, though somewhat dependent on both location and (particularly) time of year. The value is consistently higher during periods close to important filing milestones and lower otherwise. We show this to mean that, whenever a rate of 69% is used, the actual demand would be estimated by augmenting completed loads by 31% of the number of blocked and abandoned calls for the period of concern.
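The demand formula above is simple to state in code: if a fraction `retrial_rate` of blocked and abandoned calls are repeat attempts, only the remaining fraction represents distinct new callers. A minimal sketch (function and parameter names are illustrative):

```python
def estimated_demand(answered: int, blocked: int, abandoned: int,
                     retrial_rate: float = 0.69) -> float:
    """Distinct-caller demand: each blocked or abandoned call is a
    new caller only with probability (1 - retrial_rate), so completed
    loads are augmented by that fraction of the lost calls."""
    return answered + (1.0 - retrial_rate) * (blocked + abandoned)
```

With the paper's 69% retrial rate, 1,000 answered calls plus 200 blocked or abandoned calls corresponds to roughly 1,062 distinct callers, versus the 1,200 raw call attempts.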
Annals of Operations Research | 1990
Timothy L. Cannon; Karla Hoffman
We present a methodology which uses a collection of workstations connected by an Ethernet network as a parallel processor for solving large-scale linear programming problems. On the largest problems we tested, linear and super-linear speedups have been achieved. Using the “branch-and-cut” approach of Hoffman, Padberg and Rinaldi, eight workstations connected in parallel solve problems from the test set documented in the Crowder, Johnson and Padberg 1983 Operations Research article. Very inexpensive, networked workstations are now solving in minutes problems which were once considered not solvable in economically feasible times. In this peer-to-peer (as opposed to master-worker) implementation, interprocess communication was accomplished by using shared files and resource locks. Effective communication between processes was accomplished with a minimum of overhead (never more than 8% of total processing time). The implementation procedures and computational results are presented.
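The shared-file-and-lock coordination described above can be sketched with atomic file creation: a peer claims a subproblem by creating a lock file, and creation fails if another peer got there first. This is an illustrative reconstruction of the general technique, not the paper's actual implementation, and the names are hypothetical:

```python
import os


def try_claim(task_id: str, lock_dir: str) -> bool:
    """Claim a subproblem by atomically creating a lock file.

    The O_CREAT | O_EXCL flag combination makes creation fail with
    FileExistsError if a peer process has already claimed the task,
    so exactly one peer wins each race."""
    path = os.path.join(lock_dir, f"{task_id}.lock")
    try:
        fd = os.open(path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True
    except FileExistsError:
        return False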
Transportation Planning and Technology | 2008
Loan Le; George L. Donohue; Karla Hoffman; Chun-Hung Chen
In the United States, most airports do not place any limitations on airline schedules. At a few major airports, the current scheduling restrictions (mostly administrative measures) have not been sufficiently strict to avoid consistent delays and have raised debates about both the efficiency and the fairness of the allocations. Flights on US airlines arrived late more often in the first four months of 2007 than in any other year since the government began tracking delays, and flight cancellations increased 91% over 2006. With a forecast of 1.1 billion yearly air travelers within the US by 2015, airport expansion and technology enhancement alone are not enough to cope with the competition-driven scheduling practices of the airline industry. The policy legacy needs to change to be consistent with airport capacities. Our research studies how flight schedules might change if airlines were required to restrict their schedules to runway capacity. To obtain these schedules, we model a profit-seeking, single benevolent airline whose goal is to maintain current competitive prices and service as many current passengers as possible, while remaining profitable. Our case study demonstrates that at Instrument Meteorological Conditions (IMC) runway rates, the market can find profitable flight schedules that substantially reduce the average flight delay to less than 6 minutes while simultaneously satisfying virtually all of the current demand with average prices remaining unchanged. This is accomplished through significant upgauging to high-demand markets.
Decision Analysis | 2010
Karla Hoffman; Dinesh Menon
A centralized combinatorial exchange has been considered as a means to enable efficient restructuring of spectrum holdings by allowing traders to buy and sell spectrum resources. We propose a new, two-sided, multiple-round, combinatorial clock exchange mechanism that enables traders to specify reserve prices and submit consolidated bundle orders on spectrum assets to be bought and sold. Any trader may submit multiple orders over several rounds. A trading agent's order can only be executed when it matches with bids and asks from one or more other agents. Acceptable bid and ask prices are adjusted each round to decrease the spread, and the final, market-clearing trades are executed to maximize the gains from trade. This paper outlines the clock mechanism and presents a new approach for allocation of surplus among the market-clearing agents to maximize incentives for participation.
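The market-clearing step above, maximizing gains from trade, reduces in the simplest single-item, unit-order case to matching the highest bids with the lowest asks while each pair still trades at a gain. A minimal sketch of that final clearing step (the paper's mechanism additionally handles bundle orders, reserve prices, and the multi-round clock, none of which appear here):

```python
def clear_exchange(bids, asks):
    """Greedy clearing for single-item, unit orders: pair the highest
    bids with the lowest asks while bid >= ask. For this simple case
    the greedy pairing maximizes total gains from trade."""
    bids = sorted(bids, reverse=True)
    asks = sorted(asks)
    trades = []
    for b, a in zip(bids, asks):
        if b < a:
            break  # no further pair can trade at a gain
        trades.append((b, a))
    return trades
```

The total surplus of the returned trades, the sum of `bid - ask` over matched pairs, is the quantity that the paper's surplus-allocation approach then divides among the market-clearing agents.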
Annals of Operations Research | 1995
Carl M. Harris; Karla Hoffman; Leslie-Ann Yarrow
Latin hypercube sampling is often used to estimate the distribution function of a complicated function of many random variables. In so doing, it is typically necessary to choose a permutation matrix which minimizes the correlation among the cells in the hypercube layout. This problem can be formulated as a generalized, multi-dimensional assignment problem. For the two-dimensional case, we provide a polynomial algorithm. For higher dimensions, we offer effective heuristic and bounding procedures.
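The sampling scheme named above is standard: stratify each dimension into equal cells, place one sample per cell, and permute the cells independently per dimension. A minimal sketch of basic Latin hypercube sampling is shown below; note it only randomizes the permutations, whereas the paper's contribution is choosing permutations that minimize correlation across cells, which this sketch does not attempt:

```python
import random


def latin_hypercube(n_samples: int, n_dims: int, rng=random):
    """Basic Latin hypercube sample on the unit cube: each dimension's
    strata 0..n-1 are independently shuffled, and one point is drawn
    uniformly within each stratum."""
    cols = []
    for _ in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)
        # jitter the sample uniformly within its stratum
        cols.append([(s + rng.random()) / n_samples for s in strata])
    # transpose the per-dimension columns into points
    return [tuple(col[i] for col in cols) for i in range(n_samples)]
```

The defining property is that projecting the sample onto any single dimension hits every one of the `n_samples` strata exactly once; the multi-dimensional assignment problem the paper studies arises when one further wants the joint occupancy pattern across dimensions to be as uncorrelated as possible.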
integrated communications, navigation and surveillance conference | 2009
John Ferguson; Karla Hoffman; Lance Sherry; Abdul Qadar Kara
Industry strategists, government regulators, and the media have focused on addressing concerns over the performance of the air transportation system with respect to delays. One of the strategies proposed has been to limit the scheduled operations at an airport to a priori feasible capacity limits. This approach has been criticized on the basis that it would reduce the number of markets served and increase airfares.
Archive | 2006
Karla Hoffman
This paper summarizes a talk given in honor of Saul Gass’ 80th birthday celebration. The paper is modeled after Saul’s well-known book, An Illustrated Guide to Linear Programming, and presents some of the illustrations provided during that talk. In this paper, we explain why specific rules might be chosen within a general combinatorial auction framework. The purpose of such rules is to assure that the market mechanism is fair to both buyers and sellers, and that the auction ends in an efficient outcome, i.e., the goods are won by those who value them the most. The paper describes some of the issues, both computational and economic, that one faces when designing such auctions.