Luc Johannes Josephus Wismans
University of Twente
Publications
Featured research published by Luc Johannes Josephus Wismans.
Transport Reviews | 2011
Luc Johannes Josephus Wismans; Eric C. van Berkum; Michiel C.J. Bliemer
Recently, there has been growing interest in the externalities of transport in society, mainly in the context of climate and air quality, which are important when policy decisions are made. For the assessment of externalities in transport, the output of static traffic assignment models is often used in combination with so-called effect models. Owing to the rapidly increasing possibilities of using dynamic traffic assignment (DTA) models for large-scale transportation networks and the application of traffic measures, several models have already been developed to assess externalities more precisely using DTA models. Different research projects have shown a relation between traffic dynamics and externalities such as emissions of pollutants and traffic safety. This means that the assessment of external effects can be improved by using temporal information about flow, speed and density, which is the output of DTA models. In this paper, the modelling of traffic safety, emissions and noise in conjunction with DTA models is reviewed based on an extensive literature survey. This review shows that there are still gaps in knowledge in assessing traffic safety, that much research is available concerning emissions, and that, although little research has been conducted on the assessment of noise using DTA models, the methods available can be used to assess these effects. Most research so far has focused on the use of microscopic models, while mesoscopic or macroscopic models may have a high potential for improving the assessment of these effects for larger networks.
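The link between DTA output and effect models described above can be illustrated with a small sketch. The speed-dependent emission function and its coefficients below are purely hypothetical and not taken from any real effect model; the sketch only shows how per-interval flow and speed from a DTA model could feed such a function.

```python
# Minimal sketch: estimating link emissions from DTA output, assuming a
# hypothetical average-speed emission function e(v) = a + b*v + c*v**2
# (coefficients are illustrative, not taken from any real emission model).

def link_emissions(flows_veh_per_h, speeds_km_per_h, length_km,
                   interval_h=0.25, a=120.0, b=-1.5, c=0.012):
    """Sum emissions (g) over all time intervals of one link.

    flows_veh_per_h, speeds_km_per_h : per-interval DTA output for the link
    length_km                        : link length
    interval_h                       : duration of one DTA interval (hours)
    """
    total_g = 0.0
    for q, v in zip(flows_veh_per_h, speeds_km_per_h):
        vehicles = q * interval_h                 # vehicles traversing the link
        rate_g_per_veh_km = a + b * v + c * v**2  # speed-dependent emission rate
        total_g += vehicles * length_km * rate_g_per_veh_km
    return total_g

# Example: four 15-minute intervals with congestion in the middle
print(link_emissions([1800, 2000, 1200, 1600], [100, 45, 30, 90], length_km=2.5))
```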
Transportation Research Record | 2011
Luc Johannes Josephus Wismans; Eric C. van Berkum; Michiel C.J. Bliemer
The externalities of traffic are increasingly important for policy decisions related to the design of a road network. Optimization of externalities with dynamic traffic management measures influencing the supply of infrastructure is a multiobjective network design problem, which in turn is a bi-level optimization problem. The presence of conflicting objectives makes solving the optimization problem a challenge. Evolutionary multiobjective algorithms have proved successful in solving such problems. However, like all optimization methods, these are subject to the no-free-lunch theorem. Therefore, this paper compares the nondominated sorting genetic algorithm II (NSGA-II), the strength Pareto evolutionary algorithm 2 (SPEA2), and the strength Pareto evolutionary algorithm 2+ (SPEA2+) in finding a Pareto optimal solution set for this problem. Because the incorporation of traffic dynamics is important, the lower level should be solved with a dynamic traffic assignment model, which increases the required CPU time. Therefore, algorithm performance is compared within a certain budget. The approaches are compared in a numerical experiment using different metrics. The externalities optimized are noise, climate, and congestion. The results show that climate and congestion are aligned and that both are opposed to noise in the case study. On average, SPEA2+ outperforms SPEA2 on this problem on all measures used. Results of NSGA-II and SPEA2+ are inconclusive. A larger population results on average in larger space coverage, while a smaller population results in higher performance on spacing and diversity. Most performance measures are relatively insensitive to the mutation rate.
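As a purely illustrative aside, the sketch below shows the Pareto-dominance filter that NSGA-II, SPEA2 and SPEA2+ all build on: keeping only those candidate settings that are not dominated on any objective. The plan names and objective values are invented for illustration.

```python
# Minimal sketch of the Pareto-dominance filter underlying NSGA-II/SPEA2(+):
# keep only solutions not dominated by any other (all objectives minimized).

def dominates(a, b):
    """True if objective vector a dominates b (<= everywhere, < somewhere)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_set(solutions):
    """solutions: list of (dtm_setting, objective_vector) tuples."""
    front = []
    for s, f in solutions:
        if not any(dominates(g, f) for _, g in solutions if g is not f):
            front.append((s, f))
    return front

# Illustrative objective vectors: (noise, climate, congestion), all minimized
candidates = [("plan_A", (3.0, 2.0, 2.0)),
              ("plan_B", (2.0, 3.0, 3.0)),
              ("plan_C", (3.5, 2.5, 2.5))]   # dominated by plan_A
print([s for s, _ in pareto_set(candidates)])   # ['plan_A', 'plan_B']
```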
Journal of Intelligent Transportation Systems | 2014
Luc Johannes Josephus Wismans; Eric C. van Berkum; Michiel C.J. Bliemer
Optimization of externalities and accessibility using dynamic traffic management measures at a strategic level is a specific example of solving a multi-objective network design problem. Solving this optimization problem is time consuming, because heuristics such as evolutionary multi-objective algorithms are needed and solving the lower level requires solving the dynamic user equilibrium problem. Using function approximation such as response surface methods (RSM) in combination with evolutionary algorithms could accelerate the determination of the Pareto optimal set. Three algorithms in which RSM are used in different ways in combination with the Strength Pareto Evolutionary Algorithm 2+ (SPEA2+) are compared with employing SPEA2+ without these methods. The results show that the algorithms using RSM accelerate the search considerably at the start, but tend to converge more quickly, possibly to a local optimum, and therefore lose their head start. The use of function approximation is therefore mainly of interest when only a limited number of exact evaluations can be performed, or as a pre-phase in a hybrid approach.
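A minimal sketch of the surrogate idea, assuming a second-order polynomial response surface fitted with scikit-learn; the settings, objective values and screening rule below are illustrative and not the paper's implementation.

```python
# Minimal sketch of a response-surface surrogate for the expensive lower level:
# fit a second-order polynomial to (DTM setting -> objective) pairs that were
# evaluated exactly with the DTA model, then rank new candidates on the cheap
# surrogate and reserve exact evaluations for the most promising ones.
# All data below is illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

# Exactly evaluated settings (e.g. two control parameters) and one objective
X_exact = np.array([[0.2, 0.8], [0.5, 0.5], [0.8, 0.2], [0.3, 0.3], [0.7, 0.7]])
y_exact = np.array([12.4, 9.1, 11.8, 10.2, 9.6])   # e.g. total delay (veh-h)

surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surrogate.fit(X_exact, y_exact)

# Cheap screening of new candidates generated by the evolutionary algorithm
candidates = np.array([[0.4, 0.6], [0.6, 0.4], [0.9, 0.9]])
predicted = surrogate.predict(candidates)
best = candidates[np.argsort(predicted)[:2]]   # send only these to the DTA model
print(predicted, best)
```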
Journal of Intelligent Transportation Systems | 2013
Luc Johannes Josephus Wismans; Eric C. van Berkum; Michiel C.J. Bliemer
Optimization of traffic network performance using dynamic traffic management (DTM) measures can be viewed as a specific example of solving a network design problem (NDP). Decision variables are the specific settings of DTM measures. DTM measures have been identified as powerful instruments not only to increase network efficiency, but also to improve externalities. As a result, the optimization focuses not only on efficiency, but also on climate, air quality, traffic safety, and noise. These assessment criteria are determined using the output of a dynamic traffic assignment model. This results in a dynamic multi-objective NDP, which is solved as a bilevel optimization problem and yields a Pareto optimal set. This set provides valuable information for the decision-making process that would not have been available had a compensation principle been chosen in advance. Optimization of realistic cases can be used to gain knowledge about incorporating externalities as objectives when optimizing traffic systems using DTM measures. A case study for a realistic network of the city of Almelo shows that the objectives efficiency, climate, and air quality are mainly aligned and mainly opposed to traffic safety and noise. However, there is no single solution that optimizes all three aligned objectives. Based on the Pareto optimal set, the trade-offs are determined, and using cluster analysis the solutions and results are further analyzed for network segments.
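The cluster analysis mentioned above could, for instance, group Pareto-optimal solutions by their normalized objective values; the sketch below uses k-means on invented data and is not the analysis performed in the paper.

```python
# Minimal sketch: grouping Pareto-optimal solutions by normalized objective
# values with k-means, as one way to summarise trade-offs (illustrative data).
import numpy as np
from sklearn.cluster import KMeans

# Rows: Pareto solutions; columns: efficiency, climate, air quality, safety, noise
objectives = np.array([[0.90, 0.80, 0.85, 0.30, 0.20],
                       [0.85, 0.75, 0.80, 0.35, 0.25],
                       [0.40, 0.30, 0.35, 0.90, 0.80],
                       [0.45, 0.35, 0.30, 0.85, 0.90],
                       [0.60, 0.60, 0.60, 0.60, 0.60]])

# Normalize each objective to [0, 1] before clustering
mins, maxs = objectives.min(axis=0), objectives.max(axis=0)
normalized = (objectives - mins) / (maxs - mins)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(normalized)
print(labels)   # e.g. one cluster favouring efficiency/climate, one favouring safety/noise
```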
International Conference on Networking, Sensing and Control | 2011
Luc Johannes Josephus Wismans; Eric C. van Berkum; Michiel C.J. Bliemer
In traffic and transport, a significant portion of research and application focuses on single-objective optimization, although there is rarely only one objective of interest. The externalities of traffic are of increasing importance for policy decisions related to the design of a road network. The optimization of externalities using dynamic traffic management measures is a multi-objective network design problem. The presence of multiple conflicting objectives makes the optimization problem challenging to solve. Evolutionary multi-objective algorithms have proven successful in solving multi-objective optimization problems. However, like all optimization methods, these are subject to the no-free-lunch theorem. Therefore, we compare the NSGA-II, SPEA2 and SPEA2+ algorithms in order to find a Pareto optimal solution set for this optimization problem. Because CPU time is limited as a result of solving the lower level with a dynamic traffic assignment model, the performance of the algorithms is compared within a certain budget. The externalities optimized are noise, climate and accessibility. In a numerical experiment, SPEA2+ outperforms SPEA2 on all measures used. Comparing NSGA-II and SPEA2+, there is no clear evidence of one approach outperforming the other.
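One common way to compare approximation sets produced under the same evaluation budget is the spacing metric sketched below; it is not necessarily one of the exact measures used in the paper.

```python
# Minimal sketch of the "spacing" metric often used to compare Pareto fronts
# (not necessarily one of the measures used in the paper): the standard
# deviation of nearest-neighbour distances between front members; lower means
# a more evenly spread front.
import math

def spacing(front):
    """front: list of objective vectors belonging to one approximation set."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = [min(dist(p, q) for q in front if q is not p) for p in front]
    mean = sum(nearest) / len(nearest)
    return math.sqrt(sum((d - mean) ** 2 for d in nearest) / (len(nearest) - 1))

# Illustrative fronts from two algorithms run under the same evaluation budget
front_a = [(0.1, 0.9), (0.4, 0.6), (0.7, 0.3), (0.9, 0.1)]
front_b = [(0.1, 0.9), (0.15, 0.85), (0.2, 0.8), (0.9, 0.1)]
print(spacing(front_a), spacing(front_b))   # front_a is spread more evenly
```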
Transportation Research Record | 2014
Gijsbert van Eck; Ties Brands; Luc Johannes Josephus Wismans; Adam J. Pel; Rob van Nes
In the pursuit of a more sustainable transport system, governments try to stimulate multimodal trip making by facilitating smooth transfers between modes. The assessment of related multimodal policy measures requires transport models that are capable of handling the complex nature of multimodality. This complexity sets requirements for adequate modeling of multimodal travel behavior and can be categorized into three classes, related to the range and combinatorial complexity of the available alternatives, the mathematical complexity of modeling the choice between them, and the complex effect of demand–supply interactions. Classical modeling approaches typically fail to meet these requirements, and state-of-the-practice approaches fulfill them only partly. Therefore, the underlying hypothesis of this study was that the application of such models in network design implied an ill-advised decision-making process. Thus, these modeling approaches, as well as the promising state-of-the-research supernetwork approach, were conceptually compared with each other. Requirements for multimodality were constructed, and all three approaches were tested on how well they can meet these requirements. The findings of this conceptual comparison were supported by realistic examples in the real-world transport network of the Amsterdam Metropolitan Area in the Netherlands. The theoretical shortcomings of the classical and state-of-the-practice approaches were shown to indeed result in implausible predictions of multimodal travel behavior. The supernetwork approach, by contrast, proved flexible enough to describe the expected effect of supply changes on travel behavior in most situations. This study illustrates the urgency of applying sound multimodal modeling approaches in network design studies.
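The supernetwork idea can be sketched as mode-specific network layers joined by transfer links, so that a single shortest-path search returns a multimodal trip; the network, costs and the use of networkx below are illustrative assumptions, not the study's implementation.

```python
# Minimal sketch of the supernetwork idea: mode-specific layers joined by
# transfer links, so one shortest-path search can return a multimodal trip
# (e.g. car -> P+R transfer -> train -> walk). Nodes and costs are illustrative.
import networkx as nx

G = nx.DiGraph()

# Car layer (generalized cost in minutes)
G.add_edge(("car", "home"), ("car", "pr_station"), cost=20)
G.add_edge(("car", "home"), ("car", "centre"), cost=55)      # congested route

# Train layer
G.add_edge(("train", "pr_station"), ("train", "centre"), cost=15)

# Transfer link: park the car and board the train (parking + waiting time)
G.add_edge(("car", "pr_station"), ("train", "pr_station"), cost=8)

# Egress links from each mode layer to the actual destination
G.add_edge(("car", "centre"), "centre", cost=5)     # parking downtown
G.add_edge(("train", "centre"), "centre", cost=2)   # walk from the station

path = nx.shortest_path(G, ("car", "home"), "centre", weight="cost")
cost = nx.shortest_path_length(G, ("car", "home"), "centre", weight="cost")
print(path, cost)   # multimodal trip via P+R at cost 45, versus 60 by car alone
```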
Transitions towards sustainable mobility | 2011
Luc Johannes Josephus Wismans; Eric C. van Berkum; Michiel C.J. Bliemer
Traditionally, traffic problems are treated in isolation, both in terms of the location of the problem and the kind of problem. However, there is a strong correlation between these problems, so solving a traffic problem at one location may result in other problems at other locations. Congestion problems on the main network can, for example, lead to "rat-running" (through-traffic using the secondary road network to avoid this congestion), causing liveability problems. Therefore, measures to alleviate traffic problems are nowadays increasingly focussed on the network level. In addition, solutions are sought in the optimization of traffic systems, with less emphasis on expanding the infrastructure, mainly because of financial considerations and space limitations. This optimization can be achieved using traffic management measures. Traditionally, this type of optimization focuses on improving accessibility, given particular boundary conditions for traffic safety and liveability (set by law).
Transportmetrica | 2018
Luuk Brederode; Adam J. Pel; Luc Johannes Josephus Wismans; Erik de Romph; Serge P. Hoogendoorn
This paper describes the road traffic assignment model Static Traffic Assignment with Queuing (STAQ), which was developed for situations where both static (STA) and dynamic (DTA) traffic assignment models are insufficient: strategic applications on large-scale congested networks. The paper demonstrates how the model overcomes shortcomings of STA and DTA modelling approaches in the strategic context by describing its concept, methodology and solution algorithm, as well as by presenting model applications on (small) theoretical and (large) real-life networks. The STAQ model captures flow metering and spillback effects of bottlenecks as in DTA models, while its input and computational requirements are only slightly higher than those of STA models. It does so in a very tractable fashion, and attains high-precision user equilibria (relative gap < 1E-04) on large-scale networks. In light of its accuracy, robustness and accountability, the STAQ model is discussed as a viable alternative to STA and DTA modelling approaches.
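The convergence criterion mentioned above (relative gap < 1E-04) can be illustrated with one common definition of the relative duality gap; the exact definition used in STAQ may differ, so treat the sketch below as an assumption with invented inputs.

```python
# Minimal sketch of one common definition of the relative (duality) gap used
# as a convergence criterion in assignment models; the exact definition in
# STAQ may differ, so this is an assumption. All inputs are illustrative.

def relative_gap(route_flows, route_costs, od_of_route, shortest_cost_per_od):
    """route_flows/route_costs: per-route assigned flow and current cost;
    od_of_route: OD pair index of each route;
    shortest_cost_per_od: cost of the current cheapest route per OD pair."""
    total_cost = sum(f * c for f, c in zip(route_flows, route_costs))
    min_cost = sum(f * shortest_cost_per_od[od]
                   for f, od in zip(route_flows, od_of_route))
    return (total_cost - min_cost) / min_cost

# Two OD pairs, three used routes
flows = [800, 200, 500]
costs = [10.0, 10.4, 7.0]
ods = [0, 0, 1]
shortest = {0: 10.0, 1: 7.0}
print(relative_gap(flows, costs, ods, shortest))   # ~0.006; converged if < 1e-4
```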
LBS 2018: 14th International Conference on Location Based Services | 2018
Joris van den Berg; B. Köbben; Sander van der Drift; Luc Johannes Josephus Wismans
This research combines spatiotemporal traffic and population distribution data in a dynamic isochrone map. To analyze the number of people who have access to a given area or location within a given time, two spatiotemporal variations should ideally be taken into account: (1) variation in travel times, which tend to differ throughout the day as a result of changing traffic conditions, and (2) variation in the location of people, as a result of travel. Typically, accessibility research includes neither, or only the variation in travel time. Until recently, we lacked insight into where people were located throughout the day. However, as a result of new data sources such as GSM data, the opportunity arises to investigate how variation in traffic conditions and variation in people's location influence accessibility through space and time. The novelty of this research lies in the combination of spatiotemporal traffic data and spatiotemporal population distribution data presented in a dynamic isochrone web map. A case study is used for the development of this isochrone map. Users can dynamically analyze the areas and the people who can reach various home interior stores in the Netherlands within a given time, taking into account traffic conditions and the location of people throughout the day.
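A minimal sketch of the underlying computation, assuming hour-specific link travel times and hour-specific population counts per zone; the network, names and numbers are invented, and the web map itself is not reproduced here.

```python
# Minimal sketch: how many people can reach a store within a time budget, for a
# given hour, combining hour-specific link travel times with hour-specific
# population per zone. Network, times and population counts are illustrative.
import networkx as nx

def reachable_population(links, populations, store, hour, budget_min):
    """links: {(u, v): {hour: travel_time_min}}; populations: {zone: {hour: persons}}."""
    G = nx.DiGraph()
    for (u, v), times in links.items():
        G.add_edge(v, u, t=times[hour])   # reversed edges: search backwards from the store
    times_to_store = nx.single_source_dijkstra_path_length(G, store, weight="t")
    reachable = {z for z, t in times_to_store.items() if t <= budget_min}
    return sum(populations[z][hour] for z in reachable if z in populations)

links = {("A", "store"): {8: 12, 13: 8},
         ("B", "A"): {8: 20, 13: 10},
         ("B", "store"): {8: 35, 13: 25}}
populations = {"A": {8: 5000, 13: 2000}, "B": {8: 8000, 13: 12000}}

print(reachable_population(links, populations, "store", hour=8, budget_min=25))   # 5000
print(reachable_population(links, populations, "store", hour=13, budget_min=25))  # 14000
```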
Journal of Urban Technology | 2018
Luc Johannes Josephus Wismans; Rein Ahas; Karst Teunis Geurs
Transportation researchers have used GPS data loggers as a supplement to and replacement of pen-and-paper surveys since the late 1990s. The use of mobile phone data in transportation studies is more recent; early studies go back a decade or so (e.g., Caceres et al., 2007; González et al., 2008). However, the use of mobile phone data is increasing rapidly. GSM and GPS data generated by phones are used for analyses varying from determining the average speed at certain road sections to gathering revealed preference data of travelers regarding their travel behavior (e.g., mode and route choice). The majority of currently used tracking data is related to mobile phones, as the majority of the population uses one and mobile network coverage is extensive in most countries. Such mobile positioning data features much better geographical and temporal coverage than traditional surveys and counters. Traditional travel diaries and surveys cover only a few days or weeks and provide information about locations related to certain activities. Earlier research showed that traditional travel diary data inherently underreported mobility (see Schönfelder and Axhausen, 2010, for an overview), as short-distance and infrequent trips were not reported. Passive GPS and GSM data avoid this bias, cover all of one's activities and can span months or even years. The majority of such databases cover significant portions of the population over long time periods. However, these geospatial tracking data also have several shortcomings. For example, typical CDR (Call Detail Record) data consist of sparsely and irregularly collected points, which are not sufficient for determining the route, speed, and transportation mode. Smartphone-based data contain more information; however, the samples are smaller and the studies cost more. Dedicated smartphone apps can form an alternative data collection method to replace or assist traditional trip diaries. Dedicated apps are also becoming increasingly intelligent in providing user-specific travel advice and feedback, using data science techniques. This opens up opportunities for new research methods in which travelers can be asked about underlying motives and/or personalized incentives can be given while their choices are monitored. Mobile phone-based data collection has developed quickly and, despite the shortcomings involved, it is used increasingly in science and practice. The trends here are similar to those of online surveys in sociology: even if traditional data are better and more reliable for the end-user, an increasing number of end-users require new and lower-quality data.
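As an illustration of the kind of derivation that dense GPS traces support but sparse CDR points do not, the sketch below estimates segment speeds from consecutive fixes with the haversine formula; the coordinates and timestamps are invented.

```python
# Minimal sketch: segment speeds from consecutive GPS fixes via the haversine
# distance, the kind of derivation that sparse CDR points cannot support.
# Coordinates and timestamps are illustrative.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two WGS84 points."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def segment_speeds(fixes):
    """fixes: list of (timestamp_s, lat, lon); returns speeds in km/h."""
    speeds = []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(fixes, fixes[1:]):
        d = haversine_km(la0, lo0, la1, lo1)
        speeds.append(d / ((t1 - t0) / 3600.0))
    return speeds

fixes = [(0, 52.2215, 6.8937), (60, 52.2260, 6.8960), (120, 52.2305, 6.8985)]
print(segment_speeds(fixes))   # roughly 30 km/h per one-minute segment
```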