Network


Latest external collaborations at the country level. Dive into details by clicking on the dots.

Hotspot


Dive into the research topics where Martin Nowack is active.

Publication


Featured research published by Martin Nowack.


European Conference on Computer Systems | 2010

Evaluation of AMD's advanced synchronization facility within a complete transactional memory stack

Dave Christie; Jaewoong Chung; Stephan Diestelhorst; Michael P. Hohmuth; Martin T. Pohlack; Christof Fetzer; Martin Nowack; Torvald Riegel; Pascal Felber; Patrick Marlier; Etienne Rivière

AMD's Advanced Synchronization Facility (ASF) is an x86 instruction set extension proposal intended to simplify and speed up the synchronization of concurrent programs. In this paper, we report our experiences using ASF for implementing transactional memory. We have extended a C/C++ compiler to support language-level transactions and generate code that takes advantage of ASF. We use a software fallback mechanism for transactions that cannot be committed within ASF (e.g., because of hardware capacity limitations). Our evaluation uses a cycle-accurate x86 simulator that we have extended with ASF support. Building a complete ASF-based software stack allows us to evaluate the performance gains that a user-level program can obtain from ASF. Our measurements on a wide range of benchmarks indicate that the overheads traditionally associated with software transactional memories can be significantly reduced with the help of ASF.
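ASF never shipped in production hardware and the paper's compiler-generated code is not reproduced here, but the hybrid pattern the abstract describes, trying a capacity-limited hardware transaction first and falling back to a software path when it cannot commit, can be illustrated with a hypothetical Python sketch (all names are invented for the illustration):

```python
import threading

# Hypothetical stand-in for a hardware TM attempt: it succeeds only if the
# write set fits the (small) hardware capacity, mirroring ASF's limits.
HW_CAPACITY = 4

class CapacityAbort(Exception):
    pass

_fallback_lock = threading.Lock()  # software fallback: a global commit lock

def hw_transaction(writes, memory):
    """Best-effort 'hardware' commit; aborts if the write set is too large."""
    if len(writes) > HW_CAPACITY:
        raise CapacityAbort
    memory.update(writes)

def atomic_update(writes, memory):
    """Try the fast hardware path first, then the software fallback."""
    try:
        hw_transaction(writes, memory)
        return "hw"
    except CapacityAbort:
        with _fallback_lock:  # serialized, but always succeeds
            memory.update(writes)
        return "sw"

memory = {}
small = {f"a{i}": i for i in range(3)}    # fits in hardware capacity
large = {f"b{i}": i for i in range(10)}   # exceeds capacity -> fallback
print(atomic_update(small, memory))  # hw
print(atomic_update(large, memory))  # sw
```

The key property, shared with the real ASF stack, is that the fallback path is functionally equivalent but slower, so capacity aborts cost performance rather than correctness.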


International Symposium on Microarchitecture | 2010

The Velox Transactional Memory Stack

Pascal Felber; E Rivière; W M Moreira; Derin Harmanci; Patrick Marlier; Stephan Diestelhorst; Michael P. Hohmuth; Martin T. Pohlack; Adrian Cristal; I Hur; Osman S. Unsal; P Stenström; A Dragojevic; Rachid Guerraoui; M Kapalka; Vincent Gramoli; U Drepper; S Tomić; Yehuda Afek; Guy Korland; Nir Shavit; Christof Fetzer; Martin Nowack; Torvald Riegel

The adoption of multi- and many-core architectures for mainstream computing undoubtedly brings profound changes in the way software is developed. In particular, the use of fine-grained locking as the multi-core programmer's coordination methodology is considered by more and more experts as a dead-end. The transactional memory (TM) programming paradigm is a strong contender to become the approach of choice for replacing locks and implementing atomic operations in concurrent programming. Combining sequences of concurrent operations into atomic transactions allows a great reduction in the complexity of both programming and verification, by making parts of the code appear to execute sequentially without the need to program using fine-grained locking. Transactions remove from the programmer the burden of figuring out the interactions among concurrent operations that happen to conflict when accessing the same locations in memory. The EU-funded FP7 VELOX project designs, implements, and evaluates an integrated TM stack, spanning from the programming language down to hardware support, and including runtimes and libraries, compilers, and application environments. This paper presents an overview of the VELOX TM stack and its associated challenges and contributions.
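The contrast the abstract draws, an atomic region versus hand-managed fine-grained locking, can be sketched with a toy Python "transaction" that is simply a single runtime-owned lock (a deliberate simplification; a real TM runtime detects conflicts optimistically):

```python
import threading

# With fine-grained locking, a transfer must acquire both account locks in a
# global order to avoid deadlock. With TM, the programmer only marks the
# region atomic; here a single runtime lock stands in for the TM machinery.
_tm_lock = threading.Lock()

class Account:
    def __init__(self, balance):
        self.balance = balance

def transfer(src, dst, amount):
    # The programmer states *what* must be atomic, not *how* to order locks.
    with _tm_lock:
        src.balance -= amount
        dst.balance += amount

a, b = Account(100), Account(0)
threads = [threading.Thread(target=transfer, args=(a, b, 1)) for _ in range(100)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(a.balance, b.balance)  # 0 100
```

The design point is the one the VELOX stack targets: the atomicity contract stays in the program, while the implementation (locks, STM, or HTM) can change underneath it.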


International Symposium on Stabilization, Safety, and Security of Distributed Systems | 2009

Speculation for Parallelizing Runtime Checks

Martin Süßkraut; Stefan Weigert; Ute Schiffel; Thomas Knauth; Martin Nowack; Diogo Becker de Brum; Christof Fetzer

We present and evaluate a framework, ParExC, to reduce the runtime penalties of compiler-generated runtime checks. An obvious approach is to use idle cores of modern multi-core CPUs to parallelize the runtime checks. This could be accomplished by (a) parallelizing the application and, in this way, implicitly parallelizing the checks, or (b) parallelizing the checks only. Parallelizing an application is rarely easy, and frameworks that simplify the parallelization, e.g., software transactional memory (STM), can introduce considerable overhead. ParExC is based on alternative (b). We compare it with an approach using a transactional memory-based alternative. Our experience shows that ParExC is not only more efficient than the STM-based solution, but the manual effort for an application developer to integrate ParExC is also lower. ParExC has, in contrast to similar frameworks, two noteworthy features that permit a more efficient parallelization of checks: (1) speculative variables, and (2) the ability to add checks by static instrumentation.
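Alternative (b), running only the checks in parallel while the application speculates ahead, can be sketched as follows; this is an illustrative Python analogy (the function names and structure are invented, not ParExC's API):

```python
import concurrent.futures

# The main thread runs speculatively ahead while each recorded runtime
# check is evaluated on a spare core; any failed check aborts the
# speculation after the fact.
def run_with_parallel_checks(steps, n_workers=4):
    """steps: list of (compute, check) pairs; compute() produces a value,
    check(value) returns True if the runtime check passes."""
    results, futures = [], []
    with concurrent.futures.ThreadPoolExecutor(n_workers) as pool:
        for compute, check in steps:
            value = compute()                          # speculate ahead
            futures.append(pool.submit(check, value))  # check runs elsewhere
            results.append(value)
        if not all(f.result() for f in futures):
            raise RuntimeError("runtime check failed; speculation aborted")
    return results

steps = [(lambda i=i: i * i, lambda v: v >= 0) for i in range(8)]
print(run_with_parallel_checks(steps))
```

The cost model this illustrates: the main thread never waits for a check unless one fails, so check latency is hidden on idle cores instead of serializing the application.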


International Journal of Life Cycle Assessment | 2012

Review and downscaling of life cycle decision support tools for the procurement of low-value products

Martin Nowack; Holger Hoppe; Edeltraud Guenther

Purpose: In this article, we analyze how environmental aspects can be derived from life cycle management instruments for procurement decisions on low-value products. For our analysis, we chose the case of operating room textiles. The review covers the life cycle management instruments applied within the textile industry: life cycle assessment, environmental labels, and management systems. We do so in order to identify the most important environmental decision criteria on the basis of which the procurer of low-value products can choose the most environmentally friendly option.

Methods: We conducted a systematic search of the relevant literature databases. We critically evaluate the identified life cycle assessment studies for sound methodology, verifiability, completeness, and currency. Based on this review, we analyze and compare the results of the three most comprehensive studies in more detail and derive the most important environmental aspects of operating room textiles. In a second step, we extend the operational perspective with the strategic perspective, namely environmental management systems and further existing life cycle instruments such as eco-labels. We then synthesize the gathered information into a decision vector. Finally, we discuss how the gathered data can be further exploited and give suggestions for a more sophisticated assessment.

Results and discussion: The review of the existing life cycle assessments of operating room textiles showed that procurers should not base their decisions exclusively on existing life cycle assessments. In addition to problems such as methodological weaknesses, incompleteness, outdated data, and poor verifiability, the information provided is far too complex to prepare procurement decisions regarding low-value products. Furthermore, the results for the textiles assessed in the existing life cycle assessments are not necessarily transferable to the textiles considered by the procurer because of restrictive assumptions. It is therefore necessary to downscale the available information and synthesize it into an applicable decision support tool. Our decision vector consists of the key environmental aspects water, CO2, energy, and waste, and is completed by environmental management systems, eco-labels, and the countries of origin, which matter for environmental and social aspects alike.

Conclusions: The decision vector supports procurers in considering environmental aspects in procurement decisions and provides a mechanism for balancing the information between overcomplexity and oversimplification. It should therefore form the basis for the future development of an eco-label for operating room textiles.


ACM Symposium on Parallel Algorithms and Architectures | 2013

Brief announcement: between all and nothing - versatile aborts in hardware transactional memory

Stephan Diestelhorst; Martin Nowack; Michael F. Spear; Christof Fetzer

Hardware Transactional Memory (HTM) implementations are becoming available in commercial, off-the-shelf components. While generally comparable, some implementations deviate from the strict all-or-nothing property of pure Transactional Memory. We analyse these deviations and find that, with small modifications, they can be used to accelerate and simplify both transactional and non-transactional programming constructs. At the heart of our extensions, we enable access to the transaction's full register state in the abort handler in an existing HTM without extending the architectural register state. Access to the full register state enables applications in both transactional and non-transactional parallel programming: hybrid transactional memory; transactional escape actions; transactional suspend/resume; and alert-on-update.
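What "full register state in the abort handler" buys can be mimicked at a high level in a toy Python model: instead of the handler seeing only the state at transaction begin, it receives the working state at the abort point. This is an invented analogy, not the hardware mechanism:

```python
# Toy transaction: on abort, the handler receives the working state at the
# abort point (the analog of the full register state), not just the state
# at transaction begin, which enables patterns such as alert-on-update.
class Abort(Exception):
    pass

def run_transaction(body, abort_handler, shared):
    working = dict(shared)             # private working copy (write buffer)
    try:
        body(working)
        shared.update(working)         # commit
        return "committed"
    except Abort:
        return abort_handler(working)  # handler sees state *at* the abort

def body(state):
    state["x"] = 42
    if state.get("alert"):
        raise Abort                    # e.g. a monitored location changed
    state["y"] = 7

shared = {"alert": True}
result = run_transaction(body, lambda st: f"aborted with x={st['x']}", shared)
print(result)           # aborted with x=42
print("y" in shared)    # False: nothing was committed
```

The memory effects are still all-or-nothing (nothing commits on abort); what is versatile is that the handler can inspect how far the transaction got and react accordingly.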


Transactional Memory | 2015

Safe Exception Handling with Transactional Memory

Pascal Felber; Christof Fetzer; Vincent Gramoli; Derin Harmanci; Martin Nowack

Exception handling is notoriously difficult for programmers, whereas transactional memory has been instrumental in simplifying concurrent programming. In this chapter, we describe how transactional syntactic sugar simplifies exception handling problems when writing both sequential and concurrent applications. We survey exception handling solutions that prevent applications from reaching an inconsistent state in a sequential environment, on the one hand, and extend these solutions to also prevent the concurrent execution of multiple threads from reaching an inconsistent state, on the other hand. The resulting technique greatly simplifies exception handling and is shown to be surprisingly efficient.
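The sequential half of the idea, failure atomicity via transactional sugar, can be sketched with a Python context manager that rolls tracked objects back when an exception escapes the block (an illustrative analogy, not the chapter's implementation):

```python
from contextlib import contextmanager
import copy

# Transactional "syntactic sugar" for failure atomicity: if an exception
# escapes the block, every tracked object is rolled back to its state at
# the start of the transaction, so no partial update is ever observed.
@contextmanager
def transaction(*objects):
    snapshots = [copy.deepcopy(o.__dict__) for o in objects]
    try:
        yield
    except Exception:
        for o, snap in zip(objects, snapshots):
            o.__dict__.clear()
            o.__dict__.update(snap)   # roll back to the snapshot
        raise

class Order:
    def __init__(self):
        self.items, self.total = [], 0

order = Order()
try:
    with transaction(order):
        order.items.append("book")
        order.total += 10
        raise ValueError("payment failed")  # failure mid-update
except ValueError:
    pass
print(order.items, order.total)  # [] 0  - consistent state restored
```

The concurrent extension discussed in the chapter adds isolation on top of this rollback, so other threads never observe the partially updated state either.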


Haifa Verification Conference | 2015

Parallel Symbolic Execution: Merging In-Flight Requests

Martin Nowack; Katja Tietze; Christof Fetzer

The strength of symbolic execution is the systematic analysis and validation of all possible control flow paths of a program and their respective properties, which is done with the help of a solver component. Thus, it can be used for program testing in many different domains, e.g., test generation, fault discovery, information leakage detection, or energy consumption analysis. But major challenges remain, notably the huge (up to infinite) number of possible paths and the high computation costs incurred by the solver to check the satisfiability of the constraints imposed by the paths. To tackle these challenges, researchers proposed parallelizing symbolic execution by dividing the state space and handling the parts independently. Although this approach scales out well, we can further improve it with a thread-based parallelized approach. It allows us to reuse shared resources like caches more efficiently, a vital part of reducing solving costs. More importantly, this architecture enables us to use a new technique that merges parallel incoming solver requests, leveraging the incremental solving capabilities provided by modern solvers. Our results show a reduction in solver time of up to 50 % over plain multi-threaded execution.
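The merging idea can be illustrated with a toy Python model: queries from parallel workers often share a constraint prefix, so an incremental solver can process the shared prefix once and only branch for the differing suffixes. Constraints here are plain predicates over one integer variable; a real implementation would target an SMT solver's incremental (push/pop) interface, and all names are invented for the sketch:

```python
from collections import defaultdict

def solve_merged(queries, domain=range(-10, 11)):
    """queries: list of constraint lists (predicates over one variable);
    returns a SAT/UNSAT boolean per query."""
    groups = defaultdict(list)
    for i, q in enumerate(queries):
        groups[tuple(q[:-1])].append(i)   # merge queries on a shared prefix
    results = {}
    for prefix, idxs in groups.items():
        # Evaluate the shared prefix once over the domain...
        base = [v for v in domain if all(c(v) for c in prefix)]
        for i in idxs:
            # ...then check each query's own suffix incrementally.
            suffix = queries[i][-1]
            results[i] = any(suffix(v) for v in base)
    return [results[i] for i in range(len(queries))]

shared = [lambda v: v > 0, lambda v: v < 5]
queries = [shared + [lambda v: v % 2 == 0],   # SAT (v in {2, 4})
           shared + [lambda v: v > 100]]      # UNSAT
print(solve_merged(queries))  # [True, False]
```

The saving mirrors the paper's observation: the expensive shared work (here, filtering the domain through the prefix) is paid once per group rather than once per query.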


Archive | 2011

How Does Emissions Trading Influence Corporate Risk Management

Edeltraud Günther; Martin Nowack; Gabriel Weber

The purpose of this chapter is to investigate regulatory climate change risks related to emissions trading. We propose that a deeper integration of climate change risks into risk management is necessary. Therefore, we derive a six-step risk management process according to the Draft International Standard of ISO 31000. We argue that this formalized risk management standard is a useful tool for integrating climate change risks into risk management. Following this approach, we examine the interconnections between emissions trading and corporate risk management based on a case study conducted in the multinational energy company Vattenfall. We apply a content analysis of Vattenfall's publicly available risk reports as part of its annual reporting. In this chapter, we find that Vattenfall's exposure to climate change risks is high. Major climate change related risks are the electricity price risk, political risk, investment risk, and environmental risk. This work adds to the existing literature on carbon disclosure. By focusing on physical climate change risks as well as risks related to emissions trading, we propose to reduce the existing gaps in the literature.


Praxis der Wirtschaftsinformatik | 2010

Klimarisikomanagement mit dem CO2-Navigator

Edeltraud Günther; Christian Manthey; Gabriel Weber; Martin Nowack; Henry Dannenberg; Wilfried Ehrenfeld

The CO2-Navigator software applies the real options approach and the risk management process to corporate handling of climate change. It is aimed primarily at emission-intensive small and medium-sized enterprises, but is also applicable in larger companies that, for example, maintain a dedicated sustainability department. The value of the software tool lies in uniting climate strategy, the quantitative valuation of climate protection investments, and the management of emission allowances. Within a company, it can be applied in strategic management, regulatory management, energy and environmental management, technology management, and controlling. The contribution of this article lies in linking climate risk management with the real options approach and in presenting the CO2-Navigator against the background of its development in the spirit of design-oriented research.


Technological Forecasting and Social Change | 2011

Review of Delphi-Based Scenario Studies: Quality and Design Considerations

Martin Nowack; Jan Endrikat; Edeltraud Guenther

Collaboration


Dive into Martin Nowack's collaborations.

Top Co-Authors

Christof Fetzer (Dresden University of Technology)
Edeltraud Günther (Dresden University of Technology)
Pascal Felber (University of Neuchâtel)
Gabriel Weber (Dresden University of Technology)
Torvald Riegel (Dresden University of Technology)
Derin Harmanci (University of Neuchâtel)
Diogo Becker de Brum (Dresden University of Technology)