Rüdiger Lincke
Linnaeus University
Publications
Featured research published by Rüdiger Lincke.
Advanced Information Networking and Applications | 2009
Henrike Barkmann; Rüdiger Lincke; Welf Löwe
The validation of software quality metrics lacks statistical significance. One reason for this is that data collection requires considerable effort. To help solve this problem, we develop tools for metrics analysis of a large number of software projects (146 projects with approximately 70,000 classes and interfaces and over 11 million lines of code). Moreover, the validation of software quality metrics should focus on relevant metrics, i.e., correlated metrics need not be validated independently. Based on our statistical basis, we identify correlations between several metrics from well-known object-oriented metrics suites. In addition, we present early results on typical metrics values and possible thresholds.
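A minimal sketch of the kind of pairwise correlation analysis the abstract describes, assuming per-class metric values have already been extracted into a CSV table; the metric names, file name, and thresholds are illustrative assumptions, not details of the paper's tool chain:

```python
# Illustrative sketch: pairwise Spearman correlation between OO metrics.
# The metric columns, file name, and cut-offs are assumptions for this example.
import pandas as pd
from scipy.stats import spearmanr

# One row per class/interface, one column per metric (e.g., WMC, CBO, LOC).
metrics = pd.read_csv("class_metrics.csv")

columns = ["WMC", "CBO", "RFC", "LCOM", "LOC"]
for i, a in enumerate(columns):
    for b in columns[i + 1:]:
        rho, p = spearmanr(metrics[a], metrics[b])
        # Strongly correlated metrics need not be validated independently.
        if p < 0.05 and abs(rho) > 0.8:
            print(f"{a} ~ {b}: rho={rho:.2f} (p={p:.3g}) -> candidates for grouping")
```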
Software Visualization | 2005
Thomas Panas; Rüdiger Lincke; Welf Löwe
A software visualization is defined by an abstract software model, views on this model, and a mapping between them. For creating new visualizations, we configure views and their mappings online instead of hand-coding them. In this paper, we introduce an architecture that allows such online configuration and, as a proof of concept, a framework implementing this architecture. In several examples, we demonstrate the generality and flexibility of our approach.
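To illustrate the idea of configuring views and mappings instead of hand-coding them, here is a small hypothetical sketch; the model entities, visual properties, and configuration format are invented for this example and are not the paper's actual framework:

```python
# Illustrative sketch: mapping software-model entities to visual properties
# without hand-coding a new visualization. All names here are assumptions.

# Abstract software model: entities with metric values.
model = [
    {"name": "OrderService", "kind": "class", "loc": 1200, "fan_in": 14},
    {"name": "Invoice", "kind": "class", "loc": 150, "fan_in": 3},
]

# View configuration: declarative mapping from model attributes to visuals.
# Editing this dict at runtime reconfigures the view instead of recoding it.
mapping = {
    "size":  lambda e: e["loc"] / 100,          # bigger box = more code
    "color": lambda e: "red" if e["fan_in"] > 10 else "green",
    "label": lambda e: e["name"],
}

def render(model, mapping):
    for entity in model:
        visuals = {prop: fn(entity) for prop, fn in mapping.items()}
        print(visuals)

render(model, mapping)
```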
Quality of Information and Communications Technology | 2010
Anna Wingkvist; Morgan Ericsson; Rüdiger Lincke; Welf Löwe
Technical documentation is now fully taking the step from static printed booklets (or electronic versions of these) to interactive, online versions. This provides opportunities to reconsider how we define and assess the quality of technical documentation. This paper suggests an approach based on the Goal-Question-Metric paradigm: predefined quality goals are continuously assessed and visualized by means of metrics. To test this approach, we perform two experiments. We adopt well-known software analysis techniques, e.g., clone detection and test coverage analysis, and assess the quality of two real-world documentation sets, that of a mobile phone and that of (parts of) a warship. The experiments show that quality issues can be identified and that the approach is promising.
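As a rough illustration of applying clone detection to documentation under a GQM-style goal, the following sketch flags near-duplicate paragraphs; the file layout, similarity measure, and threshold are assumptions for this example, not the paper's implementation:

```python
# Illustrative sketch: a GQM-style check that flags near-duplicate paragraphs
# in documentation, loosely analogous to clone detection on source code.
from difflib import SequenceMatcher
from itertools import combinations
from pathlib import Path

def paragraphs(path):
    return [p.strip() for p in Path(path).read_text().split("\n\n") if p.strip()]

def clone_ratio(paras, threshold=0.9):
    """Metric for the goal 'avoid redundant content': share of paragraph
    pairs whose text similarity exceeds the threshold."""
    pairs = list(combinations(range(len(paras)), 2))
    clones = sum(
        1 for i, j in pairs
        if SequenceMatcher(None, paras[i], paras[j]).ratio() > threshold
    )
    return clones / len(pairs) if pairs else 0.0

docs = paragraphs("user_manual.txt")  # assumed input file
print(f"clone ratio: {clone_ratio(docs):.2%}")
```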
International Conference on Quality Software | 2010
Rüdiger Lincke; Tobias Gutzmann; Welf Löwe
Numerous empirical studies confirm that many software metrics aggregated in software quality prediction models are valid predictors of qualities of general interest, such as maintainability and correctness. Yet even these general quality models differ considerably, which raises the question: do the differences matter? The goal of our study is to answer this question for a selection of quality models that have previously been published in empirical studies. We compare these quality models statistically by applying them to the same set of software systems, altogether 328 versions of 11 open-source software systems. Finally, we draw conclusions from quality assessments using the different quality models, i.e., we calculate a quality trend for each model and compare these conclusions statistically. We identify significant differences among the quality models. Hence, the choice of quality model influences the metrics-based quality assessment of software.
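The following hypothetical sketch shows what applying two different quality models to the same version history and comparing their quality trends could look like; the models, weights, and metric names are invented and are not those evaluated in the study:

```python
# Illustrative sketch: two made-up quality models applied to one system's
# version history; disagreeing trends mean the model choice matters.
import numpy as np
from scipy.stats import linregress

# Per-version metric values of one system (rows = versions).
versions = np.array([
    # avg_complexity, coupling, test_coverage
    [10.0, 0.30, 0.60],
    [11.5, 0.35, 0.58],
    [12.0, 0.40, 0.55],
    [13.2, 0.42, 0.50],
])

def model_a(m):   # weights complexity and coupling (higher score = better)
    return 1.0 - (0.6 * m[:, 0] / 20 + 0.4 * m[:, 1])

def model_b(m):   # rewards test coverage and low coupling instead
    return 0.5 * m[:, 2] + 0.5 * (1.0 - m[:, 1])

for name, model in [("A", model_a), ("B", model_b)]:
    scores = model(versions)
    trend = linregress(range(len(scores)), scores).slope
    print(f"model {name}: quality trend per version = {trend:+.3f}")
```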
International Conference on Software Maintenance | 2006
Dennis Strein; Rüdiger Lincke; Jonas Lundberg; Welf Löwe
Software maintenance tools for program analysis and refactoring rely on a metamodel that captures the relevant properties of programs. However, what is considered relevant may change when the tools are extended with new analyses, refactorings, and programming languages. This paper proposes a language-independent metamodel and an architecture to construct instances thereof, both extensible with new analyses, refactorings, and front-ends for additional programming languages. Due to the loose coupling between analysis, refactoring, and front-end components, new components can be added independently and reuse existing ones. Two maintenance tools implementing the metamodel and the architecture, VIZZANALYZER and X-DEVELOP, serve as proof of concept.
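A minimal sketch of how such loose coupling between front-ends, analyses, and a language-independent metamodel could be organized; the interfaces below are invented for illustration and do not reflect the actual APIs of VIZZANALYZER or X-DEVELOP:

```python
# Illustrative sketch: a language-independent metamodel with pluggable
# front-ends and analyses. All class and method names are assumptions.
from abc import ABC, abstractmethod
from dataclasses import dataclass, field

@dataclass
class Entity:
    """Language-neutral program element (class, method, ...)."""
    kind: str
    name: str
    relations: dict = field(default_factory=lambda: {"calls": [], "contains": []})

class FrontEnd(ABC):
    @abstractmethod
    def parse(self, source: str) -> list[Entity]: ...

class Analysis(ABC):
    @abstractmethod
    def run(self, model: list[Entity]) -> dict: ...

class JavaFrontEnd(FrontEnd):
    def parse(self, source):
        # A real front-end would build the model from an AST.
        return [Entity("class", "Example")]

class CountEntities(Analysis):
    def run(self, model):
        return {"entities": len(model)}

# Front-ends and analyses only meet through the metamodel, so either side
# can be added or replaced without touching the other.
model = JavaFrontEnd().parse("class Example {}")
print(CountEntities().run(model))
```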
International Conference on Business Informatics Research | 2010
Anna Wingkvist; Morgan Ericsson; Welf Löwe; Rüdiger Lincke
When a new system, such as a knowledge management system or a content management system, is put into production, both the software and the hardware are systematically and thoroughly tested, while the main purpose of the system, the information, often lacks systematic testing. In this paper, we study how to extend testing approaches from software and hardware development to information engineering. We define an information quality testing procedure based on test cases and provide tools to support the testing as well as the analysis and visualization of the data collected during testing. Further, we present a feasibility study in which we applied information quality testing to assess the information in a documentation system. The results show promise and have been well received by the companies that participated in the feasibility study.
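As a rough sketch of what test-case-based information quality testing could look like, the following example phrases documentation checks as unit tests; the specific checks and file name are assumptions, not the procedure defined in the paper:

```python
# Illustrative sketch: information quality checks expressed as test cases,
# analogous to unit tests for software. Checks and file name are assumptions.
import re
import unittest
from pathlib import Path

DOC = Path("installation_guide.txt")

class InformationQualityTests(unittest.TestCase):
    def setUp(self):
        self.text = DOC.read_text()

    def test_no_placeholder_text_left(self):
        self.assertNotRegex(self.text, r"TODO|TBD|lorem ipsum")

    def test_every_referenced_figure_exists(self):
        for ref in re.findall(r"Figure (\d+)", self.text):
            self.assertIn(f"Figure {ref}:", self.text,
                          msg=f"Figure {ref} is referenced but never captioned")

if __name__ == "__main__":
    unittest.main()
```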
IEEE Transactions on Software Engineering | 2007
Dennis Strein; Rüdiger Lincke; Jonas Lundberg; Welf Löwe
Software maintenance tools for program analysis and refactoring rely on a metamodel that captures the relevant properties of programs. However, what is considered relevant may change when the tools are extended with new analyses, refactorings, and programming languages. This paper proposes a language-independent metamodel and an architecture to construct instances thereof, both extensible with new analyses, refactorings, and front-ends for additional programming languages. Due to the loose coupling between analysis, refactoring, and front-end components, new components can be added independently and reuse existing ones. Two maintenance tools implementing the metamodel and the architecture, VIZZANALYZER and X-DEVELOP, serve as proof of concept.
International Symposium on Software Testing and Analysis | 2008
Rüdiger Lincke; Jonas Lundberg; Welf Löwe
4th European Conference on Information Management and Evaluation, Lisbon, Sep. 9-10, 2010 | 2010
Anna Wingkvist; Welf Löwe; Morgan Ericsson; Rüdiger Lincke
Annual Software Engineering Workshop | 2005
Thomas Panas; Rüdiger Lincke; Jonas Lundberg; Welf Löwe