John Venable
Curtin University
Publication
Featured research published by John Venable.
Lecture Notes in Computer Science | 2012
John Venable; Jan Pries-Heje; Richard Baskerville
Evaluation is a central and essential activity in conducting rigorous Design Science Research (DSR), yet there is surprisingly little guidance about designing the DSR evaluation activity beyond suggesting possible methods that could be used for evaluation. This paper addresses this problem by extending the notable exception, the existing framework of Pries-Heje et al. [11]. The paper proposes an extended DSR evaluation framework together with a DSR evaluation design method that can guide DSR researchers in choosing an appropriate strategy for evaluating the design artifacts and design theories that form the output of DSR. The extended framework asks the DSR researcher to consider (as input to the choice of evaluation strategy) contextual factors of goals, conditions, and constraints on the DSR evaluation, e.g. the type and level of desired rigor, the type of artifact, the need to support formative development of the designed artifacts, the properties of the artifact to be evaluated, and the constraints on available resources, such as time, labor, facilities, expertise, and access to research subjects. The framework and method support matching these, in the first instance, to one or more DSR evaluation strategies, including the choice of ex ante (prior to artifact construction) versus ex post (after artifact construction) evaluation and naturalistic (e.g., field setting) versus artificial (e.g., laboratory setting) evaluation. Based on the recommended evaluation strategy or strategies, guidance is provided on which methods might be appropriate within the chosen strategy or strategies.
Design Science Research in Information Systems and Technology | 2009
Richard Baskerville; Jan Pries-Heje; John Venable
This paper proposes and evaluates a soft systems approach to design science research. Soft Design Science provides an approach to developing new ways to improve human organizations, especially with consideration for social aspects, through the activities of design, development, instantiation, evaluation, and evolution of a technological artifact. The Soft Design Science approach merges the common design science research process (design, build artifact, evaluate) with the iterative soft systems methodology. The design-build-evaluate process is iterated until the specific requirements are met. The generalized requirements are adjusted as the process continues to keep them aligned with the specific requirements. In the end, the artifact represents a general solution to a class of problems, shown to operate in one instance of that class. The proposed methodology is evaluated by an analysis of how it differs from, and could have informed and improved, a published design science study that used a design-oriented action research method.
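The iterative design-build-evaluate loop the abstract describes can be sketched in code. This is a minimal, hypothetical illustration: the function and parameter names are my own, not an API defined by the paper, and the "feedback may adjust the requirements" step stands in for the soft-systems alignment of generalized and specific requirements.

```python
def soft_design_loop(requirements, build, evaluate, max_iterations=10):
    """Iterate design/build/evaluate until the evaluation reports that the
    specific requirements are met, or the iteration budget runs out.

    `build` and `evaluate` are caller-supplied callables, mirroring the
    idea that the method prescribes the loop, not its contents."""
    artifact = None
    for i in range(max_iterations):
        artifact = build(requirements, artifact)      # (re)design and build
        met, feedback = evaluate(artifact, requirements)
        if met:
            return artifact, i + 1                    # converged
        # Soft-systems flavour: evaluation feedback may revise the
        # requirements themselves before the next iteration.
        requirements = feedback
    return artifact, max_iterations


# Toy usage: each iteration adds one "feature"; evaluation succeeds once
# the artifact has as many features as required.
reqs = {"features": 3}

def toy_build(reqs, artifact):
    artifact = artifact or []
    artifact.append("feature")
    return artifact

def toy_evaluate(artifact, reqs):
    return len(artifact) >= reqs["features"], reqs

artifact, iterations = soft_design_loop(reqs, toy_build, toy_evaluate)
```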
European Journal of Information Systems | 2016
John Venable; Jan Pries-Heje; Richard Baskerville
Evaluation of design artefacts and design theories is a key activity in Design Science Research (DSR), as it provides feedback for further development and (if done correctly) assures the rigour of the research. However, the extant DSR literature provides insufficient guidance on evaluation to enable Design Science Researchers to effectively design and incorporate evaluation activities into a DSR project that can achieve DSR goals and objectives. To address this research gap, this research paper develops, explicates, and provides evidence for the utility of a Framework for Evaluation in Design Science (FEDS) together with a process to guide design science researchers in developing a strategy for evaluating the artefacts they develop within a DSR project. A FEDS strategy considers why, when, how, and what to evaluate. FEDS includes a two-dimensional characterisation of DSR evaluation episodes (particular evaluations), with one dimension being the functional purpose of the evaluation (formative or summative) and the other dimension being the paradigm of the evaluation (artificial or naturalistic). The FEDS evaluation design process comprises four steps: (1) explicate the goals of the evaluation, (2) choose the evaluation strategy or strategies, (3) determine the properties to evaluate, and (4) design the individual evaluation episode(s). The paper illustrates the framework with two examples and provides evidence of its utility via a naturalistic, summative evaluation through its use on an actual DSR project.
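The two FEDS dimensions and the four-step process summarised above lend themselves to a small data-model sketch. The enum, class, and function names below are illustrative choices of mine; the paper defines the concepts (formative/summative, artificial/naturalistic, the four steps), not this API.

```python
from enum import Enum
from dataclasses import dataclass

class Purpose(Enum):
    """Functional purpose of an evaluation episode (first FEDS dimension)."""
    FORMATIVE = "formative"    # feedback to improve the artefact
    SUMMATIVE = "summative"    # judge the (finished) artefact

class Paradigm(Enum):
    """Paradigm of an evaluation episode (second FEDS dimension)."""
    ARTIFICIAL = "artificial"      # e.g. lab experiment, simulation
    NATURALISTIC = "naturalistic"  # e.g. field use by real users

@dataclass
class EvaluationEpisode:
    purpose: Purpose
    paradigm: Paradigm
    properties: list  # artefact properties examined in this episode

def design_feds_strategy(goals, choose_strategy, choose_properties, plan_episodes):
    """The four FEDS steps as a pipeline of caller-supplied decisions:
    step 1 is the explicated `goals`; steps 2-4 are the callables."""
    strategy = choose_strategy(goals)            # step 2: choose strategy
    properties = choose_properties(strategy)     # step 3: properties to evaluate
    return plan_episodes(strategy, properties)   # step 4: design episodes


# Toy usage: a single naturalistic, summative episode (as in the paper's
# own evaluation of FEDS).
plan = design_feds_strategy(
    goals=["assure rigour"],
    choose_strategy=lambda goals: (Purpose.SUMMATIVE, Paradigm.NATURALISTIC),
    choose_properties=lambda strategy: ["effectiveness"],
    plan_episodes=lambda strategy, props: [
        EvaluationEpisode(strategy[0], strategy[1], props)
    ],
)
```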
Journal of Information Technology | 2010
William Newk-Fon Hey Tow; Peter Dell; John Venable
The advent of social networking websites presents further opportunities for criminals to obtain information for use in identity theft, cyber-stalking, and worse. This paper presents research investigating why users of social networking websites willingly disclose personal information and what sorts of information they provide (or not). The study employed an ethnographic approach of participation in the online community and interviews of community members, combined with a quantitative survey. The findings show that users are often simply not aware of the issues or feel that the risk to them personally is very low. The paper develops a preliminary theoretical model to explain the information disclosure phenomenon. It further recommends that government agencies or social networking websites themselves conduct campaigns to inform the public of these issues and that social networking websites consider removing some facilities. The study was conducted in an Australian context and focussed on the popular Facebook website.
DESRIST'10 Proceedings of the 5th International Conference on Global Perspectives on Design Science Research | 2010
John Venable
There is ongoing debate about how the quality (rigour and relevance) of Design Science Research (DSR) should be judged. This research investigates the state of the debate by surveying the opinions of IS scholars who write, review, edit, and publish DSR papers. The survey respondents rated the relative importance of the seven guidelines (often used as evaluation criteria) laid out in Hevner et al. (2004) [6], more specific criteria about the evaluation activity in DSR, criteria concerning IS Design Theories, and miscellaneous other criteria, and made general open-ended comments. The findings indicate a lack of consensus, with much variability in ratings. The Hevner et al. [6] guidelines are largely endorsed, but caution is also raised to apply them less mechanistically than at present. Some criteria/guidelines are seen to be less important at earlier stages of research. Caution is also urged not to expect single papers to fit all criteria/guidelines.
Information & Management | 2014
Michael D. Myers; John Venable
Over the past decade, design science research (DSR) has re-emerged as an important research paradigm in the field of information systems. However, the approaches currently recommended for conducting design science research do not include an ethical component. Thus, the objective of this paper is to initiate a debate about the need for ethical principles for DSR in Information Systems (IS). To launch this debate, we suggest that a set of ethical principles for DSR in IS must be created. Although the interpretation and application of these principles might not always be straightforward, our argument is that all DSR practitioners in IS should devote at least some time to consider ethical principles.
Archive | 2010
Jan Pries-Heje; John Venable; Deborah Bunker; Nancy L. Russo; Janice I. DeGross
This book constitutes the proceedings of the 2010 Joint International Working Conference of the International Federation for Information Processing Working Groups 8.2 and 8.6. Both working groups are part of IFIP Technical Committee 8, the technical committee addressing the field of Information Systems. IFIP WG 8.2, the Interaction of Information Systems and Organizations, was established in 1977. IFIP WG 8.6, Diffusion, Transfer and Implementation of Information Technology, was established in 1994. In accordance with their respective themes, both IFIP WG 8.2 and IFIP WG 8.6 have long had an interest in the human impact of information systems. In December 1998, they held a joint working conference in Helsinki, Finland, on the theme Information Systems: Current Issues and Future Challenges. The two working groups' joint interest in and collaboration on research concerning the human side of IS is continued and extended through this joint working conference, held on the campus of Curtin University of Technology, from March 30 to April 1, 2010, in Perth, Western Australia. This conference, Human Benefit Through the Diffusion of Information Systems Design Science Research, combines the traditional themes of the two working groups with the growing interest within the IS research field in the area of design science research.
CreativeSME | 2009
John Venable
This paper utilises the Critical Systems Heuristics (CSH) framework developed by Werner Ulrich to critically consider the stakeholders and design goals that should be considered as relevant by researchers conducting Design Science Research (DSR). CSH provides a philosophically and theoretically grounded framework and means for critical consideration of the choices of stakeholders considered to be relevant to any system under design consideration. The paper recommends that legitimately undertaken DSR should include witnesses to represent the interests of the future consumers of the outcomes of DSR, i.e., the future clients, decision makers, professionals, and other non-included stakeholders in the future use of the solution technologies to be invented in DSR. The paper further discusses options for how witnesses might be included, who should be witnessed for, and obstacles to implementing the recommendations.
Information Technology & People | 2011
John Venable; Jan Pries-Heje; Deborah Bunker; Nancy L. Russo
Purpose – This paper aims to introduce this special issue of ITP on systems for human benefit (S4HB), to develop and promote the idea of S4HB, and to advocate that more research be conducted on the design and diffusion of S4HB.
Design/methodology/approach – This conceptual paper argues that S4HB are systemically under-researched, based on a historical perspective on IS research, and proposes an agenda for research on the design and diffusion of S4HB.
Findings – The paper identifies extant areas of S4HB, such as health and education, but also advocates that new areas of S4HB be identified and new kinds of S4HB be designed. It further discusses how diffusion is a key issue to the realisation of human benefits and contrasts the diffusion of S4HB with that of more commercial business systems as a motivator for further research. Finally, it sets out a brief agenda for research on S4HB, including: development of a vision for research on S4HB that emphasises design for solving human problems; research on diffusion of S4HB; revision...
Hawaii International Conference on System Sciences | 2011
Richard Baskerville; Jan Pries-Heje; John Venable
Design Science Research (DSR) is a complex form of research that combines very heterogeneous activities requiring different skills, with more elaborate areas of risk to manage. As yet, there is little experience with managing risk in DSR, or even with identifying the types of risk to be managed. This paper analyses DSR activities, elaborates known principles and practices of risk management for DSR, and develops a framework for identifying, assessing, prioritizing, and treating potential risks inherent to DSR. Potential users of the framework include experienced and especially novice DSR researchers. The framework classifies six potential risk areas and enumerates specific key risks within each area. It includes risk assessment and treatment models. Finally, the paper applies the framework to an ongoing DSR case study to provide initial evidence of its value and feasibility.
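The identify/assess/prioritise/treat cycle described above can be sketched as a minimal risk register. This is a hedged illustration only: the field names and the exposure formula (likelihood times impact) are common risk-management conventions, not details taken from the paper, and the paper's six specific risk areas are not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    """One identified risk in a DSR project (identification step)."""
    area: str            # one of the framework's risk areas, e.g. "evaluation"
    description: str
    likelihood: float    # assessed probability, 0..1 (assessment step)
    impact: float        # assessed relative cost if the risk eventuates
    treatment: str = "untreated"  # chosen response (treatment step)

    @property
    def exposure(self) -> float:
        """Conventional exposure score: likelihood x impact."""
        return self.likelihood * self.impact

def prioritise(risks):
    """Prioritisation step: order identified risks by exposure, highest first."""
    return sorted(risks, key=lambda r: r.exposure, reverse=True)


# Toy usage: two assessed risks; the higher-exposure one comes first.
design_risk = Risk("design", "requirements remain unclear", 0.9, 5.0)
eval_risk = Risk("evaluation", "no access to real users", 0.3, 8.0)
ordered = prioritise([eval_risk, design_risk])
```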