Anna Liu
Commonwealth Scientific and Industrial Research Organisation
Publication
Featured research published by Anna Liu.
automated software engineering | 2005
John C. Grundy; Yuhong Cai; Anna Liu
Most distributed system specifications include performance benchmark requirements, for example the number of transactions of a particular kind per second that the system must support. However, determining the likely eventual performance of complex distributed system architectures during their development is very challenging. We describe SoftArch/MTE, a software tool that allows software architects to sketch an outline of their proposed system architecture at a high level of abstraction. These descriptions include client requests, servers, server objects and object services, database servers and tables, and particular choices of middleware and database technologies. A fully working implementation of this system is then automatically generated from the high-level architectural description. This implementation is deployed on multiple client and server machines, and performance tests are then run automatically against the generated code. Performance test results are recorded, sent back to the SoftArch/MTE environment, and displayed to the architect as graphs or as annotations on the original high-level architectural diagrams. Architects may change performance parameters and architecture characteristics, comparing multiple test-run results to determine the most suitable abstractions to refine into detailed designs for actual system implementation. Further tests may be run on refined architecture descriptions at any stage of system development. We demonstrate the utility of our approach and prototype tool, and the accuracy of our generated performance test-beds, for validating architectural choices during early system development.
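To make the described workflow concrete, here is a minimal sketch of the generate-run-report loop the abstract outlines: a high-level architecture description is turned into runnable per-service tests whose results are fed back to the architect. All names (ServiceSpec, generate_testbed, the latency cost model) are illustrative assumptions for exposition, not SoftArch/MTE's actual models or generated code.

# Illustrative sketch only: SoftArch/MTE's real architecture models and code
# generation are far richer. Names and cost model here are assumptions.
import time
import random
from dataclasses import dataclass, field

@dataclass
class ServiceSpec:
    name: str
    mean_latency_ms: float  # assumed cost for the chosen middleware/database

@dataclass
class ArchitectureSpec:
    """High-level sketch: named services a client may request."""
    services: dict[str, ServiceSpec] = field(default_factory=dict)

def generate_testbed(arch: ArchitectureSpec):
    """Stand-in for code generation: returns one callable per service that
    simulates a single client request against the described deployment."""
    def make_call(spec: ServiceSpec):
        def call():
            start = time.perf_counter()
            # Simulated service time drawn from the assumed cost model.
            time.sleep(random.expovariate(1.0 / spec.mean_latency_ms) / 1000.0)
            return (time.perf_counter() - start) * 1000.0
        return call
    return {name: make_call(spec) for name, spec in arch.services.items()}

def run_tests(testbed, requests_per_service=100):
    """Run the generated tests and report mean latency per service, as the
    tool would annotate back onto the architecture diagram."""
    results = {}
    for name, call in testbed.items():
        samples = [call() for _ in range(requests_per_service)]
        results[name] = sum(samples) / len(samples)
    return results

arch = ArchitectureSpec(services={
    "order_entry": ServiceSpec("order_entry", mean_latency_ms=12.0),
    "customer_lookup": ServiceSpec("customer_lookup", mean_latency_ms=5.0),
})
for service, mean_ms in run_tests(generate_testbed(arch)).items():
    print(f"{service}: mean {mean_ms:.1f} ms")

An architect could then adjust a ServiceSpec (say, a slower database choice) and rerun, comparing the two result sets before committing to a detailed design.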
automated software engineering | 2001
John C. Grundy; Yuhong Cai; Anna Liu
Most distributed system specifications include performance benchmark requirements. However, determining the likely performance of complex distributed system architectures during development is very challenging. We describe a system in which software architects sketch an outline of their proposed system architecture at a high level of abstraction, indicating client requests and server services and choosing particular middleware and database technologies. A fully working implementation of this system is then automatically generated, allowing multiple clients and servers to be run. Performance tests are then run automatically against this generated code, and the results are displayed back in the original high-level architectural diagrams. Architects may change performance parameters and architecture characteristics, comparing multiple test-run results to determine the most suitable abstractions to refine into detailed designs for actual system implementation. We demonstrate the utility of this approach and the accuracy of our generated performance test-beds for validating architectural choices during early system development.
asia-pacific software engineering conference | 2010
Liang Zhao; Anna Liu; Jacky Keung
Cloud application platforms such as Microsoft’s Azure, Google’s App Engine, and Amazon’s EC2/SimpleDB/S3 are emerging. Startups and enterprises alike, lured by the promise of ‘infinite scalability’, ‘ease of development’, and ‘low infrastructure setup cost’, are increasingly using these Cloud service building blocks to develop and deploy their web-based applications. However, the precise nature of these Cloud platforms, and the resulting runtime behavior of Cloud applications, is still largely unknown. Given the black-box nature of these platforms and the novel programming and data models of the Cloud, there is a dearth of tools and techniques for rigorously evaluating Cloud platforms at runtime. This paper introduces CARE (Cloud Architecture Runtime Evaluation), a framework for evaluating Cloud application development and runtime platforms. CARE implements a unified interface, using WSDL and REST, to evaluate different Cloud platforms as Cloud application hosting servers and Cloud databases. With this unified interface, we can perform selective high-stress and low-stress evaluations corresponding to desired test scenarios. Results show the effectiveness of CARE in evaluating Cloud platform variations in terms of scalability, availability, and responsiveness, across both compute and storage capabilities, making CARE a useful tool for Cloud computing research.
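The following is a minimal sketch of the kind of high-stress versus low-stress REST evaluation the abstract describes, measuring availability and responsiveness against one unified interface. It is not the CARE framework itself: the endpoint URL, function names, and reported metrics are assumptions chosen for illustration.

# Illustrative CARE-style runtime evaluation sketch; endpoint and metric
# choices below are assumptions, not the paper's actual framework.
import time
import statistics
import urllib.request
from concurrent.futures import ThreadPoolExecutor

ENDPOINT = "http://example.com/api/records"  # hypothetical Cloud-hosted service

def one_request(url: str):
    """Issue a single REST read, recording latency and success."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            ok = 200 <= resp.status < 300
    except Exception:
        ok = False  # timeouts and HTTP errors count against availability
    return (time.perf_counter() - start), ok

def evaluate(url: str, concurrent_clients: int, requests_per_client: int):
    """Run one stress scenario; high and low stress differ only in load."""
    def client(_):
        return [one_request(url) for _ in range(requests_per_client)]
    with ThreadPoolExecutor(max_workers=concurrent_clients) as pool:
        runs = [sample for batch in pool.map(client, range(concurrent_clients))
                for sample in batch]
    latencies = [t for t, ok in runs if ok]
    return {
        "availability": sum(ok for _, ok in runs) / len(runs),
        "median_latency_s": statistics.median(latencies) if latencies else None,
    }

# Low-stress vs. high-stress scenarios against the same unified interface.
print("low :", evaluate(ENDPOINT, concurrent_clients=2, requests_per_client=10))
print("high:", evaluate(ENDPOINT, concurrent_clients=50, requests_per_client=10))

Pointing the same harness at different platforms' hosting servers or database front ends, and comparing how availability and median latency degrade as load rises, mirrors the scalability and responsiveness comparison the paper reports.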
IEEE Software | 2012
John C. Grundy; Gerald Kaefer; Jacky Keung; Anna Liu
Cloud computing is a new paradigm for software systems where applications are divided into sets of composite services hosted on leased, highly distributed platforms. There are many new software engineering challenges in building effective cloud-based software applications. This special issue provides a set of practical contributions to the engineering of cloud computing applications and includes software processes, architecture and design approaches, testing, scalability engineering, security engineering, and applications of highly parallel cloud-based systems.
Archive | 2010
Sadeka Islam; Jacky Keung; Kevin Lee; Anna Liu
Archive | 2013
Kevin Lee; Anna Liu
Archive | 2012
Hiroshi Wada; Anna Liu; Jorke Samuel Odolphi; Kevin Lee
Archive | 2000
John Grundy; Anna Liu
Archive | 2014
Kevin Lee; Jorke Samuel Odolphi; Hiroshi Wada; Anna Liu; Vernon Keith Boland
Archive | 2013
Kevin Lee; Anna Liu