
Publications


Featured research published by Anakreon Mentis.


International Conference on Information, Intelligence, Systems and Applications | 2013

The Sphinx enigma in critical VoIP infrastructures: Human or botnet?

Dimitris Gritzalis; Yannis Soupionis; Vasilios Katos; Ioannis Psaroudakis; Panagiotis Katsaros; Anakreon Mentis

Sphinx was a monster in Greek mythology who devoured those who could not solve her riddle. In VoIP, a new service in the role of Sphinx provides protection against SPIT (Spam over Internet Telephony) by discriminating human callers from botnets. The VoIP Sphinx tool uses audio CAPTCHAs (Completely Automated Public Turing tests to tell Computers and Humans Apart) that are controlled by an anti-SPIT policy mechanism. The design of the Sphinx service has been formally verified for the absence of side-effects on the VoIP services (robustness), as well as for its DoS resistance. We describe the principles and innovations of Sphinx, together with experimental results from pilot use cases.
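The policy-controlled decision of when to challenge a caller with a CAPTCHA can be pictured with a small sketch. This is a hypothetical illustration of rate-based SPIT filtering only, not the actual Sphinx policy mechanism; the threshold, time window, and caller identifier are all invented:

```python
from collections import defaultdict, deque
import time

class AntiSpitPolicy:
    """Decide whether a caller must solve an audio CAPTCHA,
    based on a simple sliding-window call-rate threshold."""

    def __init__(self, max_calls=3, window_seconds=60.0):
        self.max_calls = max_calls
        self.window = window_seconds
        self.calls = defaultdict(deque)  # caller id -> recent call timestamps

    def should_challenge(self, caller_id, now=None):
        now = time.monotonic() if now is None else now
        recent = self.calls[caller_id]
        # Drop timestamps that have fallen outside the sliding window.
        while recent and now - recent[0] > self.window:
            recent.popleft()
        recent.append(now)
        # Bursty callers (likely bots) are challenged with a CAPTCHA.
        return len(recent) > self.max_calls

policy = AntiSpitPolicy(max_calls=3, window_seconds=60.0)
# Four calls in quick succession: only the fourth triggers a challenge.
decisions = [policy.should_challenge("sip:bot@example.com", now=t)
             for t in (0, 1, 2, 3)]
```

A human caller placing occasional calls never crosses the threshold, while an automated dialer does almost immediately, which is the discrimination idea in miniature.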


Concurrency and Computation: Practice and Experience | 2012

Model checking and code generation for transaction processing software

Anakreon Mentis; Panagiotis Katsaros

In modern transaction processing software, the ACID properties (atomicity, consistency, isolation, durability) are often relaxed, in order to address requirements that arise in today's computing environments. Typical examples are the long-running transactions in mobile computing, in service-oriented architectures and in B2B collaborative applications. These new transaction models are collectively known as advanced or extended transactions. Formal specification and reasoning for transaction properties have been limited to proof-theoretic approaches, despite the recent progress in model checking. In this work, we present a model-driven approach for generating a provably correct implementation of the transaction model of interest. The model is specified by state machines for the transaction participants, which are synchronized on a set of events. All possible execution paths of the synchronized state machines are checked for property violations. An implementation for the verified transaction model is then automatically generated. To demonstrate the approach, the specification of nested transactions is verified, because it is the basis for many advanced transaction models.
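The core idea of exhaustively checking the execution paths of event-synchronized state machines can be sketched as follows. The two toy machines, their events, and the checked property are invented for illustration and are far simpler than the transaction models treated in the paper:

```python
from collections import deque

# Each participant is a state machine: {state: {event: next_state}}.
# Machines synchronize on shared events: an event fires only if every
# machine whose alphabet contains it can take it (toy semantics).
coordinator = {"idle": {"begin": "active"},
               "active": {"commit": "done", "abort": "aborted"}}
participant = {"idle": {"begin": "working"},
               "working": {"commit": "done", "abort": "aborted"}}

machines = [coordinator, participant]
events = {e for m in machines for trans in m.values() for e in trans}

def step(states, event):
    """Fire `event` synchronously; return the next global state or None."""
    nxt = []
    for m, s in zip(machines, states):
        alphabet = {e for trans in m.values() for e in trans}
        if event in alphabet:
            trans = m.get(s, {})
            if event not in trans:
                return None  # some machine is not ready: event blocked
            nxt.append(trans[event])
        else:
            nxt.append(s)  # machine does not participate in this event
    return tuple(nxt)

def check(initial, bad):
    """Explore all reachable global states; collect property violations."""
    seen, frontier, violations = {initial}, deque([initial]), []
    while frontier:
        states = frontier.popleft()
        if bad(states):
            violations.append(states)
        for e in events:
            nxt = step(states, e)
            if nxt is not None and nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return violations

# Atomicity-style property: no reachable global state in which one side
# has committed while the other has aborted.
violations = check(("idle", "idle"),
                   lambda s: ("done" in s) and ("aborted" in s))
```

Because `commit` and `abort` are shared events, both machines must take them together, so the mixed outcome is unreachable and the check returns no violations; real model checkers apply the same reachability idea with far richer semantics.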


International Conference on Web Services | 2012

Rigorous Analysis of Service Composability by Embedding WS-BPEL into the BIP Component Framework

Emmanouela Stachtiari; Anakreon Mentis; Panagiotis Katsaros

Behavioral correctness of service compositions refers to the absence of service interaction flaws, so that essential service properties like deadlock freedom are preserved and correctness properties related to safety and liveness are assured. Model checking is a widespread verification technique based on extracting an abstract model of the program that defines a service orchestration or choreography. During model extraction, the original structure of the service composition cannot be preserved, so backwards traceability of the verification findings is not possible. We propose a rigorous analysis within the BIP component framework. Being rigorous means that the analyst is able to reason about which properties hold and why. The BIP language offers a sound execution semantics for a minimal set of primitives and constructs for modeling and composing layered components. We formally define the WS-BPEL 2.0 execution semantics and we provide a structure-preserving translation (embedding) of WS-BPEL to BIP. Structure preservation is feasible, due to the formally grounded expressiveness properties of BIP. As a proof of concept, we apply the developed embedding to a sample BPEL program and present the analysis results for a safety property. By exploiting the BIP model structure, we interpret the analysis findings in terms of the service interactions stated in the BPEL source code. A significant benefit of BIP is that it applies compositional reasoning on the model structure to guarantee essential correctness properties and to avoid, as much as possible, the scalability limitations of conventional model checking.


Information & Software Technology | 2010

Quantification of interacting runtime qualities in software architectures: Insights from transaction processing in client-server architectures

Anakreon Mentis; Panagiotis Katsaros; Lefteris Angelis; George Kakarontzas

Context: Architecture is fundamental for fulfilling requirements related to the non-functional behavior of a software system, such as the quality requirement that response time does not degrade to a point where it is noticeable. Approaches like the Architecture Tradeoff Analysis Method (ATAM) combine qualitative analysis heuristics (e.g. scenarios) for one or more quality metrics with quantitative analyses. A quantitative analysis evaluates a single metric such as response time. However, since quality metrics interact with each other, a change in the architecture can unpredictably affect multiple quality metrics. Objective: This paper introduces a quantitative method that determines the impact of a design change on multiple metrics, thus reducing the risks in architecture design. As a proof of concept, the method is applied to a simulation model of transaction processing in a client-server architecture. Method: Factor analysis is used to unveil latent (i.e. not directly measurable) quality features, represented by new variables that reflect architecture-specific correlations between metrics. Separate Analyses of Variance (ANOVA) are then applied to these variables, for interpreting the tradeoffs detected by factor analysis in terms of the quantified metrics. Results: The results for the examined transaction processing architecture show three latent quality features, the corresponding groups of strongly correlated quality metrics and the impact of architecture characteristics on the latent quality features. Conclusion: The proposed method is a systematic way of relating the variability of quality metrics and the implied tradeoffs to specific architecture characteristics.
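The ANOVA step of such a method can be illustrated with a hand-computed one-way F statistic, testing whether an architecture factor shifts a latent quality score. The factor levels and score values below are invented for illustration, not data from the paper:

```python
def one_way_anova(groups):
    """Return the one-way ANOVA F statistic for k groups of observations."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares: variation explained by the factor.
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares: residual variation inside each group.
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    ms_between = ss_between / (k - 1)   # k - 1 degrees of freedom
    ms_within = ss_within / (n - k)     # n - k degrees of freedom
    return ms_between / ms_within

# Hypothetical latent "throughput quality" scores under two levels of an
# architecture factor (say, two concurrency control protocols).
pessimistic = [0.61, 0.58, 0.64, 0.60]
optimistic  = [0.74, 0.71, 0.77, 0.73]
f_stat = one_way_anova([pessimistic, optimistic])
# A large F indicates the factor explains much of the score's variance.
```

Applied to a latent variable from factor analysis rather than a raw metric, a significant F attributes a whole group of correlated metrics, not just one, to the design decision.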


International Conference on Simulation Tools and Techniques for Communications, Networks and Systems | 2008

ACID Sim Tools: A Simulation Framework for Distributed Transaction Processing Architectures

Anakreon Mentis; Panagiotis Katsaros; Lefteris Angelis

Network-centric information systems implement highly distributed architectures that usually include multiple application servers. Application design is mainly based on fundamental object-oriented principles, and the adopted architecture matches the logical decomposition of applications (into several tiers, such as presentation, logic and data) to their software and hardware structuring. The provided recovery solutions ensure at-most-once processing of service requests by an existing transaction processing infrastructure. However, published work on the performance evaluation of transaction processing focuses on the computational model of database servers. Moreover, no tools are available for exploring the performance and availability trade-offs that arise when applying different combinations of concurrency control, atomic commit and recovery protocols. This paper introduces ACID Sim Tools, a publicly available tool and, at the same time, an open source framework for interactive and batch-mode simulation of transaction processing architectures that adopt the basic assumptions of an object-based computational model.


International Conference on High Performance Computing and Simulation | 2009

The ACID model checker and code generator for transaction processing

Anakreon Mentis; Panagiotis Katsaros

Traditional transaction processing aims at delivering the ACID properties (Atomicity, Consistency, Isolation, Durability), which are nowadays often relaxed, due to the need for transaction models that suit modern computing environments and workflow management applications. Typical examples are the requirements of long-running transactions in mobile computing or on the web, as well as the requirements of business-to-business collaborative applications. However, there is a lack of tools for automatically verifying the correctness of transaction model implementations. This work presents the ACID Model Checker and Code Generator, which plays a vital role in developing correct simulation models for the ACID Sim Tools environment. In essence, our contribution introduces an approach for automatically generating provably correct implementations of transaction management, for the transaction model of interest.


Software Engineering and Advanced Applications | 2009

Synthetic Metrics for Evaluating Runtime Quality of Software Architectures with Complex Tradeoffs

Anakreon Mentis; Panagiotis Katsaros; Lefteris Angelis

Runtime quality of software, such as availability and throughput, depends on architectural factors and execution environment characteristics (e.g. CPU speed, network latency). Although the specific properties of the underlying execution environment are unknown at design time, the software architecture can be used to assess the inherent impact of the adopted design decisions on runtime quality. However, the design decisions that arise in complex software architectures exhibit non-trivial interdependencies. This work introduces an approach that discovers the most influential factors by exploiting the correlation structure of the analyzed metrics via factor analysis of simulation data. A synthetic performance metric is constructed for each group of correlated metrics. The variability of these metrics summarizes the combined factor effects; hence, it is easier to assess the impact of the analyzed architecture decisions on the runtime quality. The approach is applied on experimental results obtained with the ACID Sim Tools framework for simulating transaction processing architectures.


Lecture Notes in Computer Science | 2014

Secure Migration of Legacy Applications to the Web

Zisis Karampaglis; Anakreon Mentis; Fotios Rafailidis; Paschalis Tsolakidis; Apostolos Ampatzoglou

In software engineering, migration of an application is the process of moving the software from one execution platform to another. Nowadays, many desktop applications tend to migrate to the web or to the cloud. Desktop applications are not prepared to face the hostile environment of the web, where applications frequently receive harmful data that attempts to exploit program vulnerabilities such as buffer overflows. We propose a migration process for desktop applications with a text-based user interface, which mitigates existing security concerns and enables the software to operate safely on the web without modifying its source code. Additionally, we describe an open source tool that facilitates our migration process.


Simulation Modelling Practice and Theory | 2012

A simulation process for asynchronous event processing systems: Evaluating performance and availability in transaction models

Anakreon Mentis; Panagiotis Katsaros; Lefteris Angelis

Simulation is essential for understanding the performance and availability behavior of complex systems, but there are significant difficulties when trying to simulate systems with multiple components that interact through asynchronous communication. A systematic process is needed in order to cope with the complexity of asynchronous event processing and the failure semantics of the interacting components. We address this problem by introducing an approach that combines formal techniques for faithful representation of the complex system effects with a statistical analysis for simultaneously studying and interpreting multiple simulation outcomes. Our process has been successfully applied to a synthetic workload for distributed transaction processing. We outline the steps followed towards generating a credible simulation model, and subsequently we report and interpret the results of the applied statistical analysis. This serves as a proof of concept that the proposed simulation process can also be effective in other asynchronous system contexts, such as distributed group communication systems, file systems and so on.
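A minimal discrete-event simulation loop of the kind underlying such studies can be sketched as follows. The component roles, distributions and parameters here are invented for illustration and are much simpler than the transaction workloads in the paper:

```python
import heapq
import random

def simulate(n_jobs=100, arrival_mean=1.0, service_mean=0.8, seed=7):
    """Toy M/M/1-style simulation: clients submit requests asynchronously,
    one server processes them in arrival order."""
    rng = random.Random(seed)
    events = []   # priority queue of (time, sequence, kind, job id)
    seq = 0
    t = 0.0
    # Schedule all arrivals with exponential inter-arrival times.
    for job in range(n_jobs):
        t += rng.expovariate(1.0 / arrival_mean)
        heapq.heappush(events, (t, seq, "arrive", job)); seq += 1
    server_free_at = 0.0
    completion = {}
    while events:
        now, _, kind, job = heapq.heappop(events)
        if kind == "arrive":
            # The server handles one request at a time; later arrivals wait.
            start = max(now, server_free_at)
            server_free_at = start + rng.expovariate(1.0 / service_mean)
            heapq.heappush(events, (server_free_at, seq, "done", job)); seq += 1
        else:
            completion[job] = now  # record when this job finished
    return completion

completion = simulate()
```

Failure semantics, multiple interacting components, and the statistical treatment of many output metrics are exactly what make the full process in the paper substantially harder than this skeleton.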

Collaboration


Dive into Anakreon Mentis's collaborations.

Top Co-Authors

Panagiotis Katsaros, Aristotle University of Thessaloniki
Lefteris Angelis, Aristotle University of Thessaloniki
Dimitris Gritzalis, Athens University of Economics and Business
Emmanouela Stachtiari, Aristotle University of Thessaloniki
Fotios Rafailidis, Aristotle University of Thessaloniki
George Kakarontzas, Aristotle University of Thessaloniki
Ioannis Psaroudakis, Democritus University of Thrace
Yannis Soupionis, Athens University of Economics and Business
Zisis Karampaglis, Aristotle University of Thessaloniki