Mohsen Rouached
Taif University
Publications
Featured research published by Mohsen Rouached.
world congress on services | 2012
Mohsen Rouached; Sana Baccar; Mohamed Abid
Due to the large number of sensor manufacturers and their differing protocols, integrating diverse sensors into observation systems is not straightforward. A coherent infrastructure is needed to treat sensors in an interoperable, platform-independent, and uniform way. The concept of the Sensor Web reflects such an infrastructure for automatically discovering and accessing appropriate information from heterogeneous sensor devices over the Internet. In this context, the Open Geospatial Consortium (OGC) established the Sensor Web Enablement (SWE) initiative, which specifies standard interfaces and encodings to remotely access, encode, and exchange sensed data. However, the SWE standards have several gaps that limit their ability to realize the Sensor Web vision. In this paper, we address problems related to the data format and the architectural style adopted by implementations of the SWE services. Specifically, we propose adopting the lightweight Representational State Transfer (REST) style for the Web services, and using the JavaScript Object Notation (JSON) format, instead of the verbose XML format, for the exchanged messages.
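The size argument behind the JSON-versus-XML proposal can be illustrated with a minimal sketch. The observation fields and encodings below are hypothetical, not the actual OGC O&M/SWE Common schemas; the point is only that a compact JSON message is smaller than an element-per-field XML one.

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical sensor observation; field names are illustrative only.
observation = {
    "sensorId": "urn:sensor:temp-01",
    "observedProperty": "airTemperature",
    "time": "2012-06-01T12:00:00Z",
    "value": 21.4,
    "uom": "Cel",
}

def to_json(obs):
    """Compact JSON encoding (no whitespace)."""
    return json.dumps(obs, separators=(",", ":"))

def to_xml(obs):
    """Element-per-field XML encoding, as a verbose baseline."""
    root = ET.Element("Observation")
    for key, val in obs.items():
        ET.SubElement(root, key).text = str(val)
    return ET.tostring(root, encoding="unicode")

json_msg = to_json(observation)
xml_msg = to_xml(observation)
# The JSON message is noticeably smaller than the XML one.
print(len(json_msg), len(xml_msg))
```

Repeated over the high-frequency message exchanges typical of sensor networks, this per-message difference is what motivates the lighter encoding.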
International Journal of Information Security and Privacy | 2014
Aymen Akremi; Hassen Sallay; Mohsen Rouached
Intrusion Detection Systems (IDSs) are a core tool for collecting forensically relevant evidentiary data from the network in real or near real time. The emergence of High Speed Networks (HSN) and Service-Oriented Architecture/Web Services (SOA/WS) confronts the IDS with a typical big data management problem: the log files an IDS generates are enormous, making the forensics readiness process tedious and both compute- and memory-intensive. Furthermore, the high rate of false alerts complicates the forensics expert's alert analysis and degrades its performance, efficiency, and ability to select the most relevant evidence for attributing attacks to criminals. In this context, we propose Alert Miner (AM), an intrusion alert classifier that efficiently classifies intrusion alerts in near real time in HSN for Web services. AM uses an outlier detection technique based on an adaptively deduced association rule set to classify the alerts automatically and without human assistance. AM reduces false positive alerts without losing sensitivity (up to 95%) or accuracy (up to 97%). AM thus facilitates the alert analysis process, allowing investigators to focus on the most critical alerts at a near-real-time scale and to postpone less critical alerts for off-line log analysis.
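The general idea of flagging alerts that deviate from frequent patterns can be sketched as follows. This is a deliberately simplified illustration, not Alert Miner's actual algorithm: here a "rule" is just a frequent (signature, target) pattern mined from the stream, and alerts outside the frequent set are treated as outliers worth immediate attention.

```python
from collections import Counter

def classify_alerts(alerts, min_support=0.5):
    """Label each alert 'routine' if its (signature, target) pattern is
    frequent in the stream, 'outlier' otherwise (toy support threshold)."""
    patterns = Counter((a["signature"], a["target"]) for a in alerts)
    total = len(alerts)
    frequent = {p for p, c in patterns.items() if c / total >= min_support}
    return [
        ("routine" if (a["signature"], a["target"]) in frequent else "outlier", a)
        for a in alerts
    ]

# Hypothetical alert stream: three repeats of one pattern, one rare alert.
alerts = [
    {"signature": "sql-injection", "target": "ws-billing"},
    {"signature": "sql-injection", "target": "ws-billing"},
    {"signature": "sql-injection", "target": "ws-billing"},
    {"signature": "xxe-payload", "target": "ws-admin"},
]
for label, alert in classify_alerts(alerts):
    print(label, alert["signature"])
```

The single rare alert is labelled an outlier; in the paper's setting the rule set is adapted continuously rather than recomputed per batch.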
web intelligence | 2013
Mohsen Rouached; Nizar Messai
To exploit the true potential of Web services, it is critical to develop technologies and tools for composing new services from existing ones. To reduce development time and integration effort, this process of service composition requires an effective development environment that makes composing Web services quick and simple; providing one remains a key challenge. While numerous composition approaches have been developed, very little has been done towards providing an Integrated Development Environment that eases the composition process. In this context, this paper introduces a new incremental approach to service composition engineering that considers the composition's entire life cycle, i.e., specifying, composing, verifying, deploying, monitoring, and analyzing, to achieve full governance of the composition.
international conference on web services | 2013
Mohsen Rouached; Hassen Sallay
Since current heterogeneous Intrusion Detection Systems (IDSs) have not been designed to work cooperatively, sharing security information among them poses a serious challenge, especially in large-scale High Speed Network (HSN) environments. Integration becomes even more difficult when the computing and memory costs incurred by communication between high speed IDSs must be kept low. Fortunately, Web services technology is a good fit for IDS integration thanks to characteristics such as platform transparency and loose coupling. In this context, this paper presents a lightweight RESTful communication model for coordinating different distributed high speed IDSs. Experimental results show significant gains in exchanged data size and transmission time.
international conference on digital information management | 2013
Mehedi Masud; Mohsen Rouached
There is growing interest in using Web services as a reliable medium for data sharing among different data providers and users. Recently, enterprises have been using service-oriented architecture for data sharing on the Web by putting data sources behind Web services instead of creating database applications. These Web services are called Data-Providing (DP) Web services. With DP Web services, the challenge is to give a broad spectrum of enterprises the capability to exploit data and information that is normally stored in distributed and heterogeneous information systems. This paper introduces a model of a Web service system that integrates distributed data sources and facilitates data sharing through Web services. The Web services are built on top of existing data sources, and the system enables the exchange of data through services. We also discuss service selection and query rewriting techniques for processing queries over data-providing Web systems.
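The service selection and query rewriting idea can be sketched in a few lines. The registry, service names, and attribute vocabulary below are hypothetical: each DP service is modelled by the inputs it requires and the attributes it can return, and a query over the virtual schema is rewritten into a set of service calls that together cover the requested output.

```python
# Hypothetical DP service registry: required inputs and produced outputs.
services = {
    "getPatientInfo": {"inputs": {"patientId"}, "outputs": {"name", "age"}},
    "getLabResults": {"inputs": {"patientId"}, "outputs": {"testName", "result"}},
}

def rewrite_query(requested, available_inputs):
    """Greedily select invocable DP services until the requested
    attributes are covered; fail if the query is not answerable."""
    plan, covered = [], set()
    for name, sig in services.items():
        if sig["inputs"] <= available_inputs and sig["outputs"] & (requested - covered):
            plan.append(name)
            covered |= sig["outputs"]
    if not requested <= covered:
        raise ValueError("query not answerable with available services")
    return plan

# "Given a patientId, return name and result" spans two DP services.
print(rewrite_query({"name", "result"}, {"patientId"}))
```

A real rewriting would also order calls by data dependencies (outputs of one service feeding inputs of another); the greedy cover above only captures the selection step.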
business information systems | 2016
Sana Baccar; Mohsen Rouached; Mohamed Abid
Service-oriented architectures (SOA) foster the integration of different technologies and platforms coming from various enterprises, and bring a new level of flexible modularity that can guarantee end-to-end quality of service. However, a current bottleneck in modelling compositions in SOA is the expert-level skill needed to achieve such a composition. This is mainly due to the imperative programming paradigm these compositions are based on: a language such as BPEL is clearly an expert language, and specifying and programming a composition with BPEL is a lengthy, costly, and high-risk process. To overcome this limitation, we propose in this paper a declarative approach to modelling services and service compositions. The approach relies on a capability-based service specification, powered by reasoning techniques, to handle both functional and non-functional requirements and highly expressive interaction models without over-specifying them. It supports flexible, self-managed compositions that can adapt to changes that may happen continuously and unpredictably.
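The declarative-versus-imperative contrast can be made concrete with a toy sketch. The registry, capability names, and QoS vocabulary below are invented for illustration: instead of an imperative BPEL-style process wiring specific partners together, the request states required capabilities and constraints, and any service whose advertised capabilities satisfy them can be selected.

```python
# Hypothetical service registry with advertised capabilities and QoS.
registry = [
    {"name": "FastShip", "capabilities": {"shipping"}, "qos": {"latency_ms": 120}},
    {"name": "BulkShip", "capabilities": {"shipping", "tracking"}, "qos": {"latency_ms": 400}},
]

def select(required, constraints):
    """Return services whose capabilities include all required ones
    and whose QoS satisfies every declared constraint."""
    return [
        s["name"] for s in registry
        if required <= s["capabilities"]
        and all(pred(s["qos"]) for pred in constraints)
    ]

# Declarative request: "shipping with tracking, latency under 500 ms" --
# no service is named, only what it must be able to do.
print(select({"shipping", "tracking"}, [lambda q: q["latency_ms"] < 500]))
```

Because the request never names a partner, a service that disappears can be replaced transparently by any other match, which is the self-management property the declarative approach targets.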
workshops on enabling technologies: infrastracture for collaborative enterprises | 2015
Ahmed Abid; Nizar Messai; Mohsen Rouached; Thomas Devogele; Mohamed Abid
Classifying Web services into functionally similar groups is an efficient way to enhance service discovery, composition, and substitution. To ensure such efficiency, the classification process should rely on adequate semantic similarity measures. This paper presents a practical approach to measuring the similarity of Web services. Both semantic and syntactic descriptions are integrated through specific techniques for computing similarity measures between services. Formal Concept Analysis (FCA) is then used to classify Web services into concept lattices, facilitating the identification of relevant services for composition and/or substitution. The proposed similarity measure is evaluated and compared to some of the best-known existing ones. Results show a significant improvement in the precision and recall of relevant service discovery for subsequent composition and substitution tasks.
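One common way to combine syntactic and semantic evidence, sketched below, is to blend a token-overlap score on service names with a taxonomy-based concept score. The toy taxonomy, the Wu-Palmer-style formula, and the 50/50 weighting are illustrative assumptions, not the paper's actual measure.

```python
import re

# Toy concept hierarchy: child -> parent (hypothetical domain ontology).
taxonomy = {
    "CityHotel": "Hotel", "Hotel": "Accommodation",
    "Campground": "Accommodation", "Accommodation": "Thing",
}

def depth(concept):
    """Number of steps from the concept up to the root."""
    d = 0
    while concept in taxonomy:
        concept, d = taxonomy[concept], d + 1
    return d

def ancestors(concept):
    result = {concept}
    while concept in taxonomy:
        concept = taxonomy[concept]
        result.add(concept)
    return result

def semantic_sim(c1, c2):
    """Wu-Palmer-style score: 2*depth(lca) / (depth(c1) + depth(c2))."""
    lca_depth = max(depth(c) for c in ancestors(c1) & ancestors(c2))
    return 2 * lca_depth / (depth(c1) + depth(c2))

def syntactic_sim(name1, name2):
    """Jaccard overlap of camelCase tokens in the operation names."""
    t1 = {t.lower() for t in re.findall(r"[A-Za-z][a-z]*", name1)}
    t2 = {t.lower() for t in re.findall(r"[A-Za-z][a-z]*", name2)}
    return len(t1 & t2) / len(t1 | t2)

def service_sim(s1, s2, alpha=0.5):
    """Weighted blend of the syntactic and semantic components."""
    return (alpha * syntactic_sim(s1["name"], s2["name"])
            + (1 - alpha) * semantic_sim(s1["concept"], s2["concept"]))
```

Pairwise scores like these feed the FCA step: thresholding the similarity matrix yields the binary service-attribute context from which the concept lattice is built.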
information integration and web-based applications & services | 2015
Imen Khabou; Mohsen Rouached; Mohamed Abid; Rafik Bouaziz
Web service composition has been extensively studied in recent years. Although many new models and mechanisms have been proposed, many issues in service composition remain unsolved. Among them, privacy is one of the major concerns. Indeed, inherent characteristics of Web services environments, such as high dynamism and untrustworthiness, often generate conflicting privacy specifications with respect to the data exchanged within a composition. Even existing technologies for managing and applying data privacy policies fall short for this kind of application, which involves autonomous entities continuously exchanging huge amounts of heterogeneous information. It is therefore urgent to put in place effective technologies for data privacy management in service compositions. These technologies should (1) deal with the flexibility, scalability, and heterogeneity of the overall infrastructure in which data are exchanged; and (2) integrate privacy concerns into the development process of these compositions. In this context, this paper tackles the problem of modeling, managing, and preserving privacy in Web service composition processes. More specifically, we propose a first step towards a privacy-preserving Web service composition approach that makes it possible to (i) model and specify privacy policies, preferences, and requirements on both the client and provider sides, (ii) enforce the privacy model and build privacy-aware compositions, (iii) verify compliance between users' privacy requirements and providers' privacy policies, (iv) rank composite Web services with respect to the privacy level they offer, and (v) provide privacy-aware recovery actions to deal with incompatibilities.
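Steps (iii) and (iv), compliance checking and privacy-level ranking, can be sketched with a toy policy vocabulary. The attributes (retention days, third-party sharing) and the ranking criterion below are illustrative assumptions, not the paper's model: a composition is compliant only if every provider in it satisfies the client's requirement, and compliant compositions are ordered by a simple privacy score.

```python
def complies(requirement, policy):
    """Does one provider policy satisfy the client's requirement?"""
    return (policy["retention_days"] <= requirement["max_retention_days"]
            and (requirement["allow_third_party"] or not policy["third_party"]))

def rank_compositions(requirement, compositions):
    """Keep compliant compositions, most privacy-preserving first."""
    compliant = [
        c for c in compositions
        if all(complies(requirement, p) for p in c["policies"])
    ]
    # Lower total retention = higher privacy level (toy ranking criterion).
    return sorted(compliant, key=lambda c: sum(p["retention_days"] for p in c["policies"]))

req = {"max_retention_days": 30, "allow_third_party": False}
comps = [
    {"name": "planA", "policies": [{"retention_days": 7, "third_party": False},
                                   {"retention_days": 30, "third_party": False}]},
    {"name": "planB", "policies": [{"retention_days": 5, "third_party": True}]},
    {"name": "planC", "policies": [{"retention_days": 1, "third_party": False}]},
]
print([c["name"] for c in rank_compositions(req, comps)])
# planB is rejected (third-party sharing); planC ranks above planA.
```

The recovery step (v) would hook in where `planB` is rejected, e.g. by substituting a compliant provider rather than discarding the whole composition.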
information integration and web-based applications & services | 2015
Aymen Akremi; Hassen Sallay; Mohsen Rouached; Rafik Bouaziz; Mohamed Abid
Web service composition has been extensively studied in recent years. Although many new models and mechanisms have been proposed, many issues in service composition remain unsolved. Among them, forensics examination is one of the major concerns. As opposed to traditional forensics implementations, applying forensics to Web service infrastructures introduces novel problems, such as the need for neutrality, comprehensiveness, and reliability. Existing approaches fail to recognize that even optimized strategies for service selection and composition involve the exchange of large amounts of potentially sensitive data, causing potentially serious forensics leaks. Consequently, forensics remains among the key challenges hampering service composition-based solutions. In this context, this paper proposes a built-in forensics-aware framework for Web services (Fi4SOA). Fi4SOA uses the Sherwood Applied Business Security Architecture (SABSA) methodology to merge forensics properties with business requirements at the service design phase. It uses a reasoning machine over a newly proposed ontology to define forensics properties and monitor forensics events at run time.
computational intelligence and security | 2015
Aymen Akremi; Hassen Sallay; Mohsen Rouached; Mohamed-Fouad Sriti; Mohamed Abid; Rafik Bouaziz
In this paper, we consider digital forensics in the context of Web services based infrastructures. We propose a built-in forensics-aware framework called Fi4SOA. Fi4SOA uses the Sherwood Applied Business Security Architecture (SABSA) methodology to merge forensics properties with business requirements at the service design phase, and a reasoning machine over a newly proposed ontology to define forensics properties and monitor forensics events at run time.