Yehia Elkhatib
Lancaster University
Publications
Featured research published by Yehia Elkhatib.
International Conference on Computer Communications and Networks | 2012
Gareth Tyson; Sebastian Kaune; Simon Miles; Yehia Elkhatib; Andreas Mauthe; Adel Taweel
A content-centric network is one which supports host-to-content routing, rather than the host-to-host routing of the existing Internet. This paper investigates the potential of caching data at the router-level in content-centric networks. To achieve this, two measurement sets are combined to gain an understanding of the potential caching benefits of deploying content-centric protocols over the current Internet topology. The first set of measurements is a study of the BitTorrent network, which provides detailed traces of content request patterns. This is then combined with CAIDA's ITDK Internet traces to replay the content requests over a real-world topology. Using this data, simulations are performed to measure how effective content-centric networking would have been if it were available to these consumers/providers. We find that larger cache sizes (10,000 packets) can create significant reductions in packet path lengths. On average, 2.02 hops are saved through caching (a 20% reduction), whilst also allowing 11% of data requests to be maintained within the requester's AS. Importantly, we also show that these benefits extend significantly beyond that of edge caching by allowing transit ASes to also reduce traffic.
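The core mechanism evaluated here, per-router caching that shortens request paths, can be illustrated with a minimal sketch. The LRU policy, the three-router path, and the "store on the return path" behaviour below are simplifying assumptions for illustration, not the paper's simulation setup.

```python
from collections import OrderedDict

class LruCache:
    """Fixed-capacity cache with least-recently-used eviction."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.items = OrderedDict()

    def hit(self, key):
        if key in self.items:
            self.items.move_to_end(key)  # refresh recency
            return True
        return False

    def store(self, key):
        self.items[key] = True
        self.items.move_to_end(key)
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict least recently used

def request(path_caches, content):
    """Walk a request along the router path; return hops travelled.

    The request stops at the first cache holding the content; on a
    full miss it travels the whole path to the origin (one extra hop).
    Every cache on the return path stores a copy.
    """
    for hops, cache in enumerate(path_caches, start=1):
        if cache.hit(content):
            break
    else:
        hops = len(path_caches) + 1  # miss everywhere: reach the origin
    for cache in path_caches[:hops]:
        cache.store(content)
    return hops

# Toy workload: a 3-router path, repeated requests for one object.
path = [LruCache(capacity=10) for _ in range(3)]
first = request(path, "chunk-42")   # cold miss: travels 4 hops to the origin
second = request(path, "chunk-42")  # served by the first router: 1 hop
print(first, second)
```

Repeated requests are absorbed close to the requester, which is the effect behind both the hop savings and the traffic kept within the requester's AS.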
ACM Special Interest Group on Data Communication | 2013
Panagiotis Georgopoulos; Yehia Elkhatib; Matthew Broadbent; Mu Mu; Nicholas J. P. Race
Video streaming is an increasingly popular way to consume media content. Adaptive video streaming is an emerging delivery technology which aims to increase user QoE and maximise connection utilisation. Many implementations naively estimate bandwidth from a one-sided client perspective, without taking into account other devices in the network. This behaviour results in unfairness and could potentially lower QoE for all clients. We propose an OpenFlow-assisted QoE Fairness Framework that aims to fairly maximise the QoE of multiple competing clients in a shared network environment. By leveraging a Software Defined Networking technology, such as OpenFlow, we provide a control plane that orchestrates this functionality. The evaluation of our approach in a home networking scenario introduces user-level fairness and network stability, and illustrates the optimisation of QoE across multiple devices in a network.
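The fairness objective can be sketched as a controller-side allocation: instead of each client estimating bandwidth for itself, a central policy assigns bitrate ladder rungs under the shared link's capacity. The max-min upgrade loop and the bitrate ladders below are hypothetical stand-ins, not the framework's actual algorithm.

```python
def fair_allocate(ladders, capacity):
    """Max-min fair bitrate selection for clients sharing one link.

    ladders: per-client bitrate ladders (ascending, in Mbps).
    Start every client at its lowest rung, then repeatedly upgrade
    whichever client currently has the lowest bitrate, as long as the
    link can still carry the total.
    """
    choice = [0 for _ in ladders]  # index of chosen rung per client
    rate = lambda: sum(l[c] for l, c in zip(ladders, choice))
    if rate() > capacity:
        raise ValueError("link cannot carry even the lowest rungs")
    while True:
        # clients that still have a higher rung available
        candidates = [i for i, (l, c) in enumerate(zip(ladders, choice))
                      if c + 1 < len(l)]
        if not candidates:
            break
        i = min(candidates, key=lambda i: ladders[i][choice[i]])
        step = ladders[i][choice[i] + 1] - ladders[i][choice[i]]
        if rate() + step > capacity:
            break  # next upgrade would exceed the link
        choice[i] += 1
    return [l[c] for l, c in zip(ladders, choice)]

# Two phones and a TV share a 10 Mbps link (hypothetical ladders).
print(fair_allocate([[1, 2, 4], [1, 2, 4], [2, 4, 8]], capacity=10))
```

Contrast with the one-sided case the abstract criticises: greedy clients would oscillate and starve one another, whereas a controller with a network-wide view can pick a stable, fair operating point.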
Environmental Modelling and Software | 2015
Claudia Vitolo; Yehia Elkhatib; Dominik E. Reusser; C. J. A. Macleod; Wouter Buytaert
Recent evolutions in computing science and web technology provide the environmental community with continuously expanding resources for data collection and analysis that pose unprecedented challenges to the design of analysis methods, workflows, and interaction with data sets. In the light of the recent UK Research Council funded Environmental Virtual Observatory pilot project, this paper gives an overview of currently available implementations related to web-based technologies for processing large and heterogeneous datasets and discusses their relevance within the context of environmental data processing, simulation and prediction. We found that the processing of the simple datasets used in the pilot proved to be relatively straightforward using a combination of R, RPy2, PyWPS and PostgreSQL. However, the use of NoSQL databases and more versatile frameworks such as OGC standard based implementations may provide a wider and more flexible set of features that particularly facilitate working with larger volumes and more heterogeneous data sources. Highlights: We review web service related technologies to manage, transfer and process Big Data. We examine international standards and related implementations. Many existing algorithms can be easily exposed as services and cloud-enabled. The adoption of standards facilitates the implementation of workflows. The use of web technologies to tackle environmental issues is acknowledged worldwide.
IFIP Networking Conference | 2014
Yehia Elkhatib; Gareth Tyson; Michael Welzl
HTTP is a successful Internet technology on top of which much of the web resides. However, limitations with its current specification have encouraged some to look for the next generation of HTTP. In SPDY, Google has come up with such a proposal that has growing community acceptance, especially after being adopted by the IETF HTTPbis-WG as the basis for HTTP/2.0. SPDY has the potential to greatly improve web experience with little deployment overhead, but we still lack an understanding of its true potential in different environments. This paper offers a comprehensive evaluation of SPDY's performance using extensive experiments. We identify the impact of network characteristics and website infrastructure on SPDY's potential page loading benefits, finding that these factors are decisive for an optimal SPDY deployment strategy. Through exploring such key aspects that affect SPDY, and accordingly HTTP/2.0, we feed into the wider debate regarding the impact of future protocols.
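Why network characteristics and site structure are decisive can be seen from a back-of-envelope latency model: HTTP/1.1 pays request-response rounds per batch of objects, while SPDY multiplexes all requests over one connection. The round-counting below is a deliberately crude sketch (it ignores bandwidth, server think time, and prioritisation), not the paper's measurement methodology.

```python
import math

def h1_load_time(n_objects, rtt_ms, parallel_conns=6):
    """Crude HTTP/1.1 model: each of k connections fetches objects one
    at a time, so the critical path is ceil(n / k) request-response
    rounds, plus one round for connection setup."""
    rounds = math.ceil(n_objects / parallel_conns)
    return (1 + rounds) * rtt_ms

def spdy_load_time(n_objects, rtt_ms):
    """Crude SPDY model: one connection, all requests multiplexed into
    a single round after setup, regardless of object count."""
    return 2 * rtt_ms

# The gap widens with object count and RTT (times in milliseconds).
for n in (6, 30, 60):
    print(n, h1_load_time(n, 100), spdy_load_time(n, 100))
```

Under this model a 60-object page over a 100 ms path drops from 1100 ms to 200 ms, which is why high-RTT links and object-heavy sites are where multiplexing pays off most, and why deployment strategy depends on both.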
F1000Research | 2017
Jonathan P. Tennant; Jonathan M. Dugan; Daniel Graziotin; Damien Christophe Jacques; François Waldner; Daniel Mietchen; Yehia Elkhatib; Lauren Brittany Collister; Christina K. Pikas; Tom Crick; Paola Masuzzo; Anthony Caravaggi; Devin R. Berg; Kyle E. Niemeyer; Tony Ross-Hellauer; Sara Mannheimer; Lillian Rigling; Daniel S. Katz; Bastian Greshake Tzovaras; Josmel Pacheco-Mendoza; Nazeefa Fatima; Marta Poblet; Marios Isaakidis; Dasapta Erwin Irawan; Sébastien Renaut; Christopher R. Madan; Lisa Matthias; Jesper Nørgaard Kjær; Daniel Paul O'Donnell; Cameron Neylon
Peer review of research articles is a core part of our scholarly communication system. In spite of its importance, the status and purpose of peer review is often contested. What is its role in our modern digital research and communications infrastructure? Does it perform to the high standards with which it is generally regarded? Studies of peer review have shown that it is prone to bias and abuse in numerous dimensions, frequently unreliable, and can fail to detect even fraudulent research. With the advent of Web technologies, we are now witnessing a phase of innovation and experimentation in our approaches to peer review. These developments prompted us to examine emerging models of peer review from a range of disciplines and venues, and to ask how they might address some of the issues with our current systems of peer review. We examine the functionality of a range of social Web platforms, and compare these with the traits underlying a viable peer review system: quality control, quantified performance metrics as engagement incentives, and certification and reputation. Ideally, any new systems will demonstrate that they out-perform current models while avoiding as many of the biases of existing systems as possible. We conclude that there is considerable scope for new peer review initiatives to be developed, each with their own potential issues and advantages. We also propose a novel hybrid platform model that, at least partially, resolves many of the technical and social issues associated with peer review, and can potentially disrupt the entire scholarly communication system. Success for any such development relies on reaching a critical threshold of research community engagement with both the process and the platform, and therefore cannot be achieved without a significant change of incentives in research environments.
European Conference on Computer Systems | 2013
Yehia Elkhatib; Gordon S. Blair; Bholanathsingh Surajbali
Environmental science is often fragmented: data is collected using mismatched formats and conventions, and models are misaligned and run in isolation. Cloud computing offers considerable potential to resolve such issues by supporting data from different sources and at various scales, by facilitating the integration of models to create more sophisticated software services, and by providing a sustainable source of suitable computational and storage resources. In this paper, we highlight some of our experiences in building the Environmental Virtual Observatory pilot (EVOp), a tailored cloud-based infrastructure and associated web-based tools designed to enable users from different backgrounds to access data concerning different environmental issues. We review our architecture design, the current deployment and prototypes. We also reflect on lessons learned. We believe that such experiences are of benefit to other scientific communities looking to assemble virtual observatories or similar virtual research environments.
Local Computer Networks | 2009
Thomas Bocek; Fabio Victora Hecht; David Hausheer; Burkhard Stiller; Yehia Elkhatib
Incentive schemes in Peer-to-Peer (P2P) networks are necessary to discourage free-riding. One example is the Tit-for-Tat (TFT) incentive scheme, a variant of which is used in BitTorrent to encourage peers to upload. TFT uses data from local observations making it suitable for systems with direct reciprocity. This paper presents CompactPSH, an incentive scheme that works with direct and indirect reciprocity. CompactPSH allows peers to establish indirect reciprocity by finding intermediate peers, thus enabling trade with more peers and capitalizing on more resources. CompactPSH finds transitive paths while keeping the overhead of additional messages low. In a P2P file-sharing scenario based on input data from a large BitTorrent tracker, CompactPSH was found to exploit more reciprocity than TFT, enabling more chunks to be downloaded. As a consequence, peers can be stricter in fighting white-washing without compromising performance.
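The difference between direct and indirect reciprocity can be shown with a toy credit ledger. The one-hop path search below is only loosely in the spirit of CompactPSH (the real scheme bounds message overhead and finds longer transitive paths); the ledger, peer names, and function names are all hypothetical.

```python
def can_trade_direct(credit, src, dst):
    """Direct reciprocity (Tit-for-Tat style): src will upload to dst
    only if dst has previously uploaded to src."""
    return credit.get((dst, src), 0) > 0

def can_trade_transitive(credit, src, dst, peers):
    """Indirect reciprocity sketch: also allow the trade if some
    intermediate peer has uploaded to src and been uploaded to by dst,
    so the debt can settle along a one-hop transitive path."""
    if can_trade_direct(credit, src, dst):
        return True
    return any(credit.get((mid, src), 0) > 0 and
               credit.get((dst, mid), 0) > 0
               for mid in peers if mid not in (src, dst))

# credit[(a, b)]: chunks a has uploaded to b (hypothetical ledger).
credit = {("B", "A"): 5, ("C", "B"): 3}
peers = ["A", "B", "C"]
print(can_trade_direct(credit, "A", "C"))             # C never uploaded to A
print(can_trade_transitive(credit, "A", "C", peers))  # but a path exists via B
```

Under pure TFT the A-C trade never happens; the transitive check unlocks it, which is the mechanism behind CompactPSH trading with more peers.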
Adaptive and Reflective Middleware | 2015
Gordon S. Blair; Yérom-David Bromberg; Geoff Coulson; Yehia Elkhatib; Laurent Réveillère; Heverson Borba Ribeiro; Etienne Rivière; François Taïani
The world's computing infrastructure is increasingly differentiating into self-contained distributed systems with various purposes and capabilities (e.g. IoT installations, clouds, VANETs, WSNs, CDNs, etc.). Furthermore, such systems are increasingly being composed to generate systems of systems that offer value-added functionality. Today, however, system of systems composition is typically ad hoc and fragile. It requires developers to possess an intimate knowledge of system internals and low-level interactions between their components. In this paper, we outline a vision and set up a research agenda towards the generalised programmatic construction of distributed systems as compositions of other distributed systems. Our vision, in which we refer uniformly to systems and to compositions of systems as holons, employs code generation techniques and uses common abstractions, operations and mechanisms at all system levels to support uniform system of systems composition. We believe our holon approach could facilitate a step change in the convenience and correctness with which systems of systems can be built, and open unprecedented opportunities for the emergence of new and previously-unenvisaged distributed system deployments, analogous perhaps to the impact the mashup culture has had on the way we now build web applications.
Environmental Modelling and Software | 2015
Sheila Greene; Penny J Johnes; John P. Bloomfield; S. M. Reaney; Russell Lawley; Yehia Elkhatib; Jim E Freer; Nick Odoni; C. J. A. Macleod; Barbara Percy
Anthropogenic impacts on the aquatic environment, especially in the context of nutrients, provide a major challenge for water resource management. The heterogeneous nature of policy relevant management units (e.g. catchments), in terms of environmental controls on nutrient source and transport, leads to the need for holistic management. However, current strategies are limited by current understanding and knowledge that is transferable between spatial scales and landscape typologies. This study presents a spatially-explicit framework to support the modelling of nutrients from land to water, encompassing environmental and spatial complexities. The framework recognises nine homogeneous landscape units, distinct in terms of sensitivity of nutrient losses to waterbodies. The functionality of the framework is demonstrated by supporting an exemplar nutrient model, applied within the Environmental Virtual Observatory pilot (EVOp) cloud cyber-infrastructure. We demonstrate scope for the use of the framework as a management decision support tool and for further development of integrated biogeochemical modelling. Highlights: A high-resolution geospatial biogeochemical framework for modelling nutrient flux. Nitrogen and phosphorus modelled across many spatial scales in the United Kingdom. Upscales from grid to river catchment, regional and national scale. Knowledge transfer from data-rich research catchments to data-poor areas. Many other biogeochemical models can be fitted to the framework.
Network Operations and Management Symposium | 2016
Faiza Samreen; Yehia Elkhatib; Matthew Rowe; Gordon S. Blair
Decision making in cloud environments is quite challenging due to the diversity in service offerings and pricing models, especially considering that the cloud market is an incredibly fast-moving one. In addition, there are no hard and fast rules; each customer has a specific set of constraints (e.g. budget) and application requirements (e.g. minimum computational resources). Machine learning can help address some of the complicated decisions by carrying out customer-specific analytics to determine the most suitable instance type(s) and the most opportune time for starting or migrating instances. We employ machine learning techniques to develop an adaptive deployment policy, providing an optimal match between the customer demands and the available cloud service offerings. We provide an experimental study based on an extensive set of job executions over a major public cloud infrastructure.
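The decision being automated, matching customer constraints against instance offerings using past executions, can be sketched with a trivially simple learner. The per-unit-rate averaging, the instance names, prices, and deadline constraint below are all hypothetical; the paper's actual policy uses more capable machine learning models.

```python
def predict_runtime(history, instance, job_size):
    """Predict runtime (s) for a job on an instance type by averaging
    the per-unit rate observed in past runs on that type."""
    runs = [(size, secs) for inst, size, secs in history if inst == instance]
    rate = sum(secs / size for size, secs in runs) / len(runs)
    return rate * job_size

def pick_instance(history, prices, job_size, deadline):
    """Choose the cheapest instance type predicted to finish in time.

    prices: $/hour per instance type. Returns None if no type is
    predicted to meet the deadline.
    """
    best, best_cost = None, float("inf")
    for inst, price in prices.items():
        secs = predict_runtime(history, inst, job_size)
        cost = price * secs / 3600
        if secs <= deadline and cost < best_cost:
            best, best_cost = inst, cost
    return best

# Past executions: (instance type, input size in GB, runtime in seconds).
history = [("small", 1, 400), ("small", 2, 820),
           ("large", 1, 110), ("large", 2, 190)]
prices = {"small": 0.05, "large": 0.20}
print(pick_instance(history, prices, job_size=4, deadline=1200))
```

With a tight deadline the dearer instance wins despite its price; with a slack one the cheap instance does, which is the kind of customer-specific trade-off the policy adapts to.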