
Publications


Featured research published by Ahmed Patel.


Journal of Network and Computer Applications | 2013

Review: An intrusion detection and prevention system in cloud computing: A systematic review

Ahmed Patel; Mona Taghavi; Kaveh Bakhtiyari; Joaquim Celestino Júnior

The distributed and open structure of cloud computing and services makes it an attractive target for cyber-attacks by intruders. Traditional Intrusion Detection and Prevention Systems (IDPS) are largely ineffective when deployed in cloud computing environments because of the openness and distributed nature of those environments. This paper surveys and explores the latest IDPSs and alarm management techniques, providing a comprehensive taxonomy and investigating possible solutions for detecting and preventing intrusions in cloud computing systems. Considering the desired characteristics of IDPS and cloud computing systems, a list of germane requirements is identified, and four concepts, autonomic computing self-management, ontology, risk management, and fuzzy theory, are leveraged to satisfy these requirements.


IEEE International Conference on Digital Ecosystems and Technologies | 2010

Comparative review study of reactive and proactive routing protocols in MANETs

Shima Mohseni; Rosilah Hassan; Ahmed Patel; Rozilawati Razali

Mobile Ad Hoc Networks (MANETs) are generating a lot of interest due to 3G and 4G activities. The dynamic nature of these networks demands a new set of routing protocols to provide efficient end-to-end communication. Because of the diverse applications that use MANETs, such as battlefield, emergency services, and disaster recovery, MANETs offer many advantages to organizations that need wireless roaming. For efficient and timely use, routing and synchronization are essential, and both are hot research topics in MANETs. This paper concentrates on routing, which is a challenging task for which a huge number of different strategies have been proposed, each claiming to improve on the others. These competing strategies make it difficult to determine which one may perform optimally under different sets of network conditions as defined by their Quality of Service (QoS) offerings. This paper reviews some of the state-of-the-art and most widely investigated MANET routing strategies in the literature. Moreover, a performance comparison of the discussed routing protocol strategies is provided, and suggestions are made for improving the performance of these protocols. The paper concludes by outlining further research aimed at defining an optimal set of strategies to satisfy different types of application domains.


Engineering Applications of Artificial Intelligence | 2014

Cooperative game theoretic approach using fuzzy Q-learning for detecting and preventing intrusions in wireless sensor networks

Shahaboddin Shamshirband; Ahmed Patel; Nor Badrul Anuar; Miss Laiha Mat Kiah; Ajith Abraham

Owing to the distributed nature of denial-of-service attacks, it is tremendously challenging to detect such malicious behavior using traditional intrusion detection systems in Wireless Sensor Networks (WSNs). This paper introduces a game-theoretic method, namely cooperative Game-based Fuzzy Q-learning (G-FQL), which combines a game-theoretic approach with the fuzzy Q-learning algorithm in WSNs. It is a three-player strategy game consisting of sink nodes, a base station, and an attacker. The game is played whenever a victim node in the network receives flooding packets, as in a DDoS attack, beyond a specific alarm threshold. The proposed model implements cooperative defense counter-attack scenarios in which the sink node and the base station operate as rational decision-making players through a game-theoretic strategy. To evaluate the performance of the proposed model, the Low Energy Adaptive Clustering Hierarchy (LEACH) protocol was simulated using the NS-2 simulator. The model is then compared against other soft computing methods, such as a fuzzy logic controller, Q-learning, and fuzzy Q-learning, in terms of detection accuracy, counter-defense, network lifetime and energy consumption, to demonstrate its efficiency and viability. The proposed model's attack detection and defense accuracy yield a greater improvement than the above-mentioned machine learning methods, and it also outperforms the Markovian game-theoretic approach in terms of successful defense rate.
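The abstract does not give G-FQL's update rule, so the following is only a hedged sketch of the generic fuzzy Q-learning step the paper builds on: fuzzy membership degrees over a traffic feature weight a standard Q-learning update across overlapping states. The fuzzy sets, actions, feature, and learning parameters below are all illustrative assumptions, not taken from the paper.

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function with corners a <= b <= c."""
    if x < a or x > c:
        return 0.0
    if x <= b:
        return 1.0 if b == a else (x - a) / (b - a)
    return 1.0 if c == b else (c - x) / (c - b)

# Assumed fuzzy sets over a normalized "packet arrival rate" feature.
FUZZY_SETS = {"low": (0.0, 0.0, 0.5), "medium": (0.2, 0.5, 0.8), "high": (0.5, 1.0, 1.0)}
ACTIONS = ["monitor", "rate_limit", "drop"]  # hypothetical defense actions

# Q-table: one row per fuzzy set, one column per action.
Q = {s: {a: 0.0 for a in ACTIONS} for s in FUZZY_SETS}

def memberships(rate):
    """Degree to which each fuzzy set fires for the observed rate."""
    return {s: triangular(rate, *abc) for s, abc in FUZZY_SETS.items()}

def update(rate, action, reward, next_rate, alpha=0.1, gamma=0.9):
    """Distribute one Q-learning update over the fired fuzzy sets."""
    # Value of the next state: best action under membership-weighted Q-values.
    next_value = max(
        sum(m * Q[s][a] for s, m in memberships(next_rate).items())
        for a in ACTIONS
    )
    for s, m in memberships(rate).items():
        if m > 0.0:
            td_error = reward + gamma * next_value - Q[s][action]
            Q[s][action] += alpha * m * td_error
```

Under this sketch, an observed attack that is rewarded for being dropped gradually raises the Q-value of "drop" in the fuzzy states that fired, which is the mechanism a fuzzy Q-learning defender uses to converge on a counter-attack policy.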


Information Management & Computer Security | 2010

A survey of intrusion detection and prevention systems

Ahmed Patel; Qais Qassim; Christopher Wills

Purpose – The problem of protecting information and data flows has existed from the very first day of information exchange. Various approaches have been devised to protect and transfer such information securely. However, as technology and communications advance and information management systems become more and more powerful and distributed, the problem has taken on new and more complex dimensions and has become a major challenge. The widespread use of wired and wireless communication networks, internet, web applications and computing has increased the gravity of the problem. Organizations are totally dependent on reliable, secure and fault‐tolerant systems, communications, applications and information bases. Unfortunately, serious security and privacy breaches still occur every day, creating an absolute necessity to provide secure and safe information security systems through the use of firewalls, intrusion detection and prevention systems (ID/PSs), encryption, authentication and other hardware and software...


Information Sciences | 2011

Review of pricing models for grid & cloud computing

Parnia Samimi; Ahmed Patel

Distributed system resources have become prevalent in ICT departments to lessen the burden of the huge expenses incurred by very expensive storage and computer systems. Added to this are the continuous introduction and ever-growing evolution of applications from simple to complex, the demand to access huge quantities of data, intensive computations, powerful simulations, and the need to maintain and offer system resources and middleware infrastructure services; doing all of this at an affordable and reasonable price is crucial. Distributed grid and cloud computing resources are currently considered among the best technology options to provide this. They have many similar features and functions, and both are classed as distributed systems. They are capable of offering otherwise unaffordable resources and services at a reasonable price in a mass marketplace. The big question is: what is a reasonable price? How is pricing modeled, and on what kind of economic principles is it based? Many of the issues surrounding these questions are very complex in themselves. This paper provides a comparative review of grid and cloud computing economic and pricing models from which appropriate tariffs and charging models can be chosen to meet particular business objectives. The actual choice depends on many other factors, such as enterprise regulations, tax laws, service level agreements and return on investment, which are very important but outside the scope of this paper. The paper gives the basic core principles and a comparative review of the latest and most appropriate economic and pricing models applicable to grid and cloud computing, in order to propose better models for the future.
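The paper's pricing models are not spelled out in the abstract, but the trade-off it surveys can be illustrated with a minimal sketch comparing two common charging models, a flat-rate subscription versus pay-per-use metering. The rates and the break-even calculation below are hypothetical, chosen only to show why the "reasonable price" question depends on usage.

```python
# Hypothetical tariff parameters, not taken from the paper.
FLAT_RATE = 100.0   # currency units per month, unlimited usage
USAGE_RATE = 0.05   # currency units per CPU-hour consumed

def flat_cost(cpu_hours):
    """Subscription model: cost is independent of usage."""
    return FLAT_RATE

def pay_per_use_cost(cpu_hours):
    """Metered model: cost scales linearly with usage."""
    return USAGE_RATE * cpu_hours

def break_even_hours():
    """Usage level at which both models cost the same."""
    return FLAT_RATE / USAGE_RATE

# A light user is better off metered; a heavy user prefers the flat rate.
light, heavy = 500, 5000
assert pay_per_use_cost(light) < flat_cost(light)
assert pay_per_use_cost(heavy) > flat_cost(heavy)
```

The break-even point (2000 CPU-hours under these assumed rates) is the kind of quantity a charging model comparison must expose before a tariff can be matched to a business objective.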


International Conference on Electrical Engineering and Informatics | 2009

A comparative review of IPv4 and IPv6 for research test bed

Mohd. Khairil Sailan; Rosilah Hassan; Ahmed Patel

The Internet is migrating from IPv4 to IPv6. To determine the features for research test-bed product selection, we compare up-to-date information on IPv4 and IPv6. Currently, IPv6 network penetration is still low but expected to grow, while the IPv4 address pool is projected by the Regional Internet Registries to be exhausted by the end of 2011. Uptake of IPv6 remains low because of the high cost of migrating services from IPv4 to IPv6, the successful use of IPv4 Network Address Translation for intranets, and the unproven return on investment in IPv6 technology. This paper reviews a few migration paths from IPv4 to IPv6 and some of the existing IPv6 products.
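The address-space gap driving this migration, and one of the transition building blocks (IPv4-mapped IPv6 addresses), can be illustrated with Python's standard `ipaddress` module. This example is not from the paper; the addresses used are the reserved documentation ranges.

```python
import ipaddress

# Total address-space sizes of the two protocol families.
v4_space = 2 ** 32    # IPv4: 32-bit addresses
v6_space = 2 ** 128   # IPv6: 128-bit addresses

# Parse one address of each family and inspect basic properties.
v4 = ipaddress.ip_address("192.0.2.1")     # IPv4 documentation range (RFC 5737)
v6 = ipaddress.ip_address("2001:db8::1")   # IPv6 documentation range (RFC 3849)
assert v4.version == 4 and v6.version == 6

# An IPv4-mapped IPv6 address, used by dual-stack transition mechanisms to
# represent IPv4 endpoints inside the IPv6 address space.
mapped = ipaddress.IPv6Address("::ffff:192.0.2.1")

print(v6_space // v4_space)   # how many times larger the IPv6 space is (2**96)
print(mapped.ipv4_mapped)     # recovers the embedded IPv4 address
```

The factor of 2**96 is what makes the exhaustion problem described above disappear after migration, while mapped addresses are one reason IPv4 services can survive inside IPv6 networks during the transition.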


Computer Standards & Interfaces | 2008

Secure and auditable agent-based e-marketplace framework for mobile users

Norleyza Jailani; Noor Faezah Mohd Yatim; Yazrina Yahya; Ahmed Patel; Mazliza Othman

Mobile-agent-based e-marketplace auction trading requires a secure and auditable system supported by a solid framework. In investigating the requirements, it transpired that such a standardised framework is lacking. While mobility helps avoid network latency, particularly in increasing fairness in applications with bounded response times such as auction trading, it nevertheless raises issues concerning security, privacy and trust in protecting personal confidential information, managing and regulating legitimate trading, and processing payments. These issues are of paramount importance and must be taken into consideration when designing a framework for modelling an auditable e-marketplace for mobile users. This also implies an underlying need to provide mobile users with a simple, transparent and unobtrusive user interface. This paper proposes a framework that accommodates these requirements through protocol scenarios and highlights further research that needs to be performed.


Computer Standards & Interfaces | 2011

Application of structured document parsing to focused web crawling

Ahmed Patel; Nikita Schmidt

The performance of a focused, or topic-specific Web robot can be improved by taking into consideration the structure of the documents downloaded by the robot. In the case of HTML, document structure is tree-like, defined by nested document elements (tags) and their attributes. By analysing this structure, a robot may use the text of certain HTML elements to prioritise documents for downloading and thus significantly improve the speed of convergence to a topic. Clear separation of the structure-aware document parser from the download scheduler provides flexibility but requires a standard interface and protocol between the two. The paper discusses such an interface in the context of an experimental Web robot, whose speed of convergence to a topic was observed to increase by a factor of 3 to 8, as measured by the number of documents downloaded to reach a given average relevance score.
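The paper's parser/scheduler interface is not reproduced in the abstract, but the core idea, using the text of certain HTML elements to prioritise downloads, can be sketched as follows. The topic vocabulary, scoring function, and class names are illustrative assumptions; here only anchor text is scored, whereas the robot described above may use other elements as well.

```python
import heapq
from html.parser import HTMLParser

TOPIC_TERMS = {"intrusion", "detection", "security"}  # assumed topic vocabulary

class LinkExtractor(HTMLParser):
    """Collects (url, anchor_text) pairs from an HTML document."""
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href:
            self.links.append((self._href, " ".join(self._text)))
            self._href = None

def relevance(anchor_text):
    """Crude relevance score: topic terms appearing in the anchor text."""
    return len(set(anchor_text.lower().split()) & TOPIC_TERMS)

def prioritise(html):
    """Return extracted URLs ordered by descending anchor-text relevance."""
    parser = LinkExtractor()
    parser.feed(html)
    # Max-heap via negated scores: most topic-relevant links come out first.
    heap = [(-relevance(text), url) for url, text in parser.links]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[1] for _ in range(len(heap))]
```

Feeding the download queue from such a priority order, instead of crawling breadth-first, is what produces the faster convergence to a topic that the paper measures.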


Library Hi Tech News | 2011

Comparative study and review of grid, cloud, utility computing and software as a service for use by libraries

Ahmed Patel; Ali Seyfi; Yiqi Tew; Ayman Jaradat

Purpose – Grid computing, cloud computing (CC), utility computing and software as a service are emerging technologies predicted to result in massive consolidation, as meta‐level computing services, of everything beneath one umbrella in the future. The purpose of this study is to foster the understanding and differentiation of the use of the three aforementioned computing technologies and software as a service by both public and private libraries to meet their expectations and strategic objectives.

Design/methodology/approach – The approach in this study is a review comparing the four computing technologies, with a brief analysis for researching and designing the mind map of a new meta‐level computing service approach, taking into consideration the need for new economic tariff and pricing models as well as service‐level agreements.

Findings – Since it is anticipated that there will likely be consolidation and integration of computing services, a study of these four most advanced computing...


Library Hi Tech | 2011

Evaluation of cheating detection methods in academic writings

Ahmed Patel; Kaveh Bakhtiyari; Mona Taghavi

Purpose – This paper aims to focus on plagiarism and the consequences of anti‐plagiarism services such as Turnitin.com, iThenticate, and PlagiarismDetect.com in detecting the most recent cheating in academic and other writings.

Design/methodology/approach – The most important approach is plagiarism prevention and finding proper solutions for detecting more complex kinds of plagiarism through natural language processing and artificial intelligence self‐learning techniques.

Findings – The research shows that most of the anti‐plagiarism services can be cracked through different methods, and artificial intelligence techniques can help to improve the performance of the detection procedure.

Research limitations/implications – Complete access to the underlying data and plagiarism algorithms is not possible, so the comparison is based only on the outputs from the detection services, which may produce different results on the same inputs.

Practical implications – Academic papers and web pages are increasing over time, and it is very ...

Collaboration


Top co-authors of Ahmed Patel:

Mona Taghavi (National University of Malaysia)
Rodziah Latih (National University of Malaysia)
Kenan Kalajdzic (National University of Malaysia)
Abdullah Mohd Zin (National University of Malaysia)
Qais Qassim (National University of Malaysia)
Ali Seyfi (George Washington University)
Liu Na (National University of Malaysia)