Publication


Featured research published by Mohammed Zaki Ahmed.


Nature | 2008

Scaling laws of marine predator search behaviour

David W. Sims; Emily J. Southall; Nicolas E. Humphries; Graeme C. Hays; Jonathan W. Pitchford; Alex James; Mohammed Zaki Ahmed; Andrew S. Brierley; Mark A. Hindell; David Morritt; Michael K. Musyl; David Righton; Emily L. C. Shepard; Victoria J. Wearmouth; Rory P. Wilson; Matthew J. Witt; Julian D. Metcalfe

Many free-ranging predators have to make foraging decisions with little, if any, knowledge of present resource distribution and availability. The optimal search strategy they should use to maximize encounter rates with prey in heterogeneous natural environments remains a largely unresolved issue in ecology. Lévy walks are specialized random walks giving rise to fractal movement trajectories that may represent an optimal solution for searching complex landscapes. However, the adaptive significance of this putative strategy in response to natural prey distributions remains untested. Here we analyse over a million movement displacements recorded from animal-attached electronic tags to show that diverse marine predators—sharks, bony fishes, sea turtles and penguins—exhibit Lévy-walk-like behaviour close to a theoretical optimum. Prey density distributions also display Lévy-like fractal patterns, suggesting response movements by predators to prey distributions. Simulations show that predators have higher encounter rates when adopting Lévy-type foraging in natural-like prey fields compared with purely random landscapes. This is consistent with the hypothesis that observed search patterns are adapted to observed statistical patterns of the landscape. This may explain why Lévy-like behaviour seems to be widespread among diverse organisms, from microbes to humans, as a ‘rule’ that evolved in response to patchy resource distributions.
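
For illustration of the Lévy-walk model discussed above, the following is a minimal Python sketch (not from the paper; all parameter values are illustrative assumptions) that generates a two-dimensional Lévy-walk trajectory by drawing step lengths from a power-law tail, with the exponent mu ≈ 2 corresponding to the theoretical optimum the abstract refers to.

```python
import numpy as np

def levy_walk(n_steps, mu=2.0, min_step=1.0, seed=None):
    """Generate a 2-D Levy-walk trajectory: step lengths follow a
    power-law tail P(l) ~ l**(-mu), sampled by inverse-CDF from a
    Pareto distribution, with uniform (isotropic) turning angles."""
    rng = np.random.default_rng(seed)
    u = rng.random(n_steps)
    lengths = min_step * (1.0 - u) ** (-1.0 / (mu - 1.0))  # Pareto tail
    angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
    x = np.cumsum(lengths * np.cos(angles))
    y = np.cumsum(lengths * np.sin(angles))
    return x, y

# mu near 2 is the theoretical optimum for sparse, patchily distributed prey
x, y = levy_walk(10_000, mu=2.0, seed=42)
```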


Biology Letters | 2014

Scaling laws of ambush predator 'waiting' behaviour are tuned to a common ecology.

Victoria J. Wearmouth; Matthew J. McHugh; Nicolas E. Humphries; Aurore Naegelen; Mohammed Zaki Ahmed; Emily J. Southall; Andy M. Reynolds; David W. Sims

The decisions animals make about how long to wait between activities can determine the success of diverse behaviours such as foraging, group formation or risk avoidance. Remarkably, for diverse animal species, including humans, spontaneous patterns of waiting times show random ‘burstiness’ that appears scale-invariant across a broad set of scales. However, a general theory linking this phenomenon across the animal kingdom currently lacks an ecological basis. Here, we demonstrate from tracking the activities of 15 sympatric predator species (cephalopods, sharks, skates and teleosts) under natural and controlled conditions that bursty waiting times are an intrinsic spontaneous behaviour well approximated by heavy-tailed (power-law) models over data ranges up to four orders of magnitude. Scaling exponents quantifying ratios of frequent short to rare very long waits are species-specific, being determined by traits such as foraging mode (active versus ambush predation), body size and prey preference. A stochastic–deterministic decision model reproduced the empirical waiting time scaling and species-specific exponents, indicating that apparently complex scaling can emerge from simple decisions. Results indicate temporal power-law scaling is a behavioural ‘rule of thumb’ that is tuned to species’ ecological traits, implying a common pattern may have naturally evolved that optimizes move–wait decisions in less predictable natural environments.
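
As a concrete illustration of fitting a heavy-tailed (power-law) model to waiting-time data, the sketch below uses the standard maximum-likelihood (Hill-type) estimator for the tail exponent; it is not the paper's species-specific analysis, and the synthetic data, x_min and exponent value are assumptions for demonstration only.

```python
import numpy as np

def powerlaw_exponent_mle(waits, x_min):
    """Maximum-likelihood (Hill) estimate of mu for a heavy-tailed
    waiting-time distribution P(t) ~ t**(-mu) over t >= x_min."""
    tail = np.asarray(waits, dtype=float)
    tail = tail[tail >= x_min]
    n = tail.size
    mu_hat = 1.0 + n / np.sum(np.log(tail / x_min))
    std_err = (mu_hat - 1.0) / np.sqrt(n)       # asymptotic standard error
    return mu_hat, std_err

# Synthetic check: waits drawn with mu = 2.5 should give an estimate near 2.5
rng = np.random.default_rng(0)
waits = (1.0 - rng.random(50_000)) ** (-1.0 / 1.5)   # Pareto tail, mu - 1 = 1.5
print(powerlaw_exponent_mle(waits, x_min=1.0))
```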


UKACC International Conference on Control | 2012

Fault detection and diagnosis using Principal Component Analysis of vibration data from a reciprocating compressor

Mohammed Zaki Ahmed; Mabrouka Baqqar; Fengshou Gu; Andrew Ball

This paper investigates the use of time-domain vibration features for the detection and diagnosis of different faults in a multi-stage reciprocating compressor. Principal Component Analysis (PCA) is used to develop a detection and diagnosis framework in which effective diagnostic features are selected from PCA of 14 potential features, and a PCA model-based detection method using Hotelling's T² and Q statistics is subsequently developed to detect various faults, including suction valve leakage, inter-cooler leakage, loose drive belt, and combinations of discharge valve leakage with suction valve leakage, suction valve leakage with intercooler leakage, and discharge valve leakage with intercooler leakage. A study of Q-contributions found two original features, Histogram Lower Bound and Normal Negative Log-Likelihood, which allow full classification of the different simulated faults.
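
As a rough sketch of the PCA-based monitoring idea described above (illustrative only: the 14 compressor features, thresholds and data are not reproduced here, and all variable names are placeholders), a baseline PCA model and the Hotelling's T² and Q statistics for a new observation can be computed as follows.

```python
import numpy as np

def fit_pca_monitor(X_baseline, n_pc=5):
    """Fit a PCA monitoring model on baseline (healthy) feature vectors.
    Returns the mean, scale, retained loadings and eigenvalues needed to
    compute Hotelling's T^2 and Q statistics for new observations."""
    mu = X_baseline.mean(axis=0)
    sd = X_baseline.std(axis=0, ddof=1)
    Xc = (X_baseline - mu) / sd
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)   # PCA via SVD
    P = Vt[:n_pc].T                                     # retained loadings
    lam = s[:n_pc] ** 2 / (Xc.shape[0] - 1)             # retained eigenvalues
    return mu, sd, P, lam

def t2_q_statistics(x, mu, sd, P, lam):
    """T^2 measures variation inside the model subspace; Q (squared
    prediction error) measures residual variation outside it."""
    xs = (x - mu) / sd
    t = P.T @ xs                       # scores on the retained PCs
    T2 = np.sum(t ** 2 / lam)
    residual = xs - P @ t
    Q = residual @ residual
    return T2, Q
```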


IEEE Transactions on Magnetics | 2007

Improved Data Recovery from Patterned Media With Inherent Jitter Noise Using Low-Density Parity-Check Codes

Ioannis T. Ntokas; P.W. Nutter; Cen Jung Tjhai; Mohammed Zaki Ahmed

Patterned magnetic media promises areal densities in excess of 1 Tbit/in² for data storage. However, current imperfect patterning techniques result in a variation in the dimensions and distribution of the fabricated islands. As a result, this variation introduces jitter in the replay waveform that makes data recovery difficult. In this paper, we investigate the use of low-density parity-check (LDPC) codes and iterative decoding for mitigating the effects of lithography jitter and improving the read channel performance in patterned media storage systems. In addition, we show that the adoption of LDPC coding techniques permits an increase in the data storage capability of the medium to approximately 1.6 Tbit/in² with acceptable bit-error-rate performance.
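
To illustrate the iterative-decoding principle behind LDPC codes (in a much simplified form: this is a hard-decision bit-flipping decoder on a toy dense parity-check matrix, not the soft iterative decoder or the jitter channel model used in the paper), a minimal sketch might look as follows.

```python
import numpy as np

def bit_flip_decode(r, H, max_iters=50):
    """Hard-decision bit-flipping decoding: repeatedly flip the bits that
    participate in the largest number of unsatisfied parity checks."""
    x = np.array(r, dtype=int)
    for _ in range(max_iters):
        syndrome = (H @ x) % 2
        if not syndrome.any():
            return x, True                     # all checks satisfied
        unsat_counts = H.T @ syndrome          # failing checks per bit
        x[unsat_counts == unsat_counts.max()] ^= 1
    return x, False

# Toy example: (7,4) Hamming parity-check matrix (dense, so not a real LDPC code)
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
r = np.array([0, 0, 0, 1, 0, 0, 0])            # all-zero codeword with one bit error
print(bit_flip_decode(r, H))                   # recovers the all-zero codeword
```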


International Conference on Conceptual Structures | 2006

Some Results on the Weight Distributions of the Binary Double-Circulant Codes Based on Primes

Cen Jung Tjhai; Martin Tomlinson; R. Horan; Mohammed Zaki Ahmed; Marcel Ambroze

This paper presents a more efficient algorithm to count codewords of given weights in self-dual double-circulant and formally self-dual quadratic double-circulant codes over GF(2). A method of deducing the modular congruence of the weight distributions of the binary quadratic double-circulant codes is proposed. This method is based on that proposed by Mykkeltveit, Lam and McEliece (JPL Tech. Rep., 1972), which was applied to the extended quadratic-residue codes. A useful application of this modular congruence method is to provide independent verification of the weight distributions of the extended quadratic-residue and quadratic double-circulant codes. Using this method in conjunction with the proposed efficient codeword counting algorithm, we are able i) to give the previously unpublished weight distributions of the [76, 38, 12] and [124, 62, 20] binary quadratic double-circulant codes; ii) to provide corrections to the published results on the weight distribution of the binary extended quadratic-residue code of prime 151, and the number of codewords of weights 30 and 32 of the binary extended quadratic-residue code of prime 137; and iii) to prove that the [168, 84, 24] extended quadratic-residue and quadratic double-circulant codes are inequivalent.
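
For context, the brute-force alternative that the paper's efficient algorithm improves upon would enumerate all codewords directly; the sketch below does this for a small pure double-circulant code with generator [I | B] (the first row of B is illustrative, not one of the codes studied), which is only feasible for very small block lengths.

```python
import numpy as np
from itertools import product

def circulant(first_row):
    """Binary circulant matrix whose rows are cyclic shifts of first_row."""
    return np.array([np.roll(first_row, i) for i in range(len(first_row))], dtype=int)

def weight_distribution(first_row):
    """Brute-force weight enumerator of the [2p, p] double-circulant code
    with generator matrix G = [I | B]; cost grows as 2**p."""
    p = len(first_row)
    B = circulant(first_row)
    counts = np.zeros(2 * p + 1, dtype=np.int64)
    for bits in product((0, 1), repeat=p):
        m = np.array(bits, dtype=int)
        codeword = np.concatenate([m, (m @ B) % 2])
        counts[int(codeword.sum())] += 1
    return counts

# Small illustrative example with p = 7 (not one of the codes in the paper)
print(weight_distribution([0, 1, 1, 0, 1, 0, 0]))
```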


IEEE International Magnetics Conference | 2002

Track squeeze using adaptive inter-track interference equalisation

Mohammed Zaki Ahmed; Paul Davey; T. Donnelly; W.W. Clegg

A new maximum likelihood sequence detection (MLSD) technique to reduce the effects of inter-track interference (ITI) is presented in this paper. As track densities increase, ITI and off-track interference (OTI) occur and have detrimental effects on the performance of current MLSD decoders. A solution is to perform MLSD with an adaptive ITI equaliser constructed within the decoder trellis.


IET Communications | 2014

Best binary equivocation code construction for syndrome coding

Ke Zhang; Martin Tomlinson; Mohammed Zaki Ahmed; Marcel Ambroze; Miguel R. D. Rodrigues

Traditionally, codes are designed for an error-correcting system to combat noisy transmission channels and achieve reliable communication. These codes can be used in syndrome coding, but it is shown in this study that the best performance is achieved with codes specifically designed for syndrome coding. From the viewpoint of communication security, the best codes are those which have the highest value of an information secrecy metric, the equivocation rate, for a given code length and code rate, and which are well-packed codes. A code design technique is described which produces the best binary linear codes for the syndrome coding scheme. An efficient recursive method to determine the equivocation rate for the binary symmetric channel and any linear binary code is also presented. A large online database of best equivocation codes for the syndrome coding scheme has been produced using the code design technique, with some examples presented in the study. The presented results show that the best equivocation codes produce a higher level of secrecy for the syndrome coding scheme than almost all best known error-correcting codes. Interestingly, it is revealed that some outstanding best known error-correcting codes are also best equivocation codes.
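
As a minimal sketch of the syndrome coding scheme itself (assuming a systematic parity-check matrix H = [A | I]; the toy Hamming code used here is for illustration and is not one of the best equivocation codes from the database), the secret message is carried as the syndrome of the transmitted word.

```python
import numpy as np

def syndrome_encode(m, H, rng=None):
    """Syndrome coding: the (n-k)-bit secret message m is the syndrome of
    the transmitted word.  A coset representative x0 with H @ x0 = m (mod 2)
    is randomised by adding a random codeword, so the transmitted vector is
    a uniformly chosen member of the coset indexed by m."""
    rng = np.random.default_rng(rng)
    n_minus_k, n = H.shape
    k = n - n_minus_k
    x0 = np.concatenate([np.zeros(k, dtype=int), m])   # valid because H = [A | I]
    A = H[:, :k]
    u = rng.integers(0, 2, size=k)                     # random information bits
    codeword = np.concatenate([u, (A @ u) % 2])        # random member of the code
    return (x0 + codeword) % 2

def syndrome_decode(x, H):
    """Legitimate receiver over the error-free main channel: m = H @ x (mod 2)."""
    return (H @ x) % 2

# Toy example with the (7,4) Hamming code's parity-check matrix in [A | I] form
H = np.array([[1, 1, 0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])
m = np.array([1, 0, 1])                                # 3 secret bits per 7 sent bits
x = syndrome_encode(m, H, rng=0)
assert np.array_equal(syndrome_decode(x, H), m)
```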


Global Information Infrastructure and Networking Symposium | 2013

Secrecy coding for the wiretap channel using best known linear codes

Salah Al-Hassan; Mohammed Zaki Ahmed; Martin Tomlinson

A special case of the wiretap channel is studied and analysed, in which the main channel is error-free and the eavesdropper channel is a binary symmetric channel. The goal of this work is to maximise the equivocation on the eavesdropper side by combining the McEliece cryptosystem technique using Best Known Linear Codes (BKLC) with syndrome coding. It is shown that the communication security is improved as a result. In this paper, two best known linear codes are analysed which increase the equivocation on the eavesdropper side. Two encoding stages are employed: the first stage employs a syndrome coding scheme based on the (23, 12, 7) binary Golay code, and the second stage employs the McEliece cryptosystem technique using BKLC. Analysis shows that the arrangement reduces the information leakage to the eavesdropper compared with previously published schemes.


Journal of Physics: Conference Series | 2012

Fault Detection of Reciprocating Compressors using a Model from Principal Component Analysis of Vibrations

Mohammed Zaki Ahmed; Fengshou Gu; Andrew Ball

Traditional vibration monitoring techniques have found it difficult to determine a set of effective diagnostic features due to the high complexity of the vibration signals originating from the many different impact sources and wide ranges of practical operating conditions. In this paper Principal Component Analysis (PCA) is used for selecting vibration features and detecting different faults in a reciprocating compressor. Vibration datasets were collected from the compressor under the baseline condition and five common faults: valve leakage, inter-cooler leakage, suction valve leakage, loose drive belt combined with intercooler leakage, and loose drive belt combined with suction valve leakage. A model using five PCs has been developed from the baseline data sets, and the presence of faults can be detected by comparing the T² and Q values computed from the features of faulty vibration signals with the corresponding thresholds developed from the baseline data. However, the Q-statistic procedure produces better detection, as it can separate the five faults completely.


Journal of Communications | 2011

Optimal and Suboptimal Multi Antenna Spectrum Sensing Techniques with Master Node Cooperation for Cognitive Radio Systems

Owayed Abdullah Alghamdi; Mohammed Zaki Ahmed

In this paper, we consider the primary user detection problem in cognitive radio systems using multiple antennas at the cognitive radio receiver. An optimal linear-combiner multi-antenna spectrum sensing technique is proposed using the multitaper spectrum estimation method. A suboptimal square-law-combiner multi-antenna technique, using the multitaper method, is also proposed. The probability density functions of the proposed techniques' decision statistics are derived theoretically. Formulae for the probabilities of detection and false alarm are presented using the Neyman-Pearson criterion. Both proposed techniques are also derived for the case where an energy detector is used. Based on our results, we found that the general likelihood ratio detector 1 (GLRD1) and the blind GLRD proposed in the literature require signal-to-noise ratios (SNRs) of 7.5 and 9.6 dB, respectively, to achieve a probability of detection of 99.99% at a 1% false alarm rate with additive white Gaussian noise (AWGN), using 4 antennas and 16 samples for sensing. For our proposed optimal and suboptimal techniques, the required SNRs are found to be 12 and 7.5 dB, respectively, to achieve the same probabilities under the same conditions. This result indicates that although GLRD multi-antenna spectrum sensing techniques are blind in their philosophy, this comes at the expense of their performance. Simulation results that confirm the theoretical work are also presented; both AWGN and Rayleigh flat-fading environments are examined. Finally, a new concept of cooperative spectrum sensing, the master node, is introduced.
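
For readers unfamiliar with multitaper spectrum estimation, the sketch below computes a single-antenna multitaper PSD estimate using DPSS (Slepian) tapers from SciPy; the combining across antennas, the derived decision statistics and the Neyman-Pearson thresholds from the paper are not reproduced, and the signal, taper parameters and threshold here are illustrative assumptions only.

```python
import numpy as np
from scipy.signal.windows import dpss

def multitaper_psd(x, nw=4, n_tapers=7):
    """Multitaper spectrum estimate: average the periodograms of the signal
    windowed by orthogonal DPSS (Slepian) tapers."""
    n = len(x)
    tapers = dpss(n, nw, n_tapers)                 # shape (n_tapers, n)
    eigenspectra = np.abs(np.fft.rfft(tapers * x, axis=1)) ** 2
    return eigenspectra.mean(axis=0)               # simple unweighted average

# Toy example: a weak primary-user tone buried in AWGN at one antenna
rng = np.random.default_rng(1)
n = 256
signal = 0.3 * np.sin(2 * np.pi * 0.2 * np.arange(n)) + rng.standard_normal(n)
psd = multitaper_psd(signal)
occupied = psd.max() > 3 * np.median(psd)          # placeholder threshold, not the paper's
```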

Collaboration


Dive into Mohammed Zaki Ahmed's collaboration.

Top Co-Authors

Marcel Ambroze

University of Plymouth
