Publication


Featured research published by Saad Biaz.


Proceedings of the 1999 IEEE Symposium on Application-Specific Systems and Software Engineering and Technology (ASSET'99) | 1999

Discriminating congestion losses from wireless losses using inter-arrival times at the receiver

Saad Biaz; Nitin H. Vaidya

We present a simple scheme that enables a TCP receiver to distinguish congestion losses from corruption losses. The scheme works in the case where the last hop to the receiver is a wireless link and has the smallest bandwidth among all links on the connection path. We added our mechanism to TCP-Reno to evaluate the performance improvement. We compared our scheme against Ideal TCP-Reno, which is TCP-Reno that can perfectly (but artificially) distinguish between congestion losses and wireless transmission losses. Under favourable conditions, our scheme performs similarly to Ideal TCP-Reno and can lead to significant throughput improvement.
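
For intuition only, here is a minimal sketch of a receiver-side classifier in the spirit of this scheme; the exact threshold rule and the slack parameter are illustrative assumptions, not the paper's precise test.

    def classify_loss(gap, n_missing, t_min, slack=1.0):
        """Guess whether a loss was 'wireless' or 'congestion' from receiver timing.

        gap       : time between the last in-order packet and the packet after the hole
        n_missing : number of packets missing in the hole
        t_min     : minimum inter-arrival time observed so far (illustrative proxy for
                    the transmission time on the bottleneck wireless last hop)
        slack     : tolerance window (assumed parameter)
        """
        # If the gap is about what n_missing back-to-back packets would take on the
        # last hop, the missing packets were likely corrupted on the wireless link.
        expected = (n_missing + 1) * t_min
        if expected <= gap < expected + slack * t_min:
            return "wireless"
        # A much larger (or smaller) gap points to queueing or reordering upstream.
        return "congestion"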


International Conference on Computer Communications and Networks | 1998

Distinguishing congestion losses from wireless transmission losses: a negative result

Saad Biaz; Nitin H. Vaidya

TCP is a popular transport protocol used in the present-day Internet. When packet losses occur, TCP assumes that they are due to congestion and responds by reducing its congestion window. When a TCP connection traverses a wireless link, a significant fraction of packet losses may occur due to transmission errors. TCP responds to such losses by also reducing the congestion window, which results in unnecessary degradation of TCP performance. We define a class of functions, named loss predictors, which a TCP sender may use to guess the actual cause of a packet loss (congestion or transmission error) and take appropriate action. These loss predictors use simple statistics on round-trip times and/or throughput to determine the cause of a packet loss, and we investigate their ability to do so. Unfortunately, our simulation measurements suggest that the three loss predictors we evaluate do not perform well.
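
As a purely illustrative example of this class of functions, a round-trip-time based predictor might attribute a loss to congestion when recent RTT samples sit in the upper part of their observed range; the threshold below is an assumption, not one of the three predictors evaluated in the paper.

    def rtt_loss_predictor(recent_rtts, rtt_min, rtt_max, alpha=0.5):
        """Guess the cause of a packet loss from simple RTT statistics (illustrative)."""
        srtt = sum(recent_rtts) / len(recent_rtts)
        # High RTTs relative to the observed [rtt_min, rtt_max] range suggest that
        # queues are building up, so blame congestion; otherwise blame the wireless link.
        threshold = rtt_min + alpha * (rtt_max - rtt_min)
        return "congestion" if srtt > threshold else "wireless"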


Wireless Communications and Networking Conference | 2005

Towards the performance analysis of IEEE 802.11 in multi-hop ad-hoc networks

Yawen Dai Barowski; Saad Biaz; Prathima Agrawal

The performance of IEEE 802.11 in multi-hop wireless networks depends on the characteristics of the protocol itself and on those of the upper-layer routing protocol. Extensive work has been done to analyze and evaluate the performance of single-hop networks under saturated traffic conditions, either through simulations or mathematical modeling, but little work has addressed the performance of the IEEE 802.11 protocol under the unsaturated traffic conditions that arise in multi-hop networks. This paper proposes analytical models and scenarios for such an analysis. ns-2 simulations with different network configurations validate the proposed models for performance metrics such as throughput, message delay, average queue length, and energy consumption, and show that the models work well. Key to modeling multi-hop networks is accounting for the upper-layer routing protocol, which affects network performance through the way it forwards packets and thereby shapes the traffic load. The proposed model captures this impact by introducing a packet acceptance factor with which each relay station accepts packets from the wireless medium before forwarding them.
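
As a toy illustration of the role such an acceptance factor plays (the formula below is a simplification assumed here, not the paper's analytical model), the traffic a relay offers to the channel is its own load plus the fraction of overheard packets it accepts for forwarding:

    def offered_load(own_rate, overheard_rate, acceptance_factor):
        """Approximate traffic a relay injects into the medium, in packets per second.

        acceptance_factor in [0, 1] models how the routing layer filters which
        overheard packets the relay actually accepts and forwards.
        """
        return own_rate + acceptance_factor * overheard_rate

    # Example: a relay generating 50 pkt/s that forwards 30% of 200 overheard pkt/s
    print(offered_load(50.0, 200.0, 0.3))   # -> 110.0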


International Journal of Mobile Communications | 2005

A survey and comparison on localisation algorithms for wireless ad hoc networks

Saad Biaz; Yiming Ji

In wireless ad hoc networks, node location information is useful for efficient routing and location-aware applications. This paper surveys this active area of research and presents a detailed comparison of most localisation algorithms in the literature. Hardware requirements, reliability, and accuracy are reviewed, and the localisation methods are evaluated and compared under the same network settings.
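
Many of the surveyed range-based methods reduce to some form of multilateration. A minimal least-squares trilateration sketch (the anchor coordinates and ranges are made-up example values) is shown below:

    import numpy as np

    def trilaterate(anchors, dists):
        """Least-squares position estimate from anchor coordinates and measured ranges."""
        anchors = np.asarray(anchors, dtype=float)
        dists = np.asarray(dists, dtype=float)
        x_n, y_n = anchors[-1]
        # Linearize by subtracting the last anchor's circle equation from the others.
        A = 2.0 * (anchors[:-1] - anchors[-1])
        b = (dists[-1] ** 2 - dists[:-1] ** 2
             + anchors[:-1, 0] ** 2 - x_n ** 2
             + anchors[:-1, 1] ** 2 - y_n ** 2)
        pos, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pos

    # A node at roughly (2, 3) ranged by three anchors (example values)
    print(trilaterate([(0, 0), (10, 0), (0, 10)], [3.61, 8.54, 7.28]))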


Wireless Communications and Networking Conference | 2008

Loss Differentiated Rate Adaptation in Wireless Networks

Saad Biaz; Shaoen Wu

Data rate adaptation aims to select the optimal data rate for the current channel conditions, which can lead to substantial performance improvement. This paper proposes a data rate adaptation technique that: (1) exploits the periodic IEEE 802.11 beacons; (2) discriminates between frame losses due to channel fading and those due to collisions, and takes actions appropriate for each type of loss; and (3) recommends and justifies the use of the lowest data rate for the very first retransmission after a frame loss. The last feature, namely retransmitting at the lowest data rate, helps in diagnosing the real cause of a frame loss. Moreover, this work analytically shows that retransmitting at the lowest data rate is more efficient, especially in poor SNR environments or when the cause of a loss (channel degradation or collision) is unknown. This scheme, dubbed loss differentiated rate adaptation (LDRA), is extensively evaluated through simulations and shown to perform better, especially when network traffic is heavy.
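
A highly simplified sketch of the decision logic this describes (the rate table and step rules are illustrative assumptions, not the paper's exact algorithm): collisions leave the data rate untouched, fading losses step it down, and the first retransmission after any loss goes out at the lowest rate to probe the channel.

    RATES_MBPS = [6, 9, 12, 18, 24, 36, 48, 54]   # illustrative 802.11a/g rate set

    def next_rate(current_idx, loss_cause, is_first_retry):
        """Pick the rate index for the next (re)transmission.

        loss_cause is 'fading', 'collision', or None (previous frame succeeded);
        the classification itself comes from a separate loss-differentiation step.
        """
        if loss_cause is None:
            # Success: probe one step up, as many rate-adaptation schemes do.
            return min(current_idx + 1, len(RATES_MBPS) - 1)
        if is_first_retry:
            # Retransmit at the lowest rate; if that also fails, channel fading
            # rather than a collision is the likely culprit.
            return 0
        if loss_cause == "collision":
            # Collisions say nothing about channel quality: keep the current rate.
            return current_idx
        # Fading: back off one rate step.
        return max(current_idx - 1, 0)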


IEEE Internet of Things Journal | 2014

Comparative Investigation on CSMA/CA-Based Opportunistic Random Access for Internet of Things

Chong Tang; Lixing Song; Jagadeesh Balasubramani; Shaoen Wu; Saad Biaz; Qing Yang; Honggang Wang

Wireless communication is indispensable to the Internet of Things (IoT). Carrier sense multiple access with collision avoidance (CSMA/CA) is a well-proven wireless random access protocol that gives every node an equal probability of accessing the wireless channel, which yields equal long-term throughput regardless of channel conditions. To exploit node diversity, i.e. the differences in channel condition among nodes, this paper proposes two opportunistic random access mechanisms, overlapped contention and segmented contention, that favor the node with the best channel condition. In overlapped contention, the contention windows of all nodes share the same lower bound of zero but have different upper bounds depending on channel condition. In segmented contention, the contention-window upper bound for a better channel condition is smaller than the lower bound for a worse channel condition; that is, the contention windows are segmented without any overlap. These algorithms are also refined to provide temporal fairness and avoid starving nodes with poor channel conditions. The proposed mechanisms are analyzed, implemented, and evaluated on a Linux-based testbed and in the NS3 simulator. Extensive comparative experiments show that both opportunistic solutions can significantly improve network throughput, delay, and jitter over the current CSMA/CA protocol. In particular, the overlapped contention scheme offers 73.3% and 37.5% throughput improvements in infrastructure-based and ad hoc networks, respectively.
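
A minimal sketch of how the two contention policies could map a node's channel-quality rank onto a backoff draw (window sizes and the ranking itself are illustrative assumptions, not the paper's parameters):

    import random

    def overlapped_window(rank, base_cw=16, step=16):
        """Overlapped contention: every window starts at 0, but a better channel
        (lower rank) gets a smaller upper bound and so tends to win the backoff."""
        upper = base_cw + rank * step
        return random.randint(0, upper - 1)

    def segmented_window(rank, segment=16):
        """Segmented contention: windows of different ranks do not overlap, so a
        node with a better channel always counts down to zero first."""
        lower = rank * segment
        return random.randint(lower, lower + segment - 1)

    # rank 0 = best channel condition among the contending nodes (illustrative)
    print(overlapped_window(0), overlapped_window(2))
    print(segmented_window(0), segmented_window(2))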


International Conference on Computer Communications and Networks | 2007

Optimal Sniffers Deployment On Wireless Indoor Localization

Yiming Ji; Saad Biaz; Shaoen Wu; Bing Qi

Location determination of indoor mobile users is challenging due to complex and volatile indoor radio propagation. A radio-frequency (RF) based indoor localization system, like RADAR or ARIADNE, typically operates by first constructing a lookup table mapping radio signal strength to known locations in the building; a mobile user's location at an arbitrary point is then determined by measuring the signal strength there and searching the lookup table for the corresponding location. Usually, the mobile's signal strength is measured by three or more sniffers deployed inside the building, and the number of sniffers and their positions greatly affect localization performance. This paper presents a detailed analysis and experimental results that explore the impact of sniffer deployment on indoor localization performance. The results demonstrate that the best localization performance is obtained when the center of gravity of the equilateral triangle formed by three sniffers coincides with that of the floor plan, and that, to provide optimal localization for all positions on a large floor, more than three sniffers must be deployed in a semi-mesh style such that any position in the building is always covered by three nearby sniffers.
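
The lookup-table step described here amounts to a nearest-neighbour search in signal-strength space; a minimal sketch, assuming three sniffers and made-up fingerprint values:

    import math

    # Offline phase: RSS fingerprints (dBm at sniffers A, B, C) for known spots
    radio_map = {
        (1.0, 1.0): (-40, -62, -71),
        (5.0, 1.0): (-55, -48, -69),
        (3.0, 6.0): (-63, -60, -45),
    }

    def locate(measured_rss):
        """Online phase: return the mapped location whose stored fingerprint is
        closest (Euclidean distance in RSS space) to the measured one."""
        return min(radio_map,
                   key=lambda loc: math.dist(radio_map[loc], measured_rss))

    print(locate((-52, -50, -70)))   # -> (5.0, 1.0)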


Symposium on Reliable Distributed Systems | 1998

Tolerating Visitor Location Register failures in mobile environments

Saad Biaz; Nitin H. Vaidya

For mobile users who move frequently but receive relatively few calls, a forwarding scheme has been shown to outperform the standard IS-41 location management scheme. However, the forwarding scheme is more vulnerable to failures of intermediate Visitor Location Registers (VLRs) than IS-41. We propose two simple variations of the forwarding scheme to address this fault-tolerance weakness. The first is based on maintaining two paths from the home location server to the last VLR; the second is based on knowledge of the neighbors of the faulty VLR. We evaluate and compare the performance of these location management schemes.
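
To make the first variation concrete, here is a toy sketch of call delivery that follows the primary chain of forwarding pointers and falls back to a second stored path when an intermediate VLR has failed; the data structures and names are hypothetical, not the paper's specification.

    def find_mobile(paths, alive):
        """Return the VLR currently serving the mobile, or None if no stored path works.

        paths : list of VLR-id chains to try in order, e.g. [primary, backup]
        alive : set of VLR ids that are currently up
        """
        for chain in paths:
            if all(vlr in alive for vlr in chain):
                return chain[-1]       # last VLR in the chain serves the mobile
        return None                    # both paths broken: revert to a full IS-41 lookup

    primary = ["HLR", "vlr1", "vlr2", "vlr3"]
    backup = ["HLR", "vlr2", "vlr3"]
    print(find_mobile([primary, backup], alive={"HLR", "vlr2", "vlr3"}))   # -> vlr3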


Wireless Communications and Networking Conference | 2008

iETT: A Quality Routing Metric for Multi-Rate Multi-Hop Networks

Saad Biaz; Bing Qi

Routing metrics are critical for selecting a path that yields high throughput in multi-rate multi-hop wireless networks. This paper illustrates the weaknesses of existing routing metrics and proposes a new routing metric called Improved Expected Transmission Time (iETT). iETT addresses two key characteristics of a routing path that other routing metrics ignore: (1) a high discrepancy in the packet loss rates of the links and (2) the position of the links with different packet loss rates. By capturing these two characteristics, the iETT metric is able to choose a route with better performance than the shortest-path, Expected Transmission Count (ETX), and Expected Transmission Time (ETT) metrics.
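
The abstract does not give the iETT formula itself, but the baselines it improves on are standard; for reference, a sketch of per-link ETX and ETT and their sum over a path (loss rates, packet size, and link rates are example values):

    def etx(df, dr):
        """Expected transmission count for a link with forward/reverse
        delivery ratios df and dr (standard ETX definition)."""
        return 1.0 / (df * dr)

    def ett(df, dr, pkt_bits, rate_bps):
        """Expected transmission time: ETX scaled by the time one packet
        takes at the link's data rate."""
        return etx(df, dr) * pkt_bits / rate_bps

    # Two-hop path, 1500-byte packets: a clean 54 Mbit/s hop and a lossy 6 Mbit/s hop
    links = [(0.95, 0.95, 1500 * 8, 54e6), (0.60, 0.70, 1500 * 8, 6e6)]
    print(sum(ett(*link) for link in links))   # path ETT in seconds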


Journal of Simulation | 2014

A symbiotic simulation architecture for evaluating UAVs collision avoidance techniques

James Holt; Saad Biaz; Levent Yilmaz; C Affane Aji

Unmanned aerial vehicles (UAVs) currently have many real-world applications, such as surveying, delivering small packages, and military missions. To maintain safe flight, UAVs require collision avoidance to divert them from collisions and other known dangerous paths. Many algorithms have been developed and implemented, but they are seldom compared to each other analytically within a single system. In this paper, an implementation of a symbiotic simulation architecture is described to conduct cyber-physical experiments and to allow different collision avoidance algorithms to be tested in real time. Three algorithms based on mixed-integer linear programming (MILP), sparse A* search, and artificial potential fields (APF) are implemented, and results of computational experiments are presented. MILP provides the most efficient paths when given enough time; A* and APF exhibit similar performance, with APF being the least computationally expensive. On the basis of these results, three hybrid algorithms are proposed for future tests using the embedded simulation architecture.
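
As an illustration of the third approach, a minimal artificial-potential-field step for one vehicle in 2-D (the gains, ranges, and scenario are assumptions, not the paper's implementation): the goal attracts, nearby obstacles repel, and the UAV moves a small step along the combined force.

    import numpy as np

    def apf_step(pos, goal, obstacles, k_att=1.0, k_rep=100.0, d0=5.0, step=0.1):
        """One artificial-potential-field update (illustrative gains)."""
        force = k_att * (goal - pos)                  # attraction toward the goal
        for obs in obstacles:
            diff = pos - obs
            d = np.linalg.norm(diff)
            if 0 < d < d0:                            # repulsion only within range d0
                force += k_rep * (1.0 / d - 1.0 / d0) / d ** 3 * diff
        return pos + step * force / (np.linalg.norm(force) + 1e-9)

    pos = np.array([0.0, 0.0])
    goal = np.array([10.0, 0.0])
    for _ in range(400):
        pos = apf_step(pos, goal, obstacles=[np.array([5.0, 0.5])])
    print(pos)   # ends near (10, 0) after skirting the obstacle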

Collaboration


Saad Biaz's top co-authors and their affiliations:

Shaoen Wu (Ball State University)

Chong Tang (University of Southern Mississippi)