Low Tang Jung
Universiti Teknologi Petronas
Publication
Featured research published by Low Tang Jung.
Wireless Personal Communications | 2014
Tariq Ali; Low Tang Jung; Ibrahima Faye
Providing better communication and maximising the communication performance in an Underwater Wireless Sensor Network (UWSN) is always challenging due to the volatile characteristics of the underwater environment. Radio signals cannot properly propagate underwater, so there is a need for acoustic technology that can support better data rates and reliable underwater wireless communications. Node mobility, 3-D spaces and horizontal communication links are some of the critical challenges for researchers designing new routing protocols for UWSNs. In this paper, we propose a novel routing protocol called Layer by layer Angle-Based Flooding (L2-ABF) to address the issues of continuous node movements, end-to-end delays and energy consumption. In L2-ABF, every node can calculate its flooding angle to forward data packets toward the sinks without using any explicit configuration or location information. The simulation results show that L2-ABF has advantages over existing flooding-based techniques and can easily manage quick routing changes where node movements are frequent.
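To illustrate the flooding-angle idea described in the abstract, the following minimal Python sketch forwards a packet only to neighbours lying inside an upward cone pointing toward the surface sinks. The cone half-angle, coordinate convention and neighbour model are illustrative assumptions, not the exact L2-ABF formulation.

```python
import math

# Hypothetical sketch in the spirit of L2-ABF: a node forwards a packet only to
# neighbours inside a cone (the "flooding angle") aimed from the node toward
# the surface sinks. Values here are illustrative assumptions.

def within_flooding_angle(node, neighbour, half_angle_deg=30.0):
    """Return True if `neighbour` lies inside the upward cone of `node`."""
    dx = neighbour[0] - node[0]
    dy = neighbour[1] - node[1]
    dz = neighbour[2] - node[2]          # +z points toward the surface
    if dz <= 0:                          # never forward downward
        return False
    horizontal = math.hypot(dx, dy)
    angle = math.degrees(math.atan2(horizontal, dz))
    return angle <= half_angle_deg

def forward_targets(node, neighbours, half_angle_deg=30.0):
    """Select the neighbours in the next upper layer that receive the packet."""
    return [n for n in neighbours if within_flooding_angle(node, n, half_angle_deg)]

# Example: a node at 400 m depth with three candidate neighbours.
node = (0.0, 0.0, -400.0)
neighbours = [(10.0, 5.0, -350.0), (200.0, 0.0, -390.0), (0.0, 0.0, -450.0)]
print(forward_targets(node, neighbours))   # only the first neighbour qualifies
```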
2012 International Symposium on Telecommunication Technologies | 2012
Tariq Ali; Low Tang Jung; Sadia Ameer
Providing better communication in UWSNs and maximising the communication performance is a challenging issue due to the volatile characteristics of the underwater environment. Radio signals cannot properly propagate underwater, so an acoustic modem technology is needed that can support better data rates and reliable underwater wireless communications. Furthermore, node mobility and 3-D spaces bring critical challenges for researchers designing new routing protocols for UWSNs that must handle issues such as high end-to-end delays, propagation delay, and the power constraints of the sensor nodes. Many routing algorithms have been proposed in the last two decades, and some of them provide better solutions for such issues. In this paper, we propose a novel routing protocol called Layer by layer Angle-Based Flooding (L2-ABF) to address node swaying, end-to-end delays, and node energy consumption. In L2-ABF, every node can calculate its flooding angle to forward data packets to the next upper layer toward the surface sinks without using any explicit configuration or location information. The simulation results show that L2-ABF has advantages over existing flooding-based techniques and can easily manage quick routing changes where node movements are frequent.
international conference on computer and information sciences | 2014
Hasan Farooq; Low Tang Jung
The communication network is considered a vital component and cornerstone of smart grids, and its design is envisaged to be the most challenging task in transforming the ageing power grid into a modern smart grid. Many communication technologies existing today can be utilized for enabling communication among smart grid entities; however, each of them has its own tradeoffs which need to be considered. Hierarchically, the smart grid communication network is made up of three area networks, each with its own requirements and limitations. It is therefore necessary to analyze the communication technologies from the perspective of these three area networks. This paper presents the choices available for enabling communication in the respective area networks of smart grids. Moreover, the performance of the IEEE 802.11 and IEEE 802.15.4 standards is evaluated using NS-2 simulations for the deployment of a multi-hop ad-hoc architecture of the smart grid communication network.
Quantum Information Processing | 2017
Mustapha Yusuf Abubakar; Low Tang Jung; Nordin Zakaria; Ahmed Younes; Abdel-Haleem Abdel-Aty
We have defined a new method for the automatic construction of reversible logic circuits using a genetic programming approach. The choice of the gate library is fully dynamic: the algorithm can accept any of the following gate-type combinations: {NOT, TOFFOLI}, {NOT, PERES}, {NOT, CNOT, TOFFOLI}, {NOT, CNOT, SWAP, FREDKIN}, {NOT, CNOT, TOFFOLI, SWAP, FREDKIN}, {NOT, CNOT, PERES}, {NOT, CNOT, SWAP, FREDKIN, PERES}, {NOT, CNOT, TOFFOLI, PERES} and {NOT, CNOT, TOFFOLI, SWAP, FREDKIN, PERES}. Our method produced near-optimum circuits in some cases when a particular subset of gate types was used in the library, while in other cases optimal circuits were produced, owing to the heuristic nature of the algorithm. We compared the outcomes of our method with several existing synthesis methods, and our algorithm performed relatively well in terms of output efficiency as well as execution time.
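The following minimal Python sketch shows what genetic-programming synthesis of a reversible circuit can look like, using only the {NOT, CNOT, TOFFOLI} subset of the library. The circuit representation, fitness function and mutation scheme are illustrative assumptions, not the authors' exact algorithm.

```python
import random

# Hypothetical sketch of GP-based reversible circuit synthesis. A circuit is a
# list of gates; each gate is (kind, lines) where the last line is the target.

N_BITS = 3
GATES = ("NOT", "CNOT", "TOFFOLI")

def random_gate():
    kind = random.choice(GATES)
    lines = random.sample(range(N_BITS), {"NOT": 1, "CNOT": 2, "TOFFOLI": 3}[kind])
    return (kind, lines)

def apply_gate(state, gate):
    _, lines = gate
    *controls, target = lines
    if all(state[c] for c in controls):          # NOT has no controls
        state = list(state)
        state[target] ^= 1
    return tuple(state)

def run_circuit(circuit, state):
    for gate in circuit:
        state = apply_gate(state, gate)
    return state

def fitness(circuit, truth_table):
    hits = sum(run_circuit(circuit, x) == y for x, y in truth_table.items())
    return hits / len(truth_table)

def synthesize(truth_table, pop_size=60, max_len=8, generations=300):
    pop = [[random_gate() for _ in range(random.randint(1, max_len))]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, truth_table), reverse=True)
        if fitness(pop[0], truth_table) == 1.0:
            return pop[0]
        survivors = pop[:pop_size // 2]          # keep the best half
        children = []
        for parent in survivors:                 # refill with mutated copies
            child = list(parent)
            child[random.randrange(len(child))] = random_gate()
            children.append(child)
        pop = survivors + children
    return pop[0]

# Example target: a 3-bit Toffoli truth table the GP should rediscover.
states = [tuple((i >> b) & 1 for b in range(N_BITS)) for i in range(2 ** N_BITS)]
target = {s: apply_gate(s, ("TOFFOLI", [0, 1, 2])) for s in states}
best = synthesize(target)
print(best, fitness(best, target))
```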
international conference on computer and information sciences | 2014
Munwar Ali Zardari; Low Tang Jung; Nordin Zakaria
Securing data in the cloud is still a challenging issue. Many techniques are used to secure data in the cloud; data encryption is the most widely used. However, deciding on a data security approach without understanding the security needs of the data is not a technically valid approach: before applying any security measure to data in the cloud, it is best to know which data need security and which do not. In this paper, we propose a data classification approach based on data confidentiality. The K-NN classification technique is adapted to the cloud virtual environment, and the aim of using K-NN is to classify the data according to their security needs. The data are classified into two classes, sensitive and non-sensitive (public). After classification, it is known which data need security and which do not, and the RSA algorithm is used to encrypt the sensitive data to keep them secure. We used the CloudSim simulator to evaluate the proposed approach in the cloud. The proposed approach easily determines the security needs of the data; after classification, it is easy to select appropriate security for the data according to those needs. The results show that this approach is more appropriate than storing data in the cloud without understanding its security requirements.
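The sketch below illustrates the classification step only: each file is described by a small feature vector and a plain k-NN vote labels it sensitive or non-sensitive. The features and training data are made up for illustration (the abstract does not list the actual features), and the subsequent RSA encryption of sensitive files is left as a comment.

```python
from collections import Counter
import math

# Hypothetical sketch of k-NN classification of files by confidentiality.
# Feature vectors and labels are illustrative assumptions.

def knn_classify(query, training, k=3):
    """training: list of (feature_vector, label); returns the majority label."""
    nearest = sorted(training, key=lambda item: math.dist(query, item[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Illustrative features: (contains_personal_id, contains_financial_terms,
# access_restriction_level) -> label.
training = [
    ((1, 1, 2), "sensitive"),
    ((1, 0, 2), "sensitive"),
    ((0, 1, 1), "sensitive"),
    ((0, 0, 0), "non-sensitive"),
    ((0, 0, 1), "non-sensitive"),
    ((0, 1, 0), "non-sensitive"),
]

new_file = (1, 0, 1)
label = knn_classify(new_file, training)
print(label)   # files labelled "sensitive" would go to the RSA encryption stage
```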
international conference on advanced computer science applications and technologies | 2013
Munwar Ali Zardari; Low Tang Jung; Muhamed Nording B. Zakaria
Cloud computing is one of the most exciting paradigm shifts in distributed computing and is being widely used due to its readily available services at low cost. As the number of cloud users increases, this may subsequently lead to data security and privacy threats. Data confidentiality and efficient data retrieval are the major issues blocking users from adopting cloud computing, and they are the focus of this paper. A data classification scheme and a cloud model are proposed to overcome these issues. Specifically, we propose a new cloud model, the Hybrid Multi-Cloud Data Security (HMCDS) model with data classification. The model is based on multiple clouds, different clusters and data classification. Two levels of security are considered: the model level and the data classification level.
IOP Conference Series: Earth and Environmental Science | 2013
Hasan Farooq; Low Tang Jung
Today no one can deny the need for the Smart Grid, and upgrading the outdated electric infrastructure is considered of utmost importance to cope with the ever increasing electric load demand. A Wireless Sensor Network (WSN) is considered a promising candidate for internetworking smart meters with the gateway using a mesh topology. This paper investigates the performance of the AODV routing protocol for a WSN-based smart metering deployment. Three case studies are presented to analyse its performance based on four metrics: (i) Packet Delivery Ratio, (ii) Average Energy Consumption of Nodes, (iii) Average End-to-End Delay and (iv) Normalized Routing Load.
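For reference, the four metrics can be computed along the following lines. This is a hypothetical sketch over simplified simulation records, not the actual NS-2 trace format or post-processing scripts used in the paper.

```python
# Hypothetical sketch of the four evaluation metrics from simplified records.

def packet_delivery_ratio(sent, received):
    """Received data packets divided by sent data packets."""
    return received / sent if sent else 0.0

def average_energy_consumption(initial_energy, residual_energy_per_node):
    """Mean energy consumed per node (Joules)."""
    consumed = [initial_energy - e for e in residual_energy_per_node]
    return sum(consumed) / len(consumed)

def average_end_to_end_delay(delays_seconds):
    """Mean delay of all packets that reached the sink (seconds)."""
    return sum(delays_seconds) / len(delays_seconds)

def normalized_routing_load(routing_packets, received_data_packets):
    """Routing (control) packets transmitted per data packet delivered."""
    return routing_packets / received_data_packets if received_data_packets else 0.0

# Example figures (illustrative only):
print(packet_delivery_ratio(sent=1000, received=920))                # 0.92
print(average_energy_consumption(100.0, [87.5, 90.2, 85.9, 88.4]))   # 12.0 J
print(average_end_to_end_delay([0.12, 0.18, 0.09, 0.21]))            # 0.15 s
print(normalized_routing_load(routing_packets=450, received_data_packets=920))
```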
Cluster Computing | 2016
Munwar Ali Zardari; Low Tang Jung
Personal and organizational data are growing in volume over time. Given the importance of data to organisations, effective and efficient management and categorisation of data need special focus; understanding and applying data security policies to the appropriate data types is therefore one of the core concerns in large organisations such as cloud service providers. With data classification, the security requirements of the data can be identified without manual intervention, and the encryption process is applied only to the confidential data, saving encryption time, decryption time, storage and processing power. The proposed data classification approach reduces network traffic, additional data movement and overload, and the storage place for confidential data can be chosen so that its security requirements are fulfilled. In this paper, an intelligent data classification approach is presented for predicting the confidentiality/sensitivity level of the data in a file based on corporate objectives and government policies/rules. An enhanced version of the k-NN algorithm, called Training dataset Filtration-kNN (TsF-kNN), is also proposed to reduce the computational complexity of the traditional k-NN algorithm at the data classification phase. The experimental results show that the data in a file can be classified into confidential and non-confidential classes and that TsF-kNN performs better than the traditional k-NN and Naïve Bayes algorithms.
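The abstract does not describe the filtration criterion itself, so the sketch below only illustrates the general idea of TsF-kNN: prune the training dataset with a cheap pre-filter, then run standard k-NN on the much smaller candidate set. The bucketing rule and data are assumptions made purely for illustration.

```python
import math
from collections import Counter, defaultdict

# Hypothetical sketch of a "filter first, then k-NN" scheme in the spirit of
# TsF-kNN. The coarse bucketing of one feature is an assumption, not the
# authors' actual filtration method.

def bucket(vector):
    """Cheap coarse key used to pre-filter training instances (assumption)."""
    return int(vector[0])            # e.g. bucket by the first feature

def build_index(training):
    index = defaultdict(list)
    for vector, label in training:
        index[bucket(vector)].append((vector, label))
    return index

def tsf_knn(query, index, training, k=3):
    # Filtration step: restrict candidates to the query's bucket, falling back
    # to the full training set if the bucket is too small.
    candidates = index.get(bucket(query), [])
    if len(candidates) < k:
        candidates = training
    # Standard k-NN vote on the (usually much smaller) candidate set.
    nearest = sorted(candidates, key=lambda item: math.dist(query, item[0]))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

training = [
    ((0, 0.2), "non-confidential"), ((0, 0.4), "non-confidential"),
    ((0, 0.9), "non-confidential"), ((1, 0.8), "confidential"),
    ((1, 0.6), "confidential"),     ((1, 0.3), "confidential"),
]
index = build_index(training)
print(tsf_knn((1, 0.7), index, training))   # compares against 3 instead of 6 points
```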
International journal of business | 2016
Munwar Ali Zardari; Low Tang Jung
Cloud computing is a new paradigm that offers different services to its customers. The increasing number of users of cloud services, i.e. software, platform or infrastructure, is one of the major reasons for security threats to customers' data. Several major security issues in the data storage service are highlighted in the literature: the data of thousands of users are stored in a single centralized place, where the possibility of a data threat is high. Many techniques have been discussed in the literature to keep data secure in the cloud, such as data encryption, private clouds and multiple-cloud concepts. Data encryption changes the data into an unreadable format that unauthorized users cannot understand even if they succeed in gaining access to the data; however, it is a very expensive technique, as it takes time to encrypt and decrypt the data. Deciding on a security approach without understanding the security needs of the data is not a technically valid approach; it is a basic requirement to understand the security level of the data before applying encryption. To discover the security level of the data, the authors use a machine learning approach in the cloud. In this paper, a data classification approach is proposed for the cloud and implemented in a virtual machine named the Master Virtual Machine (Vmm); the other VMs are slave virtual machines which receive the classified information from the Vmm for further processing in the cloud. In this study the authors used three virtual machines, one master Vmm and two slave VMs. The master Vmm is responsible for finding the classes of the data based on their confidentiality level. The data are classified into two classes, confidential (sensitive) and non-confidential (non-sensitive/public), using the K-NN classifier.
international conference on computer and information sciences | 2016
Low Tang Jung; Oi-Mean Foong; Pramita Winata
The demand for data centers (DC) has been increasing significantly due to the rapid growth in ICT. This brings along "green" issues in the data center such as energy consumption, heat generation and cooling requirements. These issues can be addressed by a "Green of/by IT" approach in the context of operating costs as well as environmental impacts. Installing a temperature monitoring system in every corner of a data center is certainly cost inefficient; optimizing the number of sensors deployed in the DC is thus important for reducing the monitoring cost. This project aims to create a wireless temperature monitoring system with a technique to optimize the number of temperature sensors deployed in a DC. The real-time temperature data collected by this system can also be used to predict the next state of the temperature in the DC to detect potential anomalies in heat generation, so that quick preventive responses can be invoked to manage potential hot spots. This could be a promising green-by-IT approach.
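The abstract does not state which prediction model is used for the next temperature state, so the sketch below assumes a simple exponentially weighted moving-average forecast with a hot-spot threshold, purely to illustrate how such an anomaly check could work.

```python
# Hypothetical sketch of next-state temperature prediction for hot-spot
# detection. The EWMA model and the 30 °C threshold are assumptions.

def predict_next(readings, alpha=0.5):
    """One-step forecast from past sensor readings (EWMA, an assumption)."""
    forecast = readings[0]
    for value in readings[1:]:
        forecast = alpha * value + (1 - alpha) * forecast
    return forecast

def hot_spot_alert(readings, threshold_c=30.0, alpha=0.5):
    """Raise an alert when the predicted next temperature exceeds the threshold."""
    return predict_next(readings, alpha) > threshold_c

rack_sensor = [24.1, 24.6, 25.9, 27.4, 29.2]    # degrees Celsius, illustrative
print(predict_next(rack_sensor))                 # ~27.7 °C
print(hot_spot_alert(rack_sensor))               # False, but trending upward
```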