Banani Saha
University of Calcutta
Publication
Featured research published by Banani Saha.
International Journal of Systems Science | 1991
Banani Saha; Subhansu Bandyopadhyay
Petri nets have become a useful tool for modelling various classes of systems, especially concurrent systems. A Petri net model of a real-life system may become quite large (having a large number of reachable states and events), making the net difficult to analyse. In such a situation, a large net is reduced to nets and subnets of lower dimension, in which the desirable properties are preserved, and these are subsequently analysed. The present paper proposes a computer-oriented hierarchical reduction technique for Petri nets utilizing a recently proposed reflexive incidence matrix representation. Given the incidence matrix representation of the net, the associated software developed for this purpose performs step-by-step reduction and produces a reduced net of smaller dimension suitable for analysis and related tasks.
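To make the idea of step-by-step reduction concrete, here is a minimal sketch of one elementary reduction rule (fusing a transition-place-transition chain) applied to ordinary input/output incidence matrices. It does not use the paper's reflexive incidence matrix representation; the rule, the matrices, and the example net are illustrative assumptions only.

```python
import numpy as np

def fuse_series(pre, post):
    """One elementary reduction step on a Petri net given by its input (pre) and
    output (post) incidence matrices (rows = places, columns = transitions).
    If a place p is fed by exactly one transition t1 and emptied by exactly one
    transition t2, the chain t1 -> p -> t2 is fused into a single transition
    with the combined token effect. Returns the reduced (pre, post), or the
    originals unchanged if no such place exists."""
    P, T = pre.shape
    for p in range(P):
        producers = np.nonzero(post[p])[0]
        consumers = np.nonzero(pre[p])[0]
        if len(producers) == 1 and len(consumers) == 1 and producers[0] != consumers[0]:
            t1, t2 = producers[0], consumers[0]
            new_pre = pre[:, t1] + pre[:, t2];   new_pre[p] = 0
            new_post = post[:, t1] + post[:, t2]; new_post[p] = 0
            keep_t = [t for t in range(T) if t not in (t1, t2)]
            keep_p = [q for q in range(P) if q != p]
            pre_r = np.column_stack([pre[:, keep_t], new_pre])[keep_p]
            post_r = np.column_stack([post[:, keep_t], new_post])[keep_p]
            return pre_r, post_r
    return pre, post

# Example: the chain p0 -> t0 -> p1 -> t1 collapses to a single transition.
pre  = np.array([[1, 0],
                 [0, 1]])
post = np.array([[0, 0],
                 [1, 0]])
pre_r, post_r = fuse_series(pre, post)
print(pre_r, post_r, sep="\n")
```

Repeating such rules until none applies yields the kind of hierarchically reduced net the abstract describes, with properties like reachability preserved by construction of the rules.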
International Journal of Electronics | 1988
Banani Saha; Subhansu Bandyopadhyay
Petri nets have already been used as an effective tool for modelling and analysing dynamical systems. The behaviour of a net is generally studied through structural properties such as boundedness and conservativeness. The available literature on such studies provides methods that are unsuitable for computer implementation, a capability often necessary for analysing large systems. In this paper, a relation between marking and firing vectors is defined and utilized to obtain a simple and systematic technique for analysing Petri nets. Illustrative examples are also provided.
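The standard relation between markings and firing vectors is the Petri net state equation, which the sketch below illustrates on a hypothetical three-place, three-transition net; the paper's own formulation may differ, so treat this as background rather than its method.

```python
import numpy as np

# State equation of a Petri net: M = M0 + C @ sigma, where C is the incidence
# matrix (places x transitions) and sigma counts how often each transition fired.
C = np.array([[-1,  0,  1],
              [ 1, -1,  0],
              [ 0,  1, -1]])
M0 = np.array([1, 0, 0])
sigma = np.array([1, 1, 0])            # fire t1 once, then t2 once

M = M0 + C @ sigma
print("marking after firing sequence:", M)   # -> [0 0 1]

# Conservativeness check: the net is conservative if a strictly positive weight
# vector y satisfies y^T C = 0 (the weighted token count is then invariant).
# Here y = [1, 1, 1] works, since every column of C sums to zero.
y = np.ones(3)
print("y^T C =", y @ C)                      # -> [0. 0. 0.]
```

Because everything reduces to matrix arithmetic, checks like this are straightforward to implement on a computer, which is exactly the motivation the abstract gives.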
FICTA (2) | 2015
Boudhayan Bhattacharya; Banani Saha
Data fusion is generally defined as the application of methods that combine data from multiple sources and aggregate the resulting information in order to draw conclusions. This paper analyses the signalling cost of the different data fusion filter models available in the literature together with the new community model. The signalling cost of the community model has been mathematically formulated by incorporating the normalized signalling cost of each transmission. This approach reduces the signalling burden on the master fusion filter and improves throughput. A comparison of the signalling cost of the existing data fusion models and the new community model is also presented. The results show that the community model improves on the existing models in terms of signalling cost.
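The following is a hypothetical illustration of what a normalized signalling cost comparison can look like; the message types, per-message costs, and counts are invented for the example and are not taken from the paper.

```python
def normalised_signalling_cost(messages):
    """messages: list of (count, cost_per_message) pairs observed over a window.
    Returns total signalling cost divided by the number of transmissions,
    i.e. the average cost incurred per transmission."""
    total_cost = sum(count * cost for count, cost in messages)
    total_transmissions = sum(count for count, _ in messages)
    return total_cost / total_transmissions

# Example: mostly cheap local updates vs. sending every update to a master filter.
community_model = [(100, 1.0), (10, 4.0)]
centralised     = [(100, 4.0)]
print(normalised_signalling_cost(community_model))  # ~1.27
print(normalised_signalling_cost(centralised))      # 4.0
```

A lower per-transmission figure is what the abstract means by reducing the signalling burden on the master fusion filter.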
international conference on advanced computing | 2006
Banani Saha; Suddha S. Mukhopadhyay
The major aim of this paper is the analysis of fraud activities prevalent in the telecommunication industry, particularly subscription fraud. A state variable model has been designed for this analysis. The model forms the basis of the approach and explains the necessity of profiling customer details and of classifying subscribers into flat security-deposit slabs. The model is explained logically and evaluated on a sample data set. Moreover, the predictability and interpretability of the different graphs drawn from the data are explained with an approximate formula. Finally, the question of synthetic data generation for the construction of a suitable learning rule is also addressed.
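A minimal sketch of the deposit-slabbing idea follows: a subscriber's profiled risk score maps to a flat security-deposit slab. The thresholds, amounts, and the notion of a single scalar risk score are illustrative assumptions, not values from the paper.

```python
def deposit_slab(risk_score):
    """risk_score in [0, 1], e.g. derived from usage history and payment record.
    Returns a flat security deposit amount for the corresponding slab
    (all thresholds and amounts are hypothetical)."""
    if risk_score < 0.3:
        return 0        # low risk: no deposit
    elif risk_score < 0.6:
        return 500      # medium-risk slab
    elif risk_score < 0.85:
        return 1500     # high-risk slab
    else:
        return 3000     # very high risk: maximum slab

print(deposit_slab(0.2), deposit_slab(0.7))   # 0 1500
```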
Archive | 2017
Boudhayan Bhattacharya; Banani Saha
Scientific experiments handle huge amounts of data from various sources. Processing these data involves several computing stages together with their dependency pattern. Data-intensive scientific workflows are used to model such processes. The scientific workflow paradigm integrates, structures and orchestrates heterogeneous services and software design tools, both locally and globally, into complex scientific processes that enable scientific discoveries. A Scientific Workflow Management System (SWfMS) deploys data-intensive scientific workflows by exploiting parallel execution and resources distributed over infrastructures such as grids and clouds. The community model is a data fusion methodology used to fuse data from multiple sources. The SWfMS for the community model describes the flow of data through the various parts of the model and their corresponding working principles. This paper presents a data-intensive SWfMS for the community model.
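A minimal sketch of the workflow idea, assuming tasks form a dependency DAG and every task whose dependencies are satisfied may run in parallel; the task names and dependencies are hypothetical and do not describe the community-model workflow itself.

```python
from concurrent.futures import ThreadPoolExecutor

workflow = {                       # task -> list of tasks it depends on
    "ingest_a": [], "ingest_b": [],
    "fuse":     ["ingest_a", "ingest_b"],
    "report":   ["fuse"],
}

def run(task):
    print("running", task)         # placeholder for the real computing stage

done = set()
with ThreadPoolExecutor() as pool:
    while len(done) < len(workflow):
        # All tasks whose dependencies are complete form the next parallel wave.
        ready = [t for t, deps in workflow.items()
                 if t not in done and all(d in done for d in deps)]
        list(pool.map(run, ready))
        done.update(ready)
```

A real SWfMS adds scheduling across distributed grid or cloud resources, data staging, and provenance tracking on top of this basic dependency-driven execution.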
Archive | 2017
Supriyo Banerjee; Biswajit Maiti; Banani Saha
Quantum Key Distribution (QKD) is used to generate a key between two legitimate users, Alice and Bob. Over the past decade, QKD using qubits has been investigated for different cryptographic strategies. This paper theoretically investigates qutrit photon states in QKD and discusses different eavesdropping strategies. A modified two-stage Positive Operator-Valued Measure (POVM) is used to determine the qutrit states. An expression for the error rate in data transmission has been formulated for both entangled and opaque eavesdropping. A protocol for encrypting data in qutrit states with a one-time-pad scheme is introduced to improve information reconciliation.
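The classical counterpart of the qutrit one-time-pad step is a trit-wise pad over base-3 symbols, sketched below. The key here is drawn with Python's random module purely for illustration; in the protocol it would come from the quantum-distributed qutrit key.

```python
import random

def otp_encrypt(message_trits, key_trits):
    # ciphertext trit = (message trit + key trit) mod 3
    return [(m + k) % 3 for m, k in zip(message_trits, key_trits)]

def otp_decrypt(cipher_trits, key_trits):
    # recover the message by subtracting the key mod 3
    return [(c - k) % 3 for c, k in zip(cipher_trits, key_trits)]

message = [0, 2, 1, 1, 0, 2]
key     = [random.randrange(3) for _ in message]   # illustration only, not a QKD key
cipher  = otp_encrypt(message, key)
assert otp_decrypt(cipher, key) == message
```

With a truly random, single-use key of the same length as the message, this mod-3 pad has the same perfect-secrecy property as the binary one-time pad.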
Archive | 2016
Sarbani Dasgupta; Banani Saha
Association rule mining can be applied in the field of bioinformatics to identify co-occurrences between various biological elements such as genes and proteins. In bioinformatics, the protein–protein interaction network provides useful information regarding the functions of proteins. Association analysis has been used to identify frequently occurring interactions among the proteins in the network in order to predict protein functions. As the amount of data increases exponentially, a parallel implementation of association analysis for identifying co-occurrences between proteins in the protein–protein interaction network is more efficient, fast, and scalable. In this paper we propose an efficient framework for association analysis of frequently occurring patterns in the protein–protein interaction network. The algorithm has been parallelized using the Hadoop framework. The performance of the parallel algorithm is depicted graphically, showing that the parallel version is more effective than the sequential one.
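As a sketch of the frequent-pattern step, the example below counts how often protein pairs co-occur across interaction records and keeps pairs above a minimum support, split into map and reduce phases in the style Hadoop would distribute across nodes. The protein names, records, and threshold are made up for illustration.

```python
from collections import Counter
from itertools import combinations
from multiprocessing import Pool

records = [{"P1", "P2", "P3"}, {"P1", "P2"}, {"P2", "P3"}, {"P1", "P2", "P4"}]
MIN_SUPPORT = 2

def map_pairs(record):
    # "map" phase: emit every candidate protein pair in one record
    return Counter(frozenset(p) for p in combinations(sorted(record), 2))

if __name__ == "__main__":
    with Pool() as pool:
        partials = pool.map(map_pairs, records)   # records processed in parallel
    totals = sum(partials, Counter())             # "reduce" phase: merge counts
    frequent = {pair: n for pair, n in totals.items() if n >= MIN_SUPPORT}
    print(frequent)   # {P1,P2}: 3 and {P2,P3}: 2 survive the support threshold
```

The same counting pattern extends from pairs to larger itemsets, which is what an Apriori-style association analysis iterates over.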
international conference on advanced computing | 2011
Anshuman Biswas; Banani Saha; Saswati Guha
A mobile ad hoc network is a collection of two or more devices equipped with wireless communication and multi-hop networking capability. We compare the performance of two prominent on-demand routing protocols for mobile ad hoc networks: Dynamic Source Routing (DSR) and Ad Hoc On-Demand Distance Vector Routing (AODV). We show that, although DSR and AODV share an inherent on-demand behaviour, the nuances in their protocol mechanics lead to performance differences, which are analysed by varying the network size, network load and mobility. The simulation is carried out in NS2, where we employ the new trace format [1] as the basis for our comparisons. We present a novel approach to analysing the protocols by varying parameters and comparing them along multiple dimensions to facilitate accurate observations. Additionally, we dissect AODV, which performs admirably in all but low-mobility situations, and make recommendations to enhance its performance.
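A small sketch of the kind of post-processing such comparisons rely on: computing packet delivery ratio and mean end-to-end delay from send/receive events. The event tuples are schematic; the actual NS2 new trace format carries far more fields than shown here.

```python
def delivery_metrics(sent, received):
    """sent, received: dicts mapping packet id -> timestamp in seconds.
    Returns (packet delivery ratio, mean end-to-end delay of delivered packets)."""
    delivered = [pid for pid in sent if pid in received]
    pdr = len(delivered) / len(sent)
    delay = sum(received[p] - sent[p] for p in delivered) / len(delivered)
    return pdr, delay

# Schematic events for one run; a real script would parse these from the trace file.
sent     = {1: 0.10, 2: 0.20, 3: 0.30, 4: 0.40}
received = {1: 0.25, 2: 0.42, 4: 0.61}
pdr, delay = delivery_metrics(sent, received)
print(f"PDR = {pdr:.2f}, mean delay = {delay:.3f} s")   # PDR = 0.75, mean delay = 0.193 s
```

Running the same metrics over DSR and AODV traces while sweeping network size, load, and mobility gives the multi-dimensional comparison described above.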
International Journal of Electronics | 1989
Banani Saha
A software implementation of a logic simulator capable of testing combinational circuits is presented. Exhaustive testing methodologies such as syndrome and index-vector testing are applied to simplify the testing procedure. However, a-tests and b-tests have to be generated and applied in the case of index-vector-untestable circuits. The method is capable of handling both single and multiple faults. Sample runs are also included.
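For context, the syndrome of an n-input combinational function is the fraction of the 2^n input vectors for which the output is 1; a fault is syndrome-testable when it changes this fraction. The exhaustive computation below uses a small made-up circuit, not one from the paper.

```python
from itertools import product

def syndrome(fn, n_inputs):
    # Count the input vectors that drive the output to 1 and normalise by 2^n.
    ones = sum(fn(*bits) for bits in product((0, 1), repeat=n_inputs))
    return ones / 2 ** n_inputs

good   = lambda a, b, c: (a & b) | c          # fault-free circuit
faulty = lambda a, b, c: c                    # same circuit with the AND output stuck at 0

print(syndrome(good, 3), syndrome(faulty, 3))  # 0.625 vs 0.5 -> the fault shifts the syndrome
```

Faults that leave the syndrome unchanged are the index-vector-untestable cases for which additional a-tests and b-tests have to be generated.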
International Journal of Systems Science | 1988
Banani Saha; Subhansu Bandyopadhyay
Petri nets have become a useful tool for modelling various classes of systems, especially those involving parallel computation and concurrent processes. The present paper deals with a method for identifying a Petri net model from a given set of input-output observations. The method is essentially a modification of the system identification techniques of linear system theory. It is shown that, by treating the net as a discrete-time linear dynamical system, it is possible to arrive at a reduced-order Petri net model of a dynamical process. This procedure enables one to obtain a Petri net model for subsequent analysis systematically, thus bypassing the usual trial-and-error techniques.
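A minimal sketch of the linear-system view: if successive markings obey M_{k+1} = M_k + C u_k, with u_k the firing vector at step k, then the incidence matrix C can be estimated from observed marking changes by least squares. The data below are synthetic and the paper's identification procedure is more involved; this only illustrates the underlying idea.

```python
import numpy as np

C_true = np.array([[-1, 0], [1, -1], [0, 1]])          # hidden incidence matrix
rng = np.random.default_rng(0)
U = rng.integers(0, 3, size=(2, 8))                    # observed firing vectors u_k
dM = C_true @ U                                        # observed marking changes M_{k+1} - M_k

# Solve U^T C^T = dM^T in the least-squares sense to recover C.
C_est, *_ = np.linalg.lstsq(U.T, dM.T, rcond=None)
print(np.round(C_est.T))                               # matches C_true
```

With noise-free observations and enough linearly independent firing vectors, the estimate is exact; noisy or partial observations are where the modified identification techniques of the paper come in.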