Adesina Simon Sodiya
Federal University of Agriculture, Abeokuta
Publication
Featured research published by Adesina Simon Sodiya.
Information Management & Computer Security | 2004
Adesina Simon Sodiya; H. O. D. Longe; Adio T. Akinwale
Researchers have applied many techniques to the design of intrusion detection systems (IDS), yet an effective IDS remains elusive. The interest in this work is to combine data mining and expert system techniques in designing an effective anomaly‐based IDS, since combining methods may give better coverage and make detection more effective. The idea is to mine system audit data for consistent and useful patterns of user behaviour and to keep these normal behaviours in profiles. An expert system is then used as the detection component that recognizes anomalies and raises an alarm. An evaluation of the intrusion detection system design was carried out to justify the importance of the work.
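A minimal sketch of the two-stage idea described in this abstract, assuming a toy profile of frequent (user, action) patterns and a single out-of-profile rule; the mining method, rule base, and thresholds are illustrative, not the paper's implementation:

```python
# Illustrative sketch only: profile mining + expert-system-style detection.
from collections import Counter

def mine_profile(audit_events, min_support=0.05):
    """Mine frequent (user, action) patterns from audit data into a profile."""
    counts = Counter(audit_events)
    total = len(audit_events)
    # Keep only patterns frequent enough to count as 'normal' behaviour.
    return {pattern for pattern, n in counts.items() if n / total >= min_support}

def expert_system_check(event, profile):
    """Detection component: raise an alarm on out-of-profile events."""
    if event not in profile:
        print(f"ALARM: anomalous event {event!r}")
        return True
    return False

# Usage: build a profile from historical audit data, then screen new events.
history = [("alice", "login"), ("alice", "read_mail")] * 50
profile = mine_profile(history)
expert_system_check(("alice", "read_mail"), profile)   # normal, no alarm
expert_system_check(("alice", "drop_table"), profile)  # anomalous, alarm
```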
Information Management & Computer Security | 2005
Adesina Simon Sodiya; H.O.D. Longe
Purpose – A work that combined strategies in designing an anomaly‐based intrusion detection system (IDS), CSIDS, was described previously. This new work seeks to improve on CSIDS.
Design/methodology/approach – The shortcomings of CSIDS were first identified and critically analysed. An improved approach for combining data mining and expert systems is then presented and implemented.
Findings – The evaluation of the new design produced better results in terms of detection efficiency and false alarm rate.
Research limitations/implications – It might be necessary to test the design with data from diverse environments. However, it was effectively shown that an IDS that combines strategies has been designed.
Practical implications – This work discusses the technical issues of IDS and will motivate IDS researchers. It also shows how strategies can be combined for effective intrusion detection.
Originality/value – This paper arose from existing problems in IDS and presents practical information...
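The findings are stated in terms of detection efficiency and false alarm rate; the sketch below uses the standard definitions of these two metrics (the paper's exact formulation may differ):

```python
# Standard IDS evaluation metrics; illustrative, not the paper's code.
def ids_metrics(alarms, labels):
    """alarms/labels: parallel booleans (alarm raised?, truly intrusive?)."""
    tp = sum(a and l for a, l in zip(alarms, labels))
    fp = sum(a and not l for a, l in zip(alarms, labels))
    fn = sum(not a and l for a, l in zip(alarms, labels))
    tn = sum(not a and not l for a, l in zip(alarms, labels))
    detection_rate = tp / (tp + fn) if tp + fn else 0.0    # intrusions caught
    false_alarm_rate = fp / (fp + tn) if fp + tn else 0.0  # normals flagged
    return detection_rate, false_alarm_rate

print(ids_metrics([True, True, False, False], [True, False, True, False]))
# -> (0.5, 0.5)
```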
Information Security Journal: A Global Perspective | 2015
Adebukola Onashoga; Olusola O. Abayomi-Alli; Adesina Simon Sodiya; David A. Ojo
SMS Spam, an unsolicited or unwanted message, is a major problem for Global System for Mobile Communication (GSM) subscribers. Existing Spam filters have not been able to stop the SMS Spam problem due to frequent drift in spammers' words, a limited bag of words for training, device portability, and the high computational overhead of filters. This paper presents a collaborative and adaptive server-side SMS Spam filter using an Artificial Immune System (coined ExAIS_SMS). The proposed scheme involves five modules: the innate mechanism, the user feedback, the quarantine, the tokenizer, and the immune engine. For this study, a new English corpus of 5,240 SMS messages from 20 different users was collected. A comprehensive experimental analysis of the SMS data set reveals the constant change of Spam keywords and the impact of user feedback on system adaptability. To prove the efficiency of the proposed scheme, ExAIS_SMS was benchmarked against existing systems using the NUS corpus. The result gave an overall accuracy of 99% for ExAIS_SMS, 98% for Bayesian, and 97% for a client-side AIS. The results show that ExAIS_SMS is an efficient SMS Spam filtering technique, especially on resource-constrained mobile phones.
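A hedged sketch of how two of the five named modules (the tokenizer and the immune engine) plus user feedback could fit together; the detector representation, affinity measure, and threshold here are assumptions for illustration, not ExAIS_SMS internals:

```python
# Illustrative AIS-style SMS filter: tokenizer + immune engine + feedback.
import re

def tokenize(sms):
    return set(re.findall(r"[a-z']+", sms.lower()))

class ImmuneEngine:
    def __init__(self, detectors, threshold=0.3):
        self.detectors = set(detectors)  # tokens learned from known Spam
        self.threshold = threshold

    def affinity(self, tokens):
        """Fraction of message tokens matched by Spam detectors."""
        return len(tokens & self.detectors) / max(len(tokens), 1)

    def is_spam(self, sms):
        return self.affinity(tokenize(sms)) >= self.threshold

    def feedback(self, sms, spam):
        """User feedback: reinforce or prune detectors so the filter adapts."""
        tokens = tokenize(sms)
        if spam:
            self.detectors |= tokens
        else:
            self.detectors -= tokens

engine = ImmuneEngine({"win", "free", "prize", "claim"})
print(engine.is_spam("Claim your FREE prize now"))  # True (affinity 0.6)
engine.feedback("free lunch at noon", spam=False)   # 'free' is pruned
```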
international conference on digital information processing and communications | 2016
S. Agholor; Adesina Simon Sodiya; A. T. Akinwale; O. J. Adeniran
Recently, researchers have developed Password Managers to address the problem of memorability in authentication systems. However, existing techniques for storing credential information in these Password Managers are still not sufficiently efficient and secure. In this work, a Mobile-Based Password Manager was developed that creates faux passwords using a Transformed-Based Algorithm and the Modified Levenshtein Distance. A Decentralized File Format architecture was used to distribute credential information across different files for storage. The experimental evaluation showed a system that improves security in Password Managers.
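A minimal sketch of the decoy idea, assuming faux passwords are accepted only when their Levenshtein distance from the real password meets a threshold; the random mutation scheme below stands in for the paper's Transformed-Based Algorithm, which is not reproduced here:

```python
# Illustrative decoy-password generation guarded by edit distance.
import random
import string

def levenshtein(a, b):
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def faux_passwords(real, count=5, min_distance=3):
    """Randomly mutate the real password; keep only sufficiently distant decoys."""
    out = set()
    while len(out) < count:
        chars = list(real)
        for _ in range(min_distance + 1):
            chars[random.randrange(len(chars))] = random.choice(string.ascii_letters)
        candidate = "".join(chars)
        if levenshtein(candidate, real) >= min_distance:
            out.add(candidate)
    return out

print(faux_passwords("S3cret!pass"))
```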
International Journal of Information Security and Privacy | 2016
Adesina Simon Sodiya; B Adegbuyi
Data and document privacy concerns are increasingly important in the online world. In cloud computing the story is the same, as the secure processing of personal data represents a huge challenge. The main focus is to preserve and protect the personally identifiable information (PII) of individuals, customers, businesses, governments and organisations. Current anonymization techniques are not very efficient because they fail to use the structure of the datasets under consideration and cannot apply a metric that balances the usefulness of information against privacy preservation. In this work, an adaptive lossy decomposition algorithm was developed for preserving privacy in cloud computing. The algorithm uses foreign key associations to determine the generalizations possible for any attribute in the database. It generates penalties for each obscured attribute when sharing and proposes an optimal decomposition of the relation. The postgraduate database of the Federal University of Agriculture, Abeokuta, Nigeria, and the Adult database from the UC Irvine Machine Learning Repository were used for the evaluation. The results show a system that could be used to improve privacy in cloud computing.
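A loose sketch of penalty-scored attribute generalization, assuming a hand-written generalization map and integer penalties; the paper's adaptive lossy decomposition over foreign key associations is not reproduced here:

```python
# Illustrative only: obscure attributes and accumulate a privacy penalty.
GENERALIZATIONS = {  # attribute -> (generalizer, penalty when obscured)
    "age":  (lambda v: f"{(v // 10) * 10}-{(v // 10) * 10 + 9}", 1),
    "city": (lambda v: "Nigeria", 3),  # coarse step: city -> country
}

def generalize(record, shared_attrs):
    """Obscure the shared attributes and total the penalty for doing so."""
    out, penalty = dict(record), 0
    for attr in shared_attrs:
        fn, cost = GENERALIZATIONS[attr]
        out[attr] = fn(record[attr])
        penalty += cost
    return out, penalty

row = {"dept": "CS", "age": 34, "city": "Abeokuta"}
print(generalize(row, ["age", "city"]))
# -> ({'dept': 'CS', 'age': '30-39', 'city': 'Nigeria'}, 4)
```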
Issues in Informing Science and Information Technology | 2011
Adio T. Akinwale; Olusegun Folorunso; Adesina Simon Sodiya
Introduction

Designing a database is an art, much like building a house. Database designers always face the problem of designing a relational database that is free of database anomalies. These anomalies cause repetition of tuples, which delays processing and occupies memory space. Suppose that the value of the attribute BUILDER determines the values of the attributes MODEL and PRICE (BUILDER → MODEL, PRICE), and that the value of the attribute MODEL determines the value of PRICE (MODEL → PRICE). Grouping these attributes in the relation HOUSE(BUILDER, MODEL, PRICE) has several undesirable properties. First, the relationship between MODEL and PRICE is repeated in the relation for each BUILDER who builds a particular MODEL of home. This repetition creates difficulties: if a BUILDER who happens to be the last BUILDER of a certain MODEL of home is deleted from the relation, the relationship between that MODEL and its PRICE also disappears from the relation. This is called a deletion anomaly. Similarly, if a new BUILDER who happens to be the first BUILDER of a certain MODEL of home is added, the relationship between the MODEL of a home and its PRICE must also be added. This is called an insertion anomaly. Suppose that the relationship between a MODEL and its PRICE is changed, e.g. the price is increased; then the MODEL and PRICE relationship must be updated for every BUILDER of that MODEL. This is called an update anomaly. These anomalies are undesirable because the user is unlikely to realize the consequences of an insertion, deletion or update, and may inadvertently affect a relationship that was not intended to be modified. These consistency, insertion, deletion and update problems do not affect all groupings of attributes. If the relation HOUSE(BUILDER, MODEL, PRICE) is normalized, the consistency and anomaly problems disappear.

Normalization is a step-by-step reversible process of replacing a given collection of relations by successive collections in which the relations have a progressively simpler and more regular structure (Date & Darwen, 2000). The reversibility guarantees that the original collection of relations can be recovered, and therefore that no information has been lost. Codd proposed three normal forms, which he called first normal form (1NF), second normal form (2NF) and third normal form (3NF). A stronger definition of 3NF was proposed by Boyce and Codd and is known as Boyce-Codd Normal Form (BCNF). All these normal forms except 1NF are based on the functional dependencies among the attributes of a relation (Elmasri & Navathe, 1994). First normal form relates to the structure of the relation: it requires that every attribute of a relation be based on a simple domain. Database designers have no difficulty recognizing when a relation violates first normal form, and they can put the relation into first normal form algorithmically by replacing a non-simple domain with its constituent simple domains. For the second (2NF), third (3NF) and Boyce-Codd (BCNF) normal forms, however, database designers need to know the real meaning and application of database keys such as the candidate key, primary key and super key.

Problem Statement

Database designers always find it difficult to determine these keys from relational database schemas. It has been difficult to motivate students and database designers to derive primary, candidate, alternative and super keys because they find this area dry and theoretical.
There are many algorithms for determining database keys, but they appear abstract to students. Many database researchers have indicated that deriving database keys from the relational model tends to be too complex for average designers. Failure to determine the database keys sometimes leads to poor designs that generate database anomalies. The database key algorithms often require an extensive background in relational algebra that database designers lack. …
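One such algorithm is the standard attribute-closure computation, shown here applied to the HOUSE example above; this is textbook material, not the authors' specific method:

```python
# Attribute closure under a set of functional dependencies (FDs).
def closure(attrs, fds):
    """Grow the attribute set until no FD adds anything new."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

# HOUSE(BUILDER, MODEL, PRICE) with BUILDER -> MODEL, PRICE and MODEL -> PRICE
fds = [({"BUILDER"}, {"MODEL", "PRICE"}), ({"MODEL"}, {"PRICE"})]
print(closure({"BUILDER"}, fds))  # all attributes: BUILDER is a candidate key
print(closure({"MODEL"}, fds))    # {'MODEL', 'PRICE'}: a transitive dependency
```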
computer science and electronic engineering conference | 2010
Aderonke Justina Ikuomola; Adesina Simon Sodiya; Joshua Ojo Nehinbe
In recent years, security issues on computer networks have become a primary concern because the pervasiveness of computer technology has rendered computer networks more vulnerable to attacks than ever before. Corrective actions must be taken when surreptitious attacks are detected to ensure the safety of the entire system. However, effective post-intrusion actions depend on the integrated efforts of experts and systems and a thorough understanding of intrusion detection processes. This work therefore aims to strike a balance between the damage caused by an intrusion and the cost of the response. A cost-sensitive modeling approach is used to define the cost factors involved. A prototype of the design is briefly discussed.
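A minimal sketch of the damage-versus-response trade-off described above, assuming each response has a fixed cost and a residual damage fraction; the cost values and model are illustrative assumptions, not the paper's cost-sensitive model:

```python
# Illustrative cost-sensitive response selection.
def choose_response(damage_cost, responses):
    """Pick the response minimizing response cost plus residual damage.

    responses: list of (name, response_cost, residual_damage_fraction).
    Returns (None, damage_cost) when tolerating the intrusion is cheapest.
    """
    best, best_total = None, damage_cost  # doing nothing costs the full damage
    for name, cost, residual in responses:
        total = cost + residual * damage_cost
        if total < best_total:
            best, best_total = name, total
    return best, best_total

responses = [("kill_session", 10, 0.2), ("block_ip", 20, 0.05), ("reimage_host", 200, 0.0)]
print(choose_response(damage_cost=100, responses=responses))
# -> ('block_ip', 25.0): cheapest total of response cost plus residual damage
```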
computer and information technology | 2006
Adesina Simon Sodiya
Asian Journal of Information Technology | 2011
Adewale Opeoluwa Ogunde; Olusegun Folorunso; Adesina Simon Sodiya; G.O. Ogunleye
International Journal of Network Security | 2011
Adesina Simon Sodiya; Olusegun Folorunso; Saidat Adebukola Onashoga; Omoniyi Paul Ogunderu