Department of Computer Science
Permanent URI for this community: https://repository.nileuniversity.edu.ng/handle/123456789/44
Item: A Dynamic and Incremental Graphical Grid Authentication Technique for Mobile and Web Applications (2024-08-08). Gong Jiaming; Akande Oluwatobi Noah; Chia-Chen Lin; Agarwal Saurabh.
Knowledge-based authentication techniques remain one of the proven ways of maintaining confidentiality, ensuring integrity, and guaranteeing the availability of an information system. They employ what a user knows (passwords or PINs) to authorize or grant access to an information system. While passwords employ a fixed combination of characters, Personal Identification Numbers (PINs) are mainly numeric. Existing implementations of these authentication techniques involve the repetitive use of static passwords and PINs at every login instance, which exposes them to attacks such as keyloggers, shoulder surfing, brute force, and dictionary attacks. To overcome these attacks, this study presents an authentication technique in which users' PINs are incremented during successive login attempts. Users choose a preferred incremental factor, any number they can remember, which is added to the default 6-digit PIN to produce a dynamic PIN for subsequent login sessions. Furthermore, an additional layer of security, a dynamic 4 by 4 graphical grid, was integrated into the proposed incremental PIN technique: at every login session, users are presented with a set of 16 possible PINs to choose from. The security analysis revealed that the proposed technique can resist existing password attacks, thereby enhancing security. Performance testing and a usability analysis were also carried out among 1145 individuals who interacted with the web application that uses the incremental authentication technique. The questionnaire items were structured around the constructs of the Unified Theory of Acceptance and Use of Technology (UTAUT) model. Statistical analysis of the responses showed an appreciable level of acceptance in terms of performance expectancy, effort expectancy, social influence, and facilitating conditions. The positive user acceptance results provide reassurance about the practicality and effectiveness of the proposed technique. It is believed that the proposed incremental graphical grid authentication technique will further enhance the security of our growing mobile and web applications.
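A minimal Python sketch of the dynamic PIN and 4 by 4 grid idea described above, assuming a simple additive update rule (the next PIN is the previous PIN plus the user-chosen incremental factor, wrapped to six digits) and random placement of the expected PIN among 15 decoys; the paper's exact update and grid-population rules may differ.

```python
import random

PIN_DIGITS = 6
MODULUS = 10 ** PIN_DIGITS

def next_pin(previous_pin: int, incremental_factor: int) -> int:
    """Derive the PIN expected at the next login by adding the
    user-chosen incremental factor (wrapping to keep 6 digits)."""
    return (previous_pin + incremental_factor) % MODULUS

def build_grid(expected_pin: int, rng: random.Random) -> list[list[str]]:
    """Build a 4x4 grid of 16 candidate PINs, one of which is the
    expected dynamic PIN; the other 15 are random decoys."""
    decoys = set()
    while len(decoys) < 15:
        candidate = rng.randrange(MODULUS)
        if candidate != expected_pin:
            decoys.add(candidate)
    cells = list(decoys) + [expected_pin]
    rng.shuffle(cells)
    return [[f"{cells[r * 4 + c]:06d}" for c in range(4)] for r in range(4)]

# Example: base PIN 123456, incremental factor 7, second successive login.
rng = random.Random()
expected = next_pin(next_pin(123456, 7), 7)
for row in build_grid(expected, rng):
    print(" ".join(row))
```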
Item: A Dynamic Round Triple Data Encryption Standard Cryptographic Technique for Data Security (Springer Nature Switzerland AG, 2020-08-08). Akande Oluwatobi Noah; Abikoye Oluwakemi Christiana; Kayode Aderonke Anthonia; Aro Oladele Taye; Ogundokun Oluwaseun Roseline.
Cryptographic techniques have been widely employed to protect sensitive data from unauthorized access and manipulation. Among these, the Data Encryption Standard (DES) has been widely used; however, it suffers from key and differential attacks. To overcome these attacks, several DES modifications have been proposed in the literature. Most modifications have focused on enhancing the DES encryption key; however, the strength of a cryptographic technique is determined both by the encryption key used and by the number of encryption rounds. It is known that the Advanced Encryption Standard (AES) with 14 encryption rounds is stronger than AES with 12 rounds, which in turn is stronger than AES with 10 rounds. Therefore, this study proposes a DES cryptographic technique whose number of rounds is dynamic: users specify the number of encryption and decryption rounds at run time. Moreover, a fixed shifting operation, a left circular shift of 2, was chosen for each encryption round. As a trade-off in complexity, the number of substitution boxes (S-boxes) was reduced to four, so that the input to the S-boxes is arranged in four 12-bit blocks for the XOR operation rather than eight 6-bit blocks as in the traditional DES. Finally, three keys were used in an encrypt-decrypt-encrypt sequence, as in Triple DES. The modified DES yielded a better avalanche effect for round counts greater than 16, though its encryption and decryption times were longer than those of the traditional DES.

Item: A Few-shot custom CNN Model for Retinal Nerve Fibre Layer Thickness Measurement in OCT Images of Epilepsy (Proc. of International Conference on Artificial Intelligence, Computer, Data Sciences and Applications, 2024-02-01). Ruqayya Muhammad; Moussa Mahamat Boukar; Steve Adeshina; Senol Dane.
This study assesses the effectiveness of deep learning models for measuring retinal nerve fiber layer (RNFL) thickness in optical coherence tomography (OCT) scans of epilepsy patients. Conventional OCT scan segmentation methods typically rely on supervised learning, demanding substantial training data and assuming fixed network weights after training. To mitigate these challenges, we explore the applicability of few-shot learning (FSL) in CNN architectures, allowing dynamic fine-tuning of network weights with minimal additional data. Experimental results demonstrate enhanced segmentation accuracy, with the proposed few-shot custom CNN achieving a notable 91% accuracy, surpassing both the custom CNN (86%) and the measurements produced by the OCT machine. This suggests the superiority of the few-shot custom CNN model in segmentation performance.

Item: A framework for Poultry weather control with IoT in sub-Saharan Africa (15th International Conference on Electronics Computer and Computation (ICECCO 2019), 2019-02-02). Nasiru Afeez; Steve Adeshina; Abdullahi Inci; Moussa Mahamat Boukar.
Poultry farming in the sub-Saharan region of Africa faces many challenges, among which are high temperature and humidity. In this paper, the authors propose an Internet of Things (IoT) framework that helps regulate climatic conditions so as to provide a high yield of poultry products. The framework aims to provide proactive and preventive ways to avert or reduce the high mortality rate in a flock of birds caused by heat stress. The IoT, a connected environment of high-precision monitoring sensors, is presented as a means of managing the environmental conditions of a poultry house: it gathers information, analyzes it, and effects an action based on predetermined weather conditions that are suitable for the birds' existence.

Item: A Machine Learning Led Investigation Predicting the Thermos-mechanical Properties of Novel Waste-based Composite in Construction (Waste and Biomass Valorization, 2024-05-04). Assia Aboubakar Mahamat; Moussa Mahamat Boukar; Ifeyinwa Ijeoma Obianyo; Nurudeen M. Ibrahim.
The study explores the potential of machine learning (ML) in predicting the thermal and mechanical properties of earth-based composites reinforced with natural Borassus fruit fiber. The limited availability of large datasets for accurate predictions is a challenge in materials science research, which this study addresses. The authors collected data on thermal conductivity and on compressive and flexural strength through experiments and employed four ML techniques suitable for small datasets: linear regression (LR), random forest (RF), decision tree regressor (DTR), and gradient boosting (GB). Evaluation metrics were used to assess the performance of the ML techniques. Linear regression emerged as the most efficient, exhibiting significantly lower error values than the others (e.g., RMSE of 0.066 for thermal conductivity, 0.119 for compressive strength, and 0.04 for flexural strength), followed by random forest and decision tree; gradient boosting showed relatively poor predictive accuracy. This study demonstrates the successful application of ML for predicting the properties of earth-based composites with limited data, which could significantly reduce the cost and time associated with developing new building materials and products. Manufacturers can gain a competitive edge by using ML to streamline material development, leading to lower costs, faster innovation, and more environmentally friendly building materials for a greener construction sector.
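The modelling workflow in the composite-property study above can be illustrated with a short scikit-learn sketch; the synthetic feature matrix, the target, and the train/test split are placeholders rather than the authors' experimental data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Placeholder data: rows are composite samples, columns are mix parameters
# (e.g. fibre content, density); y stands for one target property such as
# thermal conductivity. Replace with the real experimental measurements.
rng = np.random.default_rng(0)
X = rng.uniform(size=(60, 4))
y = 0.3 * X[:, 0] + 0.1 * X[:, 1] + rng.normal(scale=0.05, size=60)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "LR": LinearRegression(),
    "RF": RandomForestRegressor(random_state=0),
    "DTR": DecisionTreeRegressor(random_state=0),
    "GB": GradientBoostingRegressor(random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    rmse = mean_squared_error(y_test, model.predict(X_test)) ** 0.5
    print(f"{name}: RMSE = {rmse:.3f}")
```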
Item: A Multi-Indexes Based Technique for Resolving Collision in a Hash Table (IJCSNS International Journal of Computer Science and Network Security, 2021-09-20). Saleh Abdullahi; Moussa Mahamat Boukar.
The rapid development of applications in networking, business, medicine, education, and other domains that rely on basic data access operations such as insert, edit, delete, and search makes data structures crucial to providing efficient day-to-day operations for those numerous applications. One of the major problems of those applications is achieving constant time when searching for a key in a collection. A number of methods that attempt to achieve this have been proposed by researchers over the years, with different performance behaviors. This work evaluated those methods and found that almost all of them take non-constant time for adding and searching a key. In this work, we designed a multi-indexes hashing algorithm that handles collisions in a hash table T efficiently and achieves constant time O(1) for searching and adding a key. Our method employs two levels of hashing based on pattern extraction, h1(key) and h2(key); the second hash function h2(key) handles collisions in T. We also eliminated the wasted slots in the search space T, another problem associated with the existing methods.
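A minimal sketch of a two-level hash table in the spirit of the multi-indexes technique above; the hash functions h1 and h2 shown here are illustrative stand-ins, since the paper's pattern-extraction functions are not reproduced.

```python
class MultiIndexHashTable:
    """Two-level hash table: h1 selects a primary slot and h2 selects a
    secondary index inside that slot, so colliding keys do not probe
    through the rest of the table; add and search stay O(1)."""

    def __init__(self, primary_size: int = 97, secondary_size: int = 11):
        self.primary_size = primary_size
        self.secondary_size = secondary_size
        # Each primary slot holds a fixed array of secondary cells.
        self.table = [[None] * secondary_size for _ in range(primary_size)]

    def _h1(self, key: str) -> int:
        return hash(key) % self.primary_size

    def _h2(self, key: str) -> int:
        # Illustrative second-level hash; the paper's h2 differs.
        return (sum(ord(c) for c in key) * 31) % self.secondary_size

    def add(self, key: str, value) -> None:
        i, j = self._h1(key), self._h2(key)
        if self.table[i][j] is not None and self.table[i][j][0] != key:
            raise RuntimeError("secondary collision: resize or rehash needed")
        self.table[i][j] = (key, value)

    def search(self, key: str):
        i, j = self._h1(key), self._h2(key)
        cell = self.table[i][j]
        return cell[1] if cell is not None and cell[0] == key else None

t = MultiIndexHashTable()
t.add("student-001", {"name": "Ada"})
print(t.search("student-001"))
```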
Item: A novel technique to prevent SQL injection and cross-site scripting attacks using Knuth-Morris-Pratt string match algorithm (Springer Open, 2020-08-08). Abikoye Oluwakemi Christiana; Abubakar Abdullahi; Dokoro Ahmed Haruna; Akande Oluwatobi Noah; Kayode Aderonke Anthonia.
Structured Query Language (SQL) injection and cross-site scripting remain major threats to data-driven web applications. Instances where hackers obtain unrestricted access to the back-end databases of web applications in order to steal, edit, and destroy confidential data are increasing. Therefore, measures must be put in place to curtail the growing threats of SQL injection and XSS attacks. This study presents a technique for detecting and preventing these threats using the Knuth-Morris-Pratt (KMP) string matching algorithm. The algorithm matches a user's input string against stored patterns of injection strings in order to detect malicious code. The implementation was carried out using the PHP scripting language and an Apache XAMPP server. The security level of the technique was measured using different test cases of SQL injection, cross-site scripting (XSS), and encoded injection attacks. Results obtained revealed that the proposed technique successfully detected and prevented the attacks, logged the attack entry in the database, blocked the attacking system using its MAC address, and generated a warning message. The proposed technique therefore proved effective in detecting and preventing SQL injection and XSS attacks.
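A compact sketch of Knuth-Morris-Pratt matching applied to input screening, as described above; the signature list is a small illustrative sample rather than the authors' pattern store, and a real deployment would also normalize encodings before matching.

```python
def kmp_failure(pattern: str) -> list[int]:
    """Longest-proper-prefix-that-is-also-suffix table for KMP."""
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    return fail

def kmp_contains(text: str, pattern: str) -> bool:
    """Return True if pattern occurs in text, scanning text once."""
    if not pattern:
        return True
    fail = kmp_failure(pattern)
    k = 0
    for ch in text:
        while k > 0 and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
            if k == len(pattern):
                return True
    return False

# Illustrative signatures only; a real filter would hold many more.
SIGNATURES = ["' or 1=1", "union select", "<script", "onerror="]

def is_suspicious(user_input: str) -> bool:
    lowered = user_input.lower()
    return any(kmp_contains(lowered, sig) for sig in SIGNATURES)

print(is_suspicious("admin' OR 1=1 --"))   # True
print(is_suspicious("Jane Doe"))           # False
```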
Item: A Review of Fraudulent Practices in Healthcare Insurance and Machine Learning-Based Investigation Approaches (IEEE, 2023-02-02). Aishat Salau; Nwojo Agwu Nnanna; Moussa Mahamat Boukar.
Healthcare insurance fraud is a complex and costly problem that has become a global concern. Traditional methods of detecting fraudulent claims and requests are time-consuming and often ineffective. Machine learning methods offer potential solutions by improving fraud investigation and prevention in health insurance systems. This paper presents a comprehensive review of machine learning-based approaches for addressing healthcare insurance fraud, as well as the associated challenges and limitations. Despite these limitations, our findings suggest that fraud could be effectively tackled by addressing the challenges identified. Areas for further research are also highlighted.

Item: Advancing Preauthorization Task in Healthcare: An Application of Deep Active Incremental Learning for Medical Text Classification (Engineering, Technology & Applied Science Research, 2023-09-29). Nnanna Agwu Nwojo; Moussa Mahamat Boukar.
This study presents a novel approach to medical text classification using a deep active incremental learning model, aiming to improve the automation of the preauthorization process in medical health insurance. By automating approval or denial decisions through text classification, the primary focus is on real-time prediction, utilization of limited labeled data, and continuous model improvement. The proposed approach combines a Bidirectional Long Short-Term Memory (Bi-LSTM) neural network with active learning, using uncertainty sampling to facilitate expert-based sample selection and online learning for continuous updates. The proposed model demonstrates improved predictive accuracy over a baseline Long Short-Term Memory (LSTM) model: through active learning iterations, it achieved a 4% improvement in balanced accuracy over 100 iterations, underscoring its efficiency in continuous refinement with limited labeled data.
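The uncertainty-sampling step at the heart of the active learning loop above can be sketched in a few lines of NumPy; the classifier is abstracted away as a matrix of predicted probabilities, and the batch size is an arbitrary illustrative value.

```python
import numpy as np

def least_confidence_indices(probabilities: np.ndarray, batch_size: int) -> np.ndarray:
    """Pick the unlabeled samples whose top predicted class probability is
    lowest, i.e. the ones the current model is least sure about."""
    confidence = probabilities.max(axis=1)          # top-class probability per sample
    return np.argsort(confidence)[:batch_size]      # least confident first

# Placeholder predictions over an unlabeled pool (rows: requests, cols: classes).
pool_probs = np.array([
    [0.98, 0.02],   # very confident -> not worth labeling
    [0.55, 0.45],   # uncertain      -> route to a human expert
    [0.51, 0.49],   # most uncertain -> route to a human expert
    [0.90, 0.10],
])
to_label = least_confidence_indices(pool_probs, batch_size=2)
print(to_label)   # indices of requests sent for expert labeling, e.g. [2 1]

# In the full loop these samples would be labeled by an expert, appended to the
# training set, and the Bi-LSTM updated online before scoring the pool again.
```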
Item: Age Estimation from Facial Images Using Custom Convolutional Neural Network (CNN) (International Conference on Frontiers in Academic Research, 2023-02-23). Gilbert George; Steve Adeshina; Moussa Mahamat Boukar.
Given that aging is influenced by a variety of factors, including gender, ethnicity, and environment, automatic age assessment from facial images is a difficult challenge in computer vision and image analysis. In addition, a significant amount of data and a laborious training phase are needed to estimate age from facial photos accurately. In this study, we present a custom convolutional neural network-based age estimator that predicts age from facial photos with high accuracy. We use the UTK facial image dataset of about 17,475 images and train the model to classify facial images into three groups: child, teenager, and adult. Compared to similar efforts, our method uses less training data while maintaining a high accuracy of 95%.

Item: An Error Analysis Algorithm for Approximate Solution of Linear Fredholm-Stieltjes Integral Equations with Generalized Trapezium Method (IEEE, 2017-02-02). Moussa Mahamat Boukar.
Integral equations and their solutions are very important in areas such as physics, engineering, and biology. Fredholm-Stieltjes integral equations are one such class, and exact solutions can sometimes be found for them. The main purpose of this paper is to propose an error analysis algorithm for the approximate solution of linear Fredholm-Stieltjes integral equations of the second kind with the generalized trapezium method. First, the theory of the error analysis is given. Then the algorithm is implemented in Maple and examples are given with graphics.

Item: An Interactive Application (Maplet) for II-Order Ordinary Differential Equations (IEEE, 2014-02-02). Moussa Mahamat Boukar.
The main purpose of this paper is to propose a Maplet interactive application for finding general solutions, solving initial value problems (IVPs), and depicting 2-D and 3-D graphics of second-order ordinary homogeneous and non-homogeneous differential equations (ODEs). The paper also discusses 2-D and 3-D graphing of solutions and how the application can serve as an effective educational tool for both students and instructors.

Item: Analysis of Bad Roads Using Smart phone (IEEE, 2019-02-02). Moussa Mahamat Boukar; Steve Adeshina.
Developing nations are faced with many bad roads containing potholes of different depth ranges, and the maintenance and rehabilitation process by government agencies is an ongoing effort that requires a periodic bad-road inventory to guarantee safety. Bad roads are identified either by government agencies' survey teams or by individuals who volunteer to report these conditions to the authorities. Our research provides a simple but effective solution to aid in automatically reporting bad roads using smartphones, by measuring the pavement profile based on the vibration of a moving vehicle. In this article, we explain how we used a smartphone to read the vibration pattern, GPS location, speed, and direction of a vehicle that drives through a pothole; these parameters are periodically streamed to a cloud application. We used the standard deviation to measure the level of dispersion within segmented sets of streamed vehicle vibration data to identify potholes of different sizes, and we used a supervised learning (classification) algorithm to reduce false-positive error rates due to human behavior. The final results show distinct vibration levels for small potholes, speed bumps, and big potholes, and these values are displayed on a map application (Google Maps) to visualize the geographical locations of the potholes.

Item: Analysis of Prostate Cancer DNA Sequences Using Bi-direction Long Short Term Memory Model (IEEE, 2021-02-02). Yusuf Aleshinloye Abass; Steve Adeshina; Nwojo Nnana Agwu; Moussa Mahamat Boukar.
Machine and deep learning-based models are emerging techniques for addressing prediction problems in biomedical data analysis, and DNA sequence prediction is a critical problem that requires considerable attention in the biomedical domain. These techniques have been shown to provide more accurate results than conventional regression-based models. Predicting the gene sequences that lead to cancerous diseases such as prostate cancer is crucial. Identifying the most important features in a gene sequence is one of the most challenging tasks, and extracting the components of the gene sequence that give insight into the kind of mutation in the gene is very important: it can lead to effective drug design and promote the concept of personalized medicine. In this work, we extracted the exons from the prostate gene sequences used in the experiment and built a Bi-LSTM model using k-mer encoding for the DNA sequences and one-hot encoding for the class labels. The Bi-LSTM model was evaluated on different classification metrics. Our experimental results show that the model achieves a training accuracy of 95 percent and a validation accuracy of 91 percent.
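A small sketch of the k-mer tokenization step mentioned above, turning a DNA sequence into overlapping substrings that an embedding layer of a Bi-LSTM can consume; the choice of k = 6 and the toy sequence are assumptions made for illustration.

```python
from itertools import product

def kmer_tokens(sequence: str, k: int = 6) -> list[str]:
    """Slide a window of length k over the sequence, yielding overlapping k-mers."""
    sequence = sequence.upper()
    return [sequence[i:i + k] for i in range(len(sequence) - k + 1)]

def kmer_vocabulary(k: int = 6) -> dict[str, int]:
    """Map every possible k-mer over {A, C, G, T} to an integer id (0 reserved for padding)."""
    return {"".join(p): i + 1 for i, p in enumerate(product("ACGT", repeat=k))}

# Toy exon fragment; a real pipeline would read the extracted exons from file.
sequence = "ATGCGTACGTTAGC"
vocab = kmer_vocabulary(k=6)
encoded = [vocab[t] for t in kmer_tokens(sequence, k=6)]
print(kmer_tokens(sequence, k=6)[:3])   # first few overlapping 6-mers
print(encoded[:3])                      # integer ids fed to the embedding layer
```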
Item: BASIC DEPENDENCY PARSING IN NATURAL LANGUAGE INFERENCE (IEEE, 2017-02-02). Aleshinloye Abass Yusuf; Nnanna Agwu Nwojo; Moussa Mahamat Boukar.
Parsing is the process of analyzing a sentence for its structure, content, and meaning; it uncovers the structure, articulates the constituents, and identifies the relations between the constituents of the input sentence. This paper describes the importance of the parsing strategy in achieving entailment in natural language inference. Parsing is a basic task in natural language processing and the basis for applications such as machine translation, question answering, and information retrieval. We used the parsing strategy in natural language inference to achieve entailment through a normalization approach, in which entailment is achieved by removing or replacing nodes and relations in a tree. This process requires a detailed understanding of the dependency structure in order to generate a tree that does not contain nodes and relations irrelevant to the inference procedure. To achieve this, the dependency trees are transformed by applying rewrite rules.

Item: Collision Resolution Techniques in Hash Table (2021-02-02). Moussa Mahamat Boukar.

Item: Comparison of Transfer Learning Model Accuracy for Osteoporosis Classification on Knee Radiograph (IEEE, 2022-02-02). Moussa Mahamat Boukar; Steve Adeshina.
In terms of financial costs and human suffering, osteoporosis poses a serious public health burden. Reduced bone mass, degeneration of the microarchitecture of bone tissue, and an increased risk of fracture are its main skeletal symptoms. Osteoporosis is caused not just by low bone mineral density but also by other factors such as age, weight, height, and lifestyle. Recent advances in Artificial Intelligence (AI) have led to successful applications of expert systems that use deep learning techniques for osteoporosis diagnosis based on modalities such as dental radiographs. This study uses a dataset of knee radiographs (knee X-ray images) to apply and compare three robust transfer learning models, GoogLeNet, VGG-16, and ResNet50, for classifying osteoporosis. From the statistical analysis carried out with scikit-learn in Python, the accuracy of the GoogLeNet model was 90%, the accuracy of the VGG-16 model was 87%, and the accuracy of the ResNet-50 model was 83%.

Item: Comprehensive Evaluation Of Appearance-Based Techniques For Palmprint Features Extraction Using Probabilistic Neural Network, Cosine Measures And Euclidean Distance Classifiers (UNIVERSITY OF PITESTI SCIENTIFIC BULLETIN, 2018-08-08). Akande Oluwatobi Noah; Abikoye O. C; Adeyemo I. A; Ogundokun R. O; Aro T. O.
Most biometric systems work by comparing features extracted from a query biometric trait with those extracted from a stored biometric trait. Therefore, to a great extent, the accuracy of any biometric system depends on the effectiveness of its feature extraction stage. With the intention of establishing a suitable appearance-based feature extraction technique, an independent comparative study of Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA), and Principal Component Analysis (PCA) for palmprint feature extraction is reported in this article. Euclidean distance, a Probabilistic Neural Network (PNN), and cosine measures were used as classifiers. The results revealed that the cosine metric is preferable for ICA features, while PNN is preferable for LDA features. Both PNN and Euclidean distance yielded a better recognition rate for PCA. However, ICA yielded the best recognition rate in terms of FAR and FRR, followed by LDA and then PCA.
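A brief scikit-learn sketch in the spirit of the palmprint comparison above: flattened images are projected with PCA and matched by nearest neighbour under Euclidean and cosine distances. The random array stands in for real palmprint images, the component count is an arbitrary choice, and the ICA, LDA, and PNN variants are omitted.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Placeholder data: 40 flattened 64x64 "palmprint" images from 4 subjects.
# Replace with real, preprocessed palmprint images.
rng = np.random.default_rng(1)
X = rng.normal(size=(40, 64 * 64))
y = np.repeat(np.arange(4), 10)

# Appearance-based feature extraction: project images onto principal components.
features = PCA(n_components=20, random_state=1).fit_transform(X)

X_tr, X_te, y_tr, y_te = train_test_split(
    features, y, test_size=0.25, stratify=y, random_state=1
)
# Nearest-neighbour matching under the two distance measures used in the study.
for metric in ("euclidean", "cosine"):
    clf = KNeighborsClassifier(n_neighbors=1, metric=metric, algorithm="brute")
    clf.fit(X_tr, y_tr)
    print(f"{metric}: recognition rate = {clf.score(X_te, y_te):.2f}")
```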
Item: Content Management System (CMS) Evaluation and Analysis (Journal of Technical Science and Technologies, 2012-02-02). Moussa Mahamat Boukar.
Content management systems (CMS) provide an optimal solution by organizing information and, above all, creating and managing an enterprise's knowledge. Nevertheless, there is considerable confusion about the functionalities that characterize a CMS and about its differences from less capable products such as web content management systems, document and records management systems, and enterprise content management systems. This paper aims to show the mismatches between companies' needs and those information management products, which are often called CMS even when they are not. For this reason, I first made a theoretical comparison between the functionalities of CMS and those of the systems they are often confused with. I then present the results of an empirical study of 22 products offered by international vendors. Using an original scheme, enterprises' needs in terms of information collection, management, and publication for knowledge management are compared with the functionalities of the aforementioned systems. The result consists of working definitions for CMS and the other systems for managing information. Content management products are analyzed, compared, and evaluated using a special table created to point out the actual functionalities of the products offered on the market, regardless of vendors' declarations. The paper's conclusions show how, on the demand side, companies' needs are growing within a confused framework; at the same time, the supply side keeps feeding this confusion, reducing company satisfaction with knowledge and information management.

Item: Data Dissemination via web Services for Distributed and Heterogeneous Data sources: An Enhancement of the Nigerian University Certificate Verification System (IEEE, 2017-02-02). Salisu Ibrahim Yusuf; Moussa Mahamat Boukar.
Harmonization of academic records between institutions will ease information sharing among institutions and reduce forgery of certificates and other academic qualifications. A solution was previously proposed that collects relevant certificate information from Nigerian universities' databases via web services and makes it publicly available across all platforms as a means of verifying certificate authenticity. One limitation of that system is the restriction imposed on the data that can be retrieved from institutions by the predefined JSON template, so more relevant data might be neglected; it was also assumed that all universities use relational databases, whereas, given current trends, a good number of institutions may move to NoSQL platforms in the near future. In this study, we propose an enhancement of the initial system that accommodates the diversity of data and databases provided by institutions by using a NoSQL platform and allowing institutions to modify the template of the web service through which they share their data; this improves parsing time, as the data no longer needs to be structured as a relational database. Hence, an enhancement of the Nigerian Universities' Certificate Verification system is proposed.
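A minimal sketch of the institution-defined template idea described above: each institution publishes a mapping from standard verification fields to its own field names, and the consumer applies that mapping to whatever document the institution's web service returns. All field names, the sample record, and the template structure here are hypothetical.

```python
import json

# Hypothetical template published by one institution: standard field -> its own field name.
TEMPLATE = {
    "matric_number": "regNo",
    "full_name": "studentName",
    "degree": "awardTitle",
    "year_awarded": "convocationYear",
}

# Hypothetical document as returned by that institution's web service (NoSQL-style, schema-free).
raw_record = json.loads("""
{
  "regNo": "NU/CS/2015/041",
  "studentName": "Jane Doe",
  "awardTitle": "B.Sc. Computer Science",
  "convocationYear": 2019,
  "extraNotes": "First Class Honours"
}
""")

def normalize(record: dict, template: dict) -> dict:
    """Map an institution-specific record onto the standard verification fields,
    keeping None where the institution did not supply a field."""
    return {std: record.get(local) for std, local in template.items()}

print(normalize(raw_record, TEMPLATE))
```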