Journal of Artificial Intelligence and Big Data
Volume 4, Issue 2, 2024
Open Access | November 16, 2024 | 8 pages

Digital Therapeutics: A New Dimension to Diabetes Mellitus Management

Journal of Artificial Intelligence and Big Data 2024, 4(2), 1090. DOI: 10.31586/jaibd.2024.1090
Abstract
Digital therapeutics (DTx) play a transformative role in diabetes management by leveraging technology to provide personalized, data-driven medical interventions. These tools enhance self-management by offering continuous monitoring and real-time feedback on glucose levels, diet, and physical activity. This personalized approach helps patients adhere to treatment plans and make informed lifestyle changes, leading to improved clinical outcomes such as reduced HbA1c levels and better overall diabetes control. The importance of DTx lies in their ability to make diabetes care more accessible and convenient. Mobile apps and telemedicine platforms enable patients to receive support and guidance from anywhere, reducing the need for frequent in-person visits. Additionally, DTx often include behavioral support features like reminders, educational content, and motivational tools, which are crucial for maintaining healthy habits and managing stress. Currently, the dynamics of DTx in diabetes are rapidly evolving, with increasing integration of artificial intelligence and machine learning to further personalize and optimize care. As the adoption of these technologies grows, they hold the potential to significantly improve patient outcomes and revolutionize diabetes management on a global scale. This article focuses on the benefits of novel digital therapeutics for the prevention and management of type 2 diabetes that are currently available on the market.
Article | Open Access | June 28, 2024 | 13 pages

Nigeria Exchange Rate Volatility: A Comparative Study of Recurrent Neural Network LSTM and Exponential Generalized Autoregressive Conditional Heteroskedasticity Models

Journal of Artificial Intelligence and Big Data 2024, 4(2), 983. DOI: 10.31586/jaibd.2024.983
Abstract
Business merchants and investors in Nigeria are interested in the accuracy of foreign exchange volatility forecasts because they need information on how volatile the exchange rate will be in the future. In this paper, we compared the Exponential Generalized Autoregressive Conditional Heteroskedasticity model of order p = 1 and q = 1 (EGARCH(1,1)) and a Recurrent Neural Network (RNN) based on the long short-term memory (LSTM) model with a combination of p = 10 and q = 1 layers to model the volatility of Nigerian exchange rates. Our goal is to determine the preferred model for predicting the volatility of Nigeria's Naira exchange rate against the euro, pound sterling and US dollar. The dataset of monthly exchange rates of the Nigerian Naira to the US dollar, euro and pound sterling for the period December 2001 – August 2023 was extracted from the Central Bank of Nigeria Statistical Bulletin. Model efficiency and performance were measured with the Mean Squared Error (MSE) criterion. The results indicated that Nigerian exchange rate volatility is asymmetric, and leverage effects are evident in the results of the EGARCH(1,1) model. It was also observed that there was a steady increase in the Naira exchange rate against the euro, pound sterling and US dollar from 2016 to its highest peak in 2023. The comparative analysis indicated that EGARCH(1,1) performed better than the LSTM model because it provided smaller MSE values of 224.7, 231.3 and 138.5 for the euro, pound sterling and US dollar respectively.
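As a minimal illustrative sketch of the MSE model-selection criterion the abstract describes (the series below are hypothetical placeholders, not the paper's Naira data, and the function names are ours):

```python
import numpy as np

def mse(actual, forecast):
    """Mean Squared Error between realized and forecast volatility."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return float(np.mean((actual - forecast) ** 2))

# Hypothetical realized volatility and two competing forecasts (illustrative only)
realized = np.array([10.0, 12.5, 11.0, 14.0])
egarch_fc = np.array([10.5, 12.0, 11.5, 13.5])
lstm_fc = np.array([9.0, 14.0, 12.5, 15.5])

scores = {"EGARCH(1,1)": mse(realized, egarch_fc), "LSTM": mse(realized, lstm_fc)}
best = min(scores, key=scores.get)  # lower MSE wins, as in the paper's criterion
print(best, scores)
```

The model with the smaller MSE is preferred, which is how the paper arrives at EGARCH(1,1) over LSTM for all three currency pairs.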
Article | Open Access | December 17, 2024 | 14 pages

Disaster Recovery and Application Security in Microservices: Exploring Kubernetes, Application Gateways, and Cloud Solutions for High Availability

Journal of Artificial Intelligence and Big Data 2024, 4(2), 1209. DOI: 10.31586/jaibd.2024.1209
Abstract
Unfortunately, it is not disaster recovery, high availability, or cloud technologies that are inherently difficult to understand, but rather the act of implementing them for software applications. This article explores a method of implementation unique to a microservices architecture. Regulatory compliance does not stop just because an effective disaster recovery requirement is tough to satisfy for infrastructure unique to sleek microservices. The high availability and location transparency offered by a cloud solution are appealing to a security engineering department. However, the headache starts when the technology presents a handful of undesirable surprises that expose RESTful microservices to the outside world. These are the challenges that post-SOA, cloud-resident, robustly scalable applications will need to address and overcome. The goal is to explore several popular methods of accomplishing these tough objectives so that engineers can further research the most practical solution. An innovative implementation that leverages Service Bus relays as an elegant disaster recovery solution, while enforcing a strict subnet in which RESTful microservices solely live, will be discussed. The interest lies in atypical experimentation beyond basic gateways and in using such simplicity while still answering day-to-day software development infrastructure challenges for the applications we build. Resilient recovery from full-service web proxy crashes, and delivery-latency switchover driven by microservices pod health, will also be discussed [1].
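The pod-health-driven switchover the abstract mentions can be sketched in miniature; the region names and probe below are invented for illustration and are not the paper's implementation:

```python
# Hypothetical sketch: a gateway-style failover that routes traffic to the
# first region whose pods report healthy, echoing pod-health-driven switching.
def pick_healthy_region(regions, probe):
    """Return the first region whose health probe passes; raise if none do,
    which would be the signal to invoke disaster recovery."""
    for region in regions:
        try:
            if probe(region):
                return region
        except Exception:
            continue  # a probe that crashes counts as unhealthy
    raise RuntimeError("no healthy region: invoke disaster recovery")

# Simulated probes: the primary subnet is down, the DR relay is up
health = {"primary-subnet": False, "dr-relay": True}
print(pick_healthy_region(["primary-subnet", "dr-relay"], health.get))
```

In a real deployment the probe would be a Kubernetes readiness check or an application-gateway health endpoint rather than a dictionary lookup.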
Review Article | Open Access | December 17, 2024 | 13 pages

An Analysis of Performance and Comparison of Models for Cardiovascular Disease Prediction via Machine Learning Models in Healthcare

Journal of Artificial Intelligence and Big Data 2024, 4(2), 1332. DOI: 10.31586/jaibd.2024.1332
Abstract
Over the past few decades, cardiovascular disease and related complications have surpassed all others as the leading causes of death worldwide. At the moment, they are the leading cause of mortality globally, including in India. Detecting cardiovascular problems early is important so that patients receive better care and costs go down. This project utilizes the UCI Heart Disease Dataset to develop ML and DL models capable of detecting cardiac diseases. Heart disease was classified using Convolutional Neural Network (CNN) models, which are able to detect intricate patterns in the supplied data. A confusion matrix, an F1-score, a ROC curve, accuracy, precision, and recall were among the metrics used to evaluate the models. The CNN outperformed the Neural Network, Deep Neural Network (DNN), and Gradient Boosted Trees (GBT) models, with 91.71% accuracy, 88.88% precision, 82.75% recall, and an 85.70% F1-score. The comparative study showed that CNN was the most accurate model, while the other models struck different balances between precision and recall. The experimental results show that the proposed CNN model is an effective way to identify cardiovascular disease, meaning it could be used in healthcare systems to find diseases earlier and treat patients better.
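The reported figures can be cross-checked against the standard metric definitions. The sketch below (our code, not the paper's) derives precision, recall, F1 and accuracy from confusion-matrix counts, and confirms that the reported precision and recall imply the reported F1:

```python
def metrics(tp, fp, fn, tn):
    """Standard classification metrics from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, f1, accuracy

# The reported precision (88.88%) and recall (82.75%) imply the reported F1:
p, r = 0.8888, 0.8275
f1 = 2 * p * r / (p + r)
print(round(100 * f1, 1))  # 85.7, matching the 85.70% F1-score in the abstract
```

F1 is the harmonic mean of precision and recall, which is why it sits between the two reported values but closer to the smaller one.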
Article | Open Access | December 19, 2024 | 11 pages

Intelligent Detection of Injection Attacks via SQL Based on Supervised Machine Learning Models for Enhancing Web Security

Journal of Artificial Intelligence and Big Data 2024, 4(2), 1333. DOI: 10.31586/jaibd.2024.1333
Abstract
SQL Injection Attacks are the most prevalent technique behind security data breaches. Organizations and individuals suffer sensitive information exposure and unauthorized entry when attackers exploit the severe risks posed by SQL injection (SQLi) vulnerabilities. Static and heuristic defense methods remain the conventional detection tools against SQL injection attacks. This study's foundation is a detection system developed using the Gated Recurrent Unit (GRU) network, which attempts to efficiently identify SQL Injection attacks (SQLIAs). The proposed Gated Recurrent Unit model was trained using an 80:20 train-test split, and the results showed that SQL injection attacks could be accurately identified with a precision rate of 97%, an accuracy rate of 96.65%, a recall rate of 92.5%, and an F1-score of 94%. The experimental results, together with the corresponding confusion matrix analysis and learning curves, demonstrate resilience and outstanding generalization ability. The GRU model outperforms conventional machine learning (ML) models, including K-Nearest Neighbors (KNN) and Support Vector Machine (SVM), in identifying sequential patterns in SQL query data. This recurrent neural architecture proves effective in detecting SQLi attacks through its ability to provide secure protection for contemporary web applications.
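To illustrate the sequential processing that lets a GRU pick up patterns in tokenized SQL queries, here is a single GRU cell forward pass in NumPy; the weights are random and the dimensions are invented for demonstration, so this is the textbook recurrence, not the paper's trained model:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, p):
    """One GRU step: gates decide how much past context to keep."""
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h + p["bz"])              # update gate
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h + p["br"])              # reset gate
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h) + p["bh"])  # candidate state
    return (1 - z) * h + z * h_tilde

rng = np.random.default_rng(0)
d_in, d_h = 8, 4  # e.g. 8-dim token embeddings of an SQL query, 4 hidden units
p = {k: rng.standard_normal((d_h, d_in if k.startswith("W") else d_h)) * 0.1
     for k in ("Wz", "Wr", "Wh", "Uz", "Ur", "Uh")}
p.update({b: np.zeros(d_h) for b in ("bz", "br", "bh")})

h = np.zeros(d_h)
for x in rng.standard_normal((5, d_in)):  # 5 "tokens" of a query
    h = gru_step(x, h, p)
print(h.shape)  # (4,): the final state summarizes the whole sequence
```

In the paper's setting, that final hidden state would feed a classifier head that outputs benign vs. malicious.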
Article | Open Access | December 20, 2024 | 12 pages

AI for Time Series and Anomaly Detection

Journal of Artificial Intelligence and Big Data 2024, 4(2), 1399. DOI: 10.31586/jaibd.2024.1399
Abstract
Time series data are increasingly prevalent across domains such as finance, healthcare, manufacturing, and IoT, making accurate forecasting and anomaly detection critical for decision-making and system reliability. Traditional statistical methods (e.g., ARIMA, Holt-Winters) often fail to capture the complex temporal dependencies and high-dimensional interactions inherent in modern time series. Recent advances in artificial intelligence, particularly deep learning architectures such as recurrent neural networks (RNNs), convolutional neural networks (CNNs), temporal convolutional networks (TCNs), graph neural networks (GNNs) and Transformers, have demonstrated marked improvements in modeling both univariate and multivariate series, as well as in detecting anomalies that deviate from learned norms (Darban, Webb, Pan, Aggarwal, & Salehi, 2022; Chiranjeevi, Ramya, Balaji, Shashank, & Reddy, 2024) [1,2]. Moreover, ensemble techniques and hybrid signal-processing + deep-learning pipelines show enhanced sensitivity and adaptability in real-world anomaly detection scenarios (Iqbal, Amin, Alsubaei, & Alzahrani, 2024) [3]. In this work, we provide a unified survey and comparative analysis of AI-driven time series forecasting and anomaly detection methods, highlight key industrial application domains, evaluate performance trade-offs (e.g., accuracy vs. latency, supervised vs. unsupervised learning), and discuss emerging challenges including interpretability, data drift, real-time deployment on edge devices, and integration of causal reasoning. Our findings suggest that while AI approaches significantly outperform classical techniques in many settings, careful consideration of data characteristics, evaluation metrics and deployment environment remains essential for effective adoption.
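As a concrete reference point for the classical baselines such surveys compare against, here is a rolling z-score anomaly detector; the function name, window, threshold and synthetic data are ours, chosen purely for illustration:

```python
import numpy as np

def zscore_anomalies(series, window=20, threshold=3.0):
    """Flag points deviating from the rolling mean by more than
    `threshold` rolling standard deviations (a classical baseline
    that AI-based detectors are typically benchmarked against)."""
    series = np.asarray(series, float)
    flags = np.zeros(len(series), dtype=bool)
    for t in range(window, len(series)):
        hist = series[t - window:t]
        mu, sigma = hist.mean(), hist.std()
        if sigma > 0 and abs(series[t] - mu) > threshold * sigma:
            flags[t] = True
    return flags

rng = np.random.default_rng(1)
x = rng.normal(0, 1, 200)
x[150] += 10  # inject an obvious spike
print(np.flatnonzero(zscore_anomalies(x)))
```

Deep learning methods generalize this idea by learning the "normal" pattern (via forecasting or reconstruction error) instead of assuming a stationary mean and variance, which is what lets them handle the complex temporal dependencies the abstract highlights.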
ISSN: 2771-2389
DOI prefix: 10.31586/jaibd

Journal metrics
Publication year: 2016-2026
Journal (home page) visits: 15510
Published articles: 62
Article views: 40687
Article downloads: 6763
Downloads/article: 109
APC: 99.00