<?xml version="1.0" encoding="UTF-8"?>
<rdf:RDF 
    xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" 
    xmlns="http://purl.org/rss/1.0/" 
    xmlns:dc="http://purl.org/dc/elements/1.1/" 
    xmlns:dcterms="http://purl.org/dc/terms/" 
    xmlns:cc="http://web.resource.org/cc/" 
    xmlns:prism="http://prismstandard.org/namespaces/basic/2.0/" 
    xmlns:admin="http://webns.net/mvcb/" 
    xmlns:content="http://purl.org/rss/1.0/modules/content/">
    
    <channel rdf:about="https://www.scipublications.com/journal/jaibd/rss">
        <title>Journal of Artificial Intelligence and Big Data</title>
        <link>https://www.scipublications.com/journal/jaibd</link>
        <description>Journal of Artificial Intelligence and Big Data - A technical journal publishing research on machine learning algorithms, deep learning architectures, big data analytics, data mining, AI applications, and intelligent systems design.</description>
        <language>en</language>
        <copyright>Copyright 2026 Journal of Artificial Intelligence and Big Data</copyright>
        <pubDate>Tue, 28 Apr 2026 12:37:57 GMT</pubDate>
        <lastBuildDate>Tue, 28 Apr 2026 12:37:57 GMT</lastBuildDate>
        <generator>Scientific Publications</generator>
        <ttl>60</ttl>
        <prism:eIssn>2771-2389</prism:eIssn>
        <items>
            <rdf:Seq>
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/predictive-modeling-of-public-sentiment-using-social-media-data-and-natural-language-processing-techniques-6162" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/development-of-a-hemodialysis-data-collection-and-clinical-information-system-and-establishment-of-an-intradialytic-blood-pressure/pulse-rate-predictive-model-6029" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/enhancing-scalability-and-performance-in-analytics-data-acquisition-through-spark-parallelism-6049" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/the-future-of-longevity-medicine-from-the-lens-of-digital-therapeutics-1265" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/tech-transformations:-modern-solutions-for-obstructive-sleep-apnea-1248" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/artificial-immune-systems:-a-bio-inspired-paradigm-for-computational-intelligence-1233" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/ai-for-time-series-and-anomaly-detection-1399" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/intelligent-detection-of-injection-attacks-via-sql-based-on-supervised-machine-learning-models-for-enhancing-web-security-1333" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/an-analysis-of-performance-and-comparison-of-models-for-cardiovascular-disease-prediction-via-machine-learning-models-in-healthcare-1332" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/disaster-recovery-and-application-security-in-microservices:-exploring-kubernetes-application-gateways-and-cloud-solutions-for-high-availability-1209" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/digital-therapeutics:-a-new-dimension-to-diabetes-mellitus-management-1090" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/ai-powered-optimization-for-high-performance-computing-in-scientific-simulations-1695" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/nigeria-exchange-rate-volatility:-a-comparative-study-of-recurrent-neural-network-lstm-and-exponential-generalized-autoregressive-conditional-heteroskedasticity-models-983" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/5v’s-of-big-data-shifted-to-suite-the-context-of-software-code:-big-code-for-big-software-projects-911" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/the-use-of-contemporary-enterprise-resource-planning-(erp)-technologies-for-digital-transformation-881" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/an-appraisal-of-challenges-in-developing-information-literacy-skills-in-the-colleges-of-education-of-ghana-878" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/universal-evaluation-of-sap-s/4-hana-erp-cloud-system-882" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/stock-closing-price-and-trend-prediction-with-lstm-rnn-877" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/understanding-the-fundamentals-of-digital-transformation-in-financial-services:-drivers-and-strategic-insights-1216" />
                <rdf:li rdf:resource="https://www.scipublications.com/journal/jaibd/article/networking-solutions-for-large-scale-iot-deployments:-architectures-challenges-and-trends-1382" />
            </rdf:Seq>
        </items>
    </channel>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/predictive-modeling-of-public-sentiment-using-social-media-data-and-natural-language-processing-techniques-6162">
        <title>Predictive Modeling of Public Sentiment Using Social Media Data and Natural Language Processing Techniques</title>
        <link>https://www.scipublications.com/journal/jaibd/article/predictive-modeling-of-public-sentiment-using-social-media-data-and-natural-language-processing-techniques-6162</link>
        <description>Social media platforms like X (formerly Twitter) generate vast volumes of user-generated content that provide real-time insights into public sentiment. Despite the widespread use of traditional machine learning methods, their limitations in capturing contextual nuances in noisy social media text remain a challenge. This study leverages the Sentiment140 dataset, comprising 1.6 million labeled tweets, and develops predictive models for binary sentiment classification using Naive Bayes, Logistic Re...</description>
        <dc:creator>Lawrence A. Farinola, Jean-Eudes Assogba</dc:creator>
        <dc:date>2026-02-05</dc:date>
        <dc:type>Article</dc:type>
        <dc:identifier>10.31586/jaibd.2026.6162</dc:identifier>
        <pubDate>Thu, 05 Feb 2026 16:00:00 GMT</pubDate>
        <dc:subject>Sentiment Analysis; Social Media Mining; Public Opinion Prediction; Natural Language Processing; BERT Transformer Model</dc:subject>
        <prism:volume>6</prism:volume>
        <prism:issue>1</prism:issue>
        <prism:startingPage>1</prism:startingPage>
        <prism:endingPage>12</prism:endingPage>
        <prism:doi>10.31586/jaibd.2026.6162</prism:doi>
        <dcterms:abstract>Social media platforms like X (formerly Twitter) generate vast volumes of user-generated content that provide real-time insights into public sentiment. Despite the widespread use of traditional machine learning methods, their limitations in capturing contextual nuances in noisy social media text remain a challenge. This study leverages the Sentiment140 dataset, comprising 1.6 million labeled tweets, and develops predictive models for binary sentiment classification using Naive Bayes, Logistic Regression, and the transformer-based BERT model. Experiments were conducted on a balanced subset of 12,000 tweets after comprehensive NLP preprocessing. Evaluation using accuracy, F1-score, and confusion matrices revealed that BERT significantly outperforms traditional models, achieving an accuracy of 89.5% and an F1-score of 0.89 by effectively modeling contextual and semantic nuances. In contrast, Naive Bayes and Logistic Regression demonstrated reasonable but consistently lower performance. To support practical deployment, we introduce SentiFeel, an interactive tool enabling real-time sentiment analysis. While resource constraints limited the dataset size and training epochs, future work will explore full corpus utilization and the inclusion of neutral sentiment classes. These findings underscore the potential of transformer models for enhanced public opinion monitoring, marketing analytics, and policy forecasting.</dcterms:abstract>
        <dcterms:issued>2026-02-05</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>Predictive Modeling of Public Sentiment Using Social Media Data and Natural Language Processing Techniques</h2>
    <p class="authors">Lawrence A. Farinola, Jean-Eudes Assogba</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - February 05, 2026</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>Social media platforms like X (formerly Twitter) generate vast volumes of user-generated content that provide real-time insights into public sentiment. Despite the widespread use of traditional machine learning methods, their limitations in capturing contextual nuances in noisy social media text remain a challenge. This study leverages the Sentiment140 dataset, comprising 1.6 million labeled tweets, and develops predictive models for binary sentiment classification using Naive Bayes, Logistic Regression, and the transformer-based BERT model. Experiments were conducted on a balanced subset of 12,000 tweets after comprehensive NLP preprocessing. Evaluation using accuracy, F1-score, and confusion matrices revealed that BERT significantly outperforms traditional models, achieving an accuracy of 89.5% and an F1-score of 0.89 by effectively modeling contextual and semantic nuances. In contrast, Naive Bayes and Logistic Regression demonstrated reasonable but consistently lower performance. To support practical deployment, we introduce SentiFeel, an interactive tool enabling real-time sentiment analysis. While resource constraints limited the dataset size and training epochs, future work will explore full corpus utilization and the inclusion of neutral sentiment classes. These findings underscore the potential of transformer models for enhanced public opinion monitoring, marketing analytics, and policy forecasting.</p>
    </div>
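    <div class="code-sketch">
        <h3>Code Sketch</h3>
        <p>To make the modeling approach concrete, the sketch below shows a minimal BERT fine-tuning step for binary tweet sentiment using the Hugging Face transformers library. The checkpoint name, learning rate, sequence length, and example tweets are illustrative assumptions, not the configuration reported in the paper.</p>
        <pre><code># Minimal sketch: fine-tuning BERT for binary tweet sentiment,
# assuming Hugging Face transformers; not the authors' exact setup.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # 0 = negative, 1 = positive

tweets = ["I love this!", "Worst day ever."]   # toy stand-ins
labels = torch.tensor([1, 0])

# Tokenize with padding/truncation, as is typical for short tweets.
batch = tokenizer(tweets, padding=True, truncation=True,
                  max_length=64, return_tensors="pt")

# One illustrative training step.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()

# Inference: argmax over the two logits gives the predicted class.
preds = outputs.logits.argmax(dim=-1)</code></pre>
    </div>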
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/6162/952">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/development-of-a-hemodialysis-data-collection-and-clinical-information-system-and-establishment-of-an-intradialytic-blood-pressure/pulse-rate-predictive-model-6029">
        <title>Development of a Hemodialysis Data Collection and Clinical Information System and Establishment of an Intradialytic Blood Pressure/Pulse Rate Predictive Model</title>
        <link>https://www.scipublications.com/journal/jaibd/article/development-of-a-hemodialysis-data-collection-and-clinical-information-system-and-establishment-of-an-intradialytic-blood-pressure/pulse-rate-predictive-model-6029</link>
        <description>This research is a cross-disciplinary initiative in the field of Artificial Intelligence of Things (AIoT) within the medical informatics domain, conducted as a collaboration among a university team, a partnering corporation, and a hemodialysis clinic. The research has two objectives: (1) The development of an Internet of Things (IoT)-based Information System customized for the hemodialysis machines at the clinic, including transmission bridges, a dedicated web/app for clinical personnel, and ...</description>
        <dc:creator>I-Hsuan Peng, Chen-Kang Tien, Pei-Chun Lee</dc:creator>
        <dc:date>2025-06-27</dc:date>
        <dc:type>Article</dc:type>
        <dc:identifier>10.31586/jaibd.2025.6029</dc:identifier>
        <pubDate>Fri, 27 Jun 2025 16:00:00 GMT</pubDate>
        <dc:subject>Artificial Intelligence (AI); Artificial Neural Networks (ANNs); Chronic Kidney Disease (CKD); Deep Learning; Hemodialysis; Internet of Things (IoT); Intradialytic; Long Short-Term Memory (LSTM); Medical Informatics; Predictive Modeling</dc:subject>
        <prism:volume>5</prism:volume>
        <prism:issue>2</prism:issue>
        <prism:startingPage>1</prism:startingPage>
        <prism:endingPage>23</prism:endingPage>
        <prism:doi>10.31586/jaibd.2025.6029</prism:doi>
        <dcterms:abstract>This research is a cross-disciplinary initiative in the field of Artificial Intelligence of Things (AIoT) within the medical informatics domain, conducted as a collaboration among a university team, a partnering corporation, and a hemodialysis clinic. The research has two objectives: (1) The development of an Internet of Things (IoT)-based Information System customized for the hemodialysis machines at the clinic, including transmission bridges, a dedicated web/app for clinical personnel, and a backend server. The system has been deployed at the clinic and is now officially operational; (2) The research also utilized de-identified, anonymous data (collected by the officially operational system) to train, evaluate, and compare Deep Learning-based Intradialytic Blood Pressure (BP)/Pulse Rate (PR) Predictive Models, with subsequent suggestions provided. Both objectives were executed under the supervision of the Institutional Review Board (IRB) at Mackay Memorial Hospital in Taiwan. The system completed for objective one has introduced three significant services to the clinic: automated hemodialysis data collection, digitized data storage, and an information-rich human-machine interface with graphical data displays. These services replace traditional paper-based clinical administrative operations, thereby enhancing healthcare efficiency. The graphical data presented through the web and app interfaces aids real-time, intuitive comprehension of patients’ conditions during hemodialysis. Moreover, the data stored in the backend database is available for physicians to conduct relevant analyses, unearth insights into medical practices, and provide precise medical care for individual patients. The training and evaluation of the predictive models for objective two, along with related comparisons, analyses, and recommendations, suggest that in situations with limited computational resources and data, an Artificial Neural Network (ANN) model with six hidden layers, the SELU activation function, and a focus on artery-related features can be employed for hourly intradialytic BP/PR prediction tasks. This is believed to benefit the collaborating clinic and relevant research communities.</dcterms:abstract>
        <dcterms:issued>2025-06-27</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>Development of a Hemodialysis Data Collection and Clinical Information System and Establishment of an Intradialytic Blood Pressure/Pulse Rate Predictive Model</h2>
    <p class="authors">I-Hsuan Peng, Chen-Kang Tien, Pei-Chun Lee</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - June 27, 2025</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>This research is a cross-disciplinary initiative in the field of Artificial Intelligence of Things (AIoT) within the medical informatics domain, conducted as a collaboration among a university team, a partnering corporation, and a hemodialysis clinic. The research has two objectives: (1) The development of an <i>Internet of Things (IoT)-based Information System</i> customized for the hemodialysis machines at the clinic, including transmission bridges, a dedicated web/app for clinical personnel, and a backend server. The system has been deployed at the clinic and is now officially operational; (2) The research also utilized de-identified, anonymous data (collected by the officially operational system) to train, evaluate, and compare <i>Deep Learning-based Intradialytic Blood Pressure (BP)/Pulse Rate (PR) Predictive Models</i>, with subsequent suggestions provided. Both objectives were executed under the supervision of the Institutional Review Board (IRB) at Mackay Memorial Hospital in Taiwan. The system completed for objective one has introduced three significant services to the clinic: automated hemodialysis data collection, digitized data storage, and an information-rich human-machine interface with graphical data displays. These services replace traditional paper-based clinical administrative operations, thereby enhancing healthcare efficiency. The graphical data presented through the web and app interfaces aids real-time, intuitive comprehension of patients’ conditions during hemodialysis. Moreover, the data stored in the backend database is available for physicians to conduct relevant analyses, unearth insights into medical practices, and provide precise medical care for individual patients. The training and evaluation of the predictive models for objective two, along with related comparisons, analyses, and recommendations, suggest that in situations with limited computational resources and data, an Artificial Neural Network (ANN) model with six hidden layers, the SELU activation function, and a focus on artery-related features can be employed for hourly intradialytic BP/PR prediction tasks. This is believed to benefit the collaborating clinic and relevant research communities.</p>
    </div>
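    <div class="code-sketch">
        <h3>Code Sketch</h3>
        <p>The recommended model family can be sketched minimally in Keras: six SELU-activated hidden layers regressing hourly BP/PR targets. The layer widths, feature count, and output dimensionality are illustrative assumptions, not the configuration evaluated in the study.</p>
        <pre><code># Minimal sketch: ANN with six SELU hidden layers for hourly
# BP/PR regression, as the abstract suggests. Layer widths and the
# feature count are assumptions, not the authors' configuration.
import numpy as np
from tensorflow import keras

n_features = 16  # assumed count of artery-related + session features

model = keras.Sequential(
    [keras.Input(shape=(n_features,))]
    + [keras.layers.Dense(64, activation="selu",
                          kernel_initializer="lecun_normal")
       for _ in range(6)]                 # six hidden layers
    + [keras.layers.Dense(3)]             # assumed: systolic, diastolic, pulse
)
model.compile(optimizer="adam", loss="mse")

# Train on de-identified session data (random stand-in here).
X = np.random.rand(256, n_features).astype("float32")
y = np.random.rand(256, 3).astype("float32")
model.fit(X, y, epochs=5, batch_size=32, verbose=0)</code></pre>
    </div>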
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/6029/878">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/enhancing-scalability-and-performance-in-analytics-data-acquisition-through-spark-parallelism-6049">
        <title>Enhancing Scalability and Performance in Analytics Data Acquisition through Spark Parallelism</title>
        <link>https://www.scipublications.com/journal/jaibd/article/enhancing-scalability-and-performance-in-analytics-data-acquisition-through-spark-parallelism-6049</link>
        <description>Data acquisition serves as a critical component of modern data architecture, with REST API integration emerging as one of the most common approaches for sourcing external data. This study evaluates the efficiency of various methodologies for collecting data via REST APIs and benchmarks their performance. It explores how leveraging the Spark distributed computing platform can optimize large-scale REST API calls, enabling enhanced scalability and improved processing speeds to meet the demands of hi...</description>
        <dc:creator>Hanza Parayil Salim, Yanas Rajindran</dc:creator>
        <dc:date>2025-03-21</dc:date>
        <dc:type>Review Article</dc:type>
        <dc:identifier>10.31586/jaibd.2025.6049</dc:identifier>
        <pubDate>Fri, 21 Mar 2025 16:00:00 GMT</pubDate>
        <dc:subject>Distributed computing</dc:subject>
        <dc:subject>Parallel processing</dc:subject>
        <dc:subject>Data Acquisition</dc:subject>
        <dc:subject>Apache Spark</dc:subject>
        <dc:subject>RESTful Web Services</dc:subject>
        <dc:subject>REST API</dc:subject>
        <dc:subject>Data Analytics</dc:subject>
        <prism:volume>5</prism:volume>
        <prism:issue>1</prism:issue>
        <prism:startingPage>38</prism:startingPage>
        <prism:endingPage>43</prism:endingPage>
        <prism:doi>10.31586/jaibd.2025.6049</prism:doi>
        <dcterms:abstract>Data acquisition serves as a critical component of modern data architecture, with REST API integration emerging as one of the most common approaches for sourcing external data. This study evaluates the efficiency of various methodologies for collecting data via REST APIs and benchmarks their performance. It explores how leveraging the Spark distributed computing platform can optimize large-scale REST API calls, enabling enhanced scalability and improved processing speeds to meet the demands of high-volume data workflows.</dcterms:abstract>
        <dcterms:issued>2025-03-21</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>Enhancing Scalability and Performance in Analytics Data Acquisition through Spark Parallelism</h2>
    <p class="authors">Hanza Parayil Salim, Yanas Rajindran</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - March 21, 2025</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>Data acquisition serves as a critical component of modern data architecture, with REST API integration emerging as one of the most common approaches for sourcing external data. This study evaluates the efficiency of various methodologies for collecting data via REST APIs and benchmarks their performance. It explores how leveraging the Spark distributed computing platform can optimize large-scale REST API calls, enabling enhanced scalability and improved processing speeds to meet the demands of high-volume data workflows.</p>
    </div>
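    <div class="code-sketch">
        <h3>Code Sketch</h3>
        <p>The pattern the study evaluates, distributing REST API calls across Spark executors, can be sketched minimally with PySpark's mapPartitions. The endpoint URL, partition count, and per-partition session reuse are illustrative assumptions.</p>
        <pre><code># Minimal sketch: distributing REST API calls across Spark executors
# via mapPartitions. The endpoint is a placeholder, and error handling
# is reduced to the essentials.
import requests
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("rest-acquisition").getOrCreate()

# Assumed: record IDs to fetch, spread over 20 partitions.
ids = spark.sparkContext.parallelize(range(1000), numSlices=20)

def fetch_partition(id_iter):
    # One HTTP session per partition amortizes connection setup.
    session = requests.Session()
    for record_id in id_iter:
        resp = session.get(
            f"https://api.example.com/items/{record_id}", timeout=10)
        if resp.ok:
            yield resp.json()

results = ids.mapPartitions(fetch_partition).collect()
spark.stop()</code></pre>
    </div>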
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/6049/822">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/the-future-of-longevity-medicine-from-the-lens-of-digital-therapeutics-1265">
        <title>The Future of Longevity Medicine from the Lens of Digital Therapeutics</title>
        <link>https://www.scipublications.com/journal/jaibd/article/the-future-of-longevity-medicine-from-the-lens-of-digital-therapeutics-1265</link>
        <description>Digital therapeutics (DTx) are emerging as a pivotal tool in promoting longevity by addressing non-communicable diseases (NCDs) such as diabetes, cardiovascular diseases, and mental health disorders. These software-driven interventions offer personalized, evidence-based treatments that can be accessed via digital devices, making healthcare more accessible and scalable. One of the key advancements in DTx is the integration of artificial intelligence (AI) and machine learning (ML) to tailor interv...</description>
        <dc:creator>Akshay Ramakrishnan, Raju Rhee, Gunjan Lath, Riya Ramakrishnan</dc:creator>
        <dc:date>2025-02-08</dc:date>
        <dc:type>Review Article</dc:type>
        <dc:identifier>10.31586/jaibd.2025.1265</dc:identifier>
        <pubDate>Sat, 08 Feb 2025 16:00:00 GMT</pubDate>
        <dc:subject>Longevity</dc:subject>
        <dc:subject>Aging</dc:subject>
        <dc:subject>Anti-Aging Therapies</dc:subject>
        <dc:subject>Regenerative Medicine</dc:subject>
        <dc:subject>Genomics</dc:subject>
        <dc:subject>Senescence</dc:subject>
        <dc:subject>Wearable Technology</dc:subject>
        <dc:subject>Nutritional Interventions</dc:subject>
        <dc:subject>AI in Longevity</dc:subject>
        <dc:subject>Personalized Medicine</dc:subject>
        <prism:volume>5</prism:volume>
        <prism:issue>1</prism:issue>
        <prism:startingPage>29</prism:startingPage>
        <prism:endingPage>37</prism:endingPage>
        <prism:doi>10.31586/jaibd.2025.1265</prism:doi>
        <dcterms:abstract>Digital therapeutics (DTx) are emerging as a pivotal tool in promoting longevity by addressing non-communicable diseases (NCDs) such as diabetes, cardiovascular diseases, and mental health disorders. These software-driven interventions offer personalized, evidence-based treatments that can be accessed via digital devices, making healthcare more accessible and scalable. One of the key advancements in DTx is the integration of artificial intelligence (AI) and machine learning (ML) to tailor interventions based on individual health data. This personalization enhances the effectiveness of treatments and supports preventive care by identifying risk factors early. The need for digital therapeutics is underscored by the rising prevalence of NCDs, which are responsible for a significant portion of global mortality and healthcare costs. Traditional healthcare systems often struggle to provide timely and personalized care, especially in low-resource settings. DTx can bridge this gap by offering cost-effective solutions that are easily scalable. Moreover, digital therapeutics can address health inequities by providing low-cost interventions to underserved populations, thereby reducing the burden of NCDs and improving overall health outcomes. As technology continues to evolve, the potential for DTx to enhance longevity and quality of life becomes increasingly promising. Recent advancements in longevity medicine and technology have focused on extending both lifespan and healthspan, ensuring that people not only live longer but also maintain good health throughout their extended years. This review article highlights the advancements contributing to the compelling subject of longevity.</dcterms:abstract>
        <dcterms:issued>2025-02-08</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>The Future of Longevity Medicine from the Lens of Digital Therapeutics</h2>
    <p class="authors">Akshay Ramakrishnan, Raju Rhee, Gunjan Lath, Riya Ramakrishnan</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - February 08, 2025</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>Digital therapeutics (DTx) are emerging as a pivotal tool in promoting longevity by addressing non-communicable diseases (NCDs) such as diabetes, cardiovascular diseases, and mental health disorders. These software-driven interventions offer personalized, evidence-based treatments that can be accessed via digital devices, making healthcare more accessible and scalable. One of the key advancements in DTx is the integration of artificial intelligence (AI) and machine learning (ML) to tailor interventions based on individual health data. This personalization enhances the effectiveness of treatments and supports preventive care by identifying risk factors early. The need for digital therapeutics is underscored by the rising prevalence of NCDs, which are responsible for a significant portion of global mortality and healthcare costs. Traditional healthcare systems often struggle to provide timely and personalized care, especially in low-resource settings. DTx can bridge this gap by offering cost-effective solutions that are easily scalable. Moreover, digital therapeutics can address health inequities by providing low-cost interventions to underserved populations, thereby reducing the burden of NCDs and improving overall health outcomes. As technology continues to evolve, the potential for DTx to enhance longevity and quality of life becomes increasingly promising. Recent advancements in longevity medicine and technology have focused on extending both lifespan and healthspan, ensuring that people not only live longer but also maintain good health throughout their extended years. This review article highlights the advancements contributing to the compelling subject of longevity.</p>
    </div>
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/1265/780">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/tech-transformations:-modern-solutions-for-obstructive-sleep-apnea-1248">
        <title>Tech Transformations: Modern Solutions for Obstructive Sleep Apnea</title>
        <link>https://www.scipublications.com/journal/jaibd/article/tech-transformations:-modern-solutions-for-obstructive-sleep-apnea-1248</link>
        <description>Recent advancements in the screening, diagnosis, and management of obstructive sleep apnea (OSA) have significantly improved patient outcomes. For screening, the use of home sleep apnea testing (HSAT) has become more prevalent, offering a convenient and cost-effective alternative to traditional in-lab polysomnography. HSAT devices have shown good specificity and sensitivity, particularly in patients with a high pre-test probability of OSA. In terms of diagnosis, advancements in wearable technolo...</description>
        <dc:creator>Akshay Ramakrishnan, Raju Rhee, Gunjan Lath, Riya Ramakrishnan</dc:creator>
        <dc:date>2025-01-21</dc:date>
        <dc:type>Review Article</dc:type>
        <dc:identifier>10.31586/jaibd.2025.1248</dc:identifier>
        <pubDate>Tue, 21 Jan 2025 16:00:00 GMT</pubDate>
        <dc:subject>Obstructive Sleep Apnea</dc:subject>
        <dc:subject>OSA</dc:subject>
        <dc:subject>Advancements in OSA</dc:subject>
        <dc:subject>Digital Health in OSA</dc:subject>
        <dc:subject>Wearable Technology in OSA</dc:subject>
        <dc:subject>Machine Learning in OSA</dc:subject>
        <dc:subject>Home Sleep Apnea Testing (HSAT)</dc:subject>
        <dc:subject>Continuous Positive Airway Pressure (CPAP)</dc:subject>
        <dc:subject>Mandibular Advancement Devices</dc:subject>
        <dc:subject>Hypoglossal Nerve Stimulation</dc:subject>
        <prism:volume>5</prism:volume>
        <prism:issue>1</prism:issue>
        <prism:startingPage>14</prism:startingPage>
        <prism:endingPage>28</prism:endingPage>
        <prism:doi>10.31586/jaibd.2025.1248</prism:doi>
        <dcterms:abstract>Recent advancements in the screening, diagnosis, and management of obstructive sleep apnea (OSA) have significantly improved patient outcomes. For screening, the use of home sleep apnea testing (HSAT) has become more prevalent, offering a convenient and cost-effective alternative to traditional in-lab polysomnography. HSAT devices have shown good specificity and sensitivity, particularly in patients with a high pre-test probability of OSA. In terms of diagnosis, advancements in wearable technology and mobile health applications have enabled continuous monitoring of sleep patterns and respiratory parameters. These tools provide valuable data that can be used to identify OSA more accurately and promptly. Additionally, machine learning algorithms are being integrated into diagnostic processes to enhance the accuracy of OSA detection by analyzing large datasets and identifying patterns indicative of the condition. Management of OSA has also seen significant progress. Continuous positive airway pressure (CPAP) therapy remains the gold standard, but new developments include auto-adjusting CPAP devices that optimize pressure settings based on real-time feedback. Mandibular advancement devices and hypoglossal nerve stimulation are emerging as effective alternatives for patients who are CPAP-intolerant. Furthermore, lifestyle interventions such as weight management, positional therapy, and exercise have been shown to complement medical treatments, leading to better overall outcomes. This review article highlights these advancements that collectively contribute to improved patient adherence, reduced symptoms, and enhanced quality of life for individuals with OSA.</dcterms:abstract>
        <dcterms:issued>2025-01-21</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>Tech Transformations: Modern Solutions for Obstructive Sleep Apnea</h2>
    <p class="authors">Akshay Ramakrishnan, Raju Rhee, Gunjan Lath, Riya Ramakrishnan</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - January 21, 2025</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>Recent advancements in the screening, diagnosis, and management of obstructive sleep apnea (OSA) have significantly improved patient outcomes. For screening, the use of home sleep apnea testing (HSAT) has become more prevalent, offering a convenient and cost-effective alternative to traditional in-lab polysomnography. HSAT devices have shown good specificity and sensitivity, particularly in patients with a high pre-test probability of OSA. In terms of diagnosis, advancements in wearable technology and mobile health applications have enabled continuous monitoring of sleep patterns and respiratory parameters. These tools provide valuable data that can be used to identify OSA more accurately and promptly. Additionally, machine learning algorithms are being integrated into diagnostic processes to enhance the accuracy of OSA detection by analyzing large datasets and identifying patterns indicative of the condition. Management of OSA has also seen significant progress. Continuous positive airway pressure (CPAP) therapy remains the gold standard, but new developments include auto-adjusting CPAP devices that optimize pressure settings based on real-time feedback. Mandibular advancement devices and hypoglossal nerve stimulation are emerging as effective alternatives for patients who are CPAP-intolerant. Furthermore, lifestyle interventions such as weight management, positional therapy, and exercise have been shown to complement medical treatments, leading to better overall outcomes. This review article highlights these advancements that collectively contribute to improved patient adherence, reduced symptoms, and enhanced quality of life for individuals with OSA.</p>
    </div>
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/1248/758">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/artificial-immune-systems:-a-bio-inspired-paradigm-for-computational-intelligence-1233">
        <title>Artificial Immune Systems: A Bio-Inspired Paradigm for Computational Intelligence</title>
        <link>https://www.scipublications.com/journal/jaibd/article/artificial-immune-systems:-a-bio-inspired-paradigm-for-computational-intelligence-1233</link>
        <description>Artificial Immune Systems (AIS) are bio-inspired computational frameworks that emulate the adaptive mechanisms of the human immune system, such as self/non-self discrimination, clonal selection, and immune memory. These systems have demonstrated significant potential in addressing complex challenges across optimization, anomaly detection, and adaptive system control. This paper provides a comprehensive exploration of AIS applications in domains such as cybersecurity, resource allocation, and aut...</description>
        <dc:creator>Praveen Kumar Myakala, Chiranjeevi Bura, Anil Kumar Jonnalagadda</dc:creator>
        <dc:date>2025-01-09</dc:date>
        <dc:type>Article</dc:type>
        <dc:identifier>10.31586/jaibd.2025.1233</dc:identifier>
        <pubDate>Thu, 09 Jan 2025 16:00:00 GMT</pubDate>
        <dc:subject>Artificial Immune Systems (AIS); Bio-inspired Computation; Computational Intelligence; Clonal Selection Algorithm; Anomaly Detection</dc:subject>
        <prism:volume>5</prism:volume>
        <prism:issue>1</prism:issue>
        <prism:startingPage>1</prism:startingPage>
        <prism:endingPage>13</prism:endingPage>
        <prism:doi>10.31586/jaibd.2025.1233</prism:doi>
        <dcterms:abstract>Artificial Immune Systems (AIS) are bio-inspired computational frameworks that emulate the adaptive mechanisms of the human immune system, such as self/non-self discrimination, clonal selection, and immune memory. These systems have demonstrated significant potential in addressing complex challenges across optimization, anomaly detection, and adaptive system control. This paper provides a comprehensive exploration of AIS applications in domains such as cybersecurity, resource allocation, and autonomous systems, highlighting the growing importance of hybrid AIS models. Recent advancements, including integrations with machine learning, quantum computing, and bioinformatics, are discussed as solutions to scalability, high-dimensional data processing, and efficiency challenges. Core algorithms, such as the Negative Selection Algorithm (NSA) and Clonal Selection Algorithm (CSA), are examined, along with limitations in interpretability and compatibility with emerging AI paradigms. The paper concludes by proposing future research directions, emphasizing scalable hybrid frameworks, quantum-inspired approaches, and real-time adaptive systems, underscoring AIS&apos;s transformative potential across diverse computational fields.</dcterms:abstract>
        <dcterms:issued>2025-01-09</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>Artificial Immune Systems: A Bio-Inspired Paradigm for Computational Intelligence</h2>
    <p class="authors">Praveen Kumar Myakala, Chiranjeevi Bura, Anil Kumar Jonnalagadda</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - January 09, 2025</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>Artificial Immune Systems (AIS) are bio-inspired computational frameworks that emulate the adaptive mechanisms of the human immune system, such as self/non-self discrimination, clonal selection, and immune memory. These systems have demonstrated significant potential in addressing complex challenges across optimization, anomaly detection, and adaptive system control. This paper provides a comprehensive exploration of AIS applications in domains such as cybersecurity, resource allocation, and autonomous systems, highlighting the growing importance of hybrid AIS models. Recent advancements, including integrations with machine learning, quantum computing, and bioinformatics, are discussed as solutions to scalability, high-dimensional data processing, and efficiency challenges. Core algorithms, such as the Negative Selection Algorithm (NSA) and Clonal Selection Algorithm (CSA), are examined, along with limitations in interpretability and compatibility with emerging AI paradigms. The paper concludes by proposing future research directions, emphasizing scalable hybrid frameworks, quantum-inspired approaches, and real-time adaptive systems, underscoring AIS&apos;s transformative potential across diverse computational fields.</p>
    </div>
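    <div class="code-sketch">
        <h3>Code Sketch</h3>
        <p>A minimal sketch of the Negative Selection Algorithm's core loop follows: random detectors are censored against "self" samples, and anything a surviving detector matches at run time is flagged as non-self. The matching radius, dimensionality, and detector count are illustrative assumptions.</p>
        <pre><code># Minimal sketch of the Negative Selection Algorithm (NSA): random
# detectors that match no "self" sample are kept; anything a detector
# later matches is flagged as anomalous (non-self). Radius, dimension,
# and detector count are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
RADIUS = 0.1                                       # matching threshold
self_set = rng.uniform(0.4, 0.6, size=(200, 2))    # assumed "normal" region

def matches(detector, points, radius=RADIUS):
    # True if any point lies within the detector's radius.
    return bool((radius > np.linalg.norm(points - detector, axis=1)).any())

# Censoring phase: keep only detectors that match no self sample.
detectors = []
while 100 > len(detectors):        # grow the repertoire to 100 detectors
    candidate = rng.uniform(0.0, 1.0, size=2)
    if not matches(candidate, self_set):
        detectors.append(candidate)

def is_anomalous(sample):
    # Monitoring phase: non-self if any detector covers the sample.
    return any(matches(d, sample.reshape(1, 2)) for d in detectors)

print(is_anomalous(np.array([0.5, 0.5])))    # inside self region: likely False
print(is_anomalous(np.array([0.05, 0.9])))   # far from self: likely True</code></pre>
    </div>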
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/1233/748">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/ai-for-time-series-and-anomaly-detection-1399">
        <title>AI for Time Series and Anomaly Detection</title>
        <link>https://www.scipublications.com/journal/jaibd/article/ai-for-time-series-and-anomaly-detection-1399</link>
        <description>Time series data are increasingly prevalent across domains such as finance, healthcare, manufacturing, and IoT, making accurate forecasting and anomaly detection critical for decision-making and system reliability. Traditional statistical methods (e.g., ARIMA, Holt-Winters) often fail to capture complex temporal dependencies and high-dimensional interactions inherent in modern time series. Recent advances in artificial intelligence, particularly deep learning architectures such as recurrent neura...</description>
        <dc:creator>Ravi Teja Avireneni, Sri Harsha Koneru, Naresh Kiran Kumar Reddy Yelkoti, Sivaprasad Yerneni Khaga</dc:creator>
        <dc:date>2024-12-19</dc:date>
        <dc:type>Article</dc:type>
        <dc:identifier>10.31586/jaibd.2024.1399</dc:identifier>
        <pubDate>Thu, 19 Dec 2024 16:00:00 GMT</pubDate>
        <dc:subject>Time Series Forecasting; Anomaly Detection; Deep Learning; Multivariate Time Series; Artificial Intelligence; Real-Time Systems</dc:subject>
        <prism:volume>4</prism:volume>
        <prism:issue>2</prism:issue>
        <prism:startingPage>120</prism:startingPage>
        <prism:endingPage>131</prism:endingPage>
        <prism:doi>10.31586/jaibd.2024.1399</prism:doi>
        <dcterms:abstract>Time series data are increasingly prevalent across domains such as finance, healthcare, manufacturing, and IoT, making accurate forecasting and anomaly detection critical for decision-making and system reliability. Traditional statistical methods (e.g., ARIMA, Holt-Winters) often fail to capture complex temporal dependencies and high-dimensional interactions inherent in modern time series. Recent advances in artificial intelligence, particularly deep learning architectures such as recurrent neural networks (RNNs), convolutional neural networks (CNNs), temporal convolutional networks (TCNs), graph neural networks (GNNs), and Transformers, have demonstrated marked improvements in modeling both univariate and multivariate series, as well as in detecting anomalies that deviate from learned norms (Darban, Webb, Pan, Aggarwal, &amp; Salehi, 2022; Chiranjeevi, Ramya, Balaji, Shashank, &amp; Reddy, 2024) [1,2]. Moreover, ensemble techniques and hybrid signal-processing + deep-learning pipelines show enhanced sensitivity and adaptability in real-world anomaly detection scenarios (Iqbal, Amin, Alsubaei, &amp; Alzahrani, 2024) [3]. In this work, we provide a unified survey and comparative analysis of AI-driven time series forecasting and anomaly detection methods, highlight key industrial application domains, evaluate performance trade-offs (e.g., accuracy vs. latency, supervised vs. unsupervised learning), and discuss emerging challenges including interpretability, data drift, real-time deployment on edge devices, and integration of causal reasoning. Our findings suggest that while AI approaches significantly outperform classical techniques in many settings, careful consideration of data characteristics, evaluation metrics, and deployment environment remains essential for effective adoption.</dcterms:abstract>
        <dcterms:issued>2024-12-19</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>AI for Time Series and Anomaly Detection</h2>
    <p class="authors">Ravi Teja Avireneni, Sri Harsha Koneru, Naresh Kiran Kumar Reddy Yelkoti, Sivaprasad Yerneni Khaga</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - December 19, 2024</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>Time series data are increasingly prevalent across domains such as finance, healthcare, manufacturing, and IoT, making accurate forecasting and anomaly detection critical for decision-making and system reliability. Traditional statistical methods (e.g., ARIMA, Holt-Winters) often fail to capture complex temporal dependencies and high-dimensional interactions inherent in modern time series. Recent advances in artificial intelligence, particularly deep learning architectures such as recurrent neural networks (RNNs), convolutional neural networks (CNNs), temporal convolutional networks (TCNs), graph neural networks (GNNs), and Transformers, have demonstrated marked improvements in modeling both univariate and multivariate series, as well as in detecting anomalies that deviate from learned norms (Darban, Webb, Pan, Aggarwal, &amp; Salehi, 2022; Chiranjeevi, Ramya, Balaji, Shashank, &amp; Reddy, 2024) [1,2]. Moreover, ensemble techniques and hybrid signal-processing + deep-learning pipelines show enhanced sensitivity and adaptability in real-world anomaly detection scenarios (Iqbal, Amin, Alsubaei, &amp; Alzahrani, 2024) [3]. In this work, we provide a unified survey and comparative analysis of AI-driven time series forecasting and anomaly detection methods, highlight key industrial application domains, evaluate performance trade-offs (e.g., accuracy vs. latency, supervised vs. unsupervised learning), and discuss emerging challenges including interpretability, data drift, real-time deployment on edge devices, and integration of causal reasoning. Our findings suggest that while AI approaches significantly outperform classical techniques in many settings, careful consideration of data characteristics, evaluation metrics, and deployment environment remains essential for effective adoption.</p>
    </div>
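    <div class="code-sketch">
        <h3>Code Sketch</h3>
        <p>One representative family from the survey, reconstruction-based anomaly detection with an LSTM autoencoder, can be sketched minimally as below. The window size, layer widths, and three-sigma threshold are illustrative assumptions rather than settings evaluated in the paper.</p>
        <pre><code># Minimal sketch: reconstruction-based anomaly detection with an LSTM
# autoencoder, one family the survey covers. Window size, layer sizes,
# and the threshold rule are illustrative assumptions.
import numpy as np
from tensorflow import keras

WINDOW = 30
series = np.sin(np.linspace(0, 60, 2000)) + 0.05 * np.random.randn(2000)
X = np.stack([series[i:i + WINDOW] for i in range(len(series) - WINDOW)])
X = X[..., None]  # shape: (samples, WINDOW, 1)

model = keras.Sequential([
    keras.Input(shape=(WINDOW, 1)),
    keras.layers.LSTM(16),                          # encoder
    keras.layers.RepeatVector(WINDOW),
    keras.layers.LSTM(16, return_sequences=True),   # decoder
    keras.layers.TimeDistributed(keras.layers.Dense(1)),
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, X, epochs=3, batch_size=64, verbose=0)

# Windows whose reconstruction error is far above the norm are anomalies.
errors = np.mean((model.predict(X, verbose=0) - X) ** 2, axis=(1, 2))
threshold = errors.mean() + 3 * errors.std()
anomalous = errors > threshold</code></pre>
    </div>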
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/1399/933">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/intelligent-detection-of-injection-attacks-via-sql-based-on-supervised-machine-learning-models-for-enhancing-web-security-1333">
        <title>Intelligent Detection of Injection Attacks via SQL Based on Supervised Machine Learning Models for Enhancing Web Security</title>
        <link>https://www.scipublications.com/journal/jaibd/article/intelligent-detection-of-injection-attacks-via-sql-based-on-supervised-machine-learning-models-for-enhancing-web-security-1333</link>
        <description>SQL Injection Attacks are the most prevalent technique behind security data breaches. Organizations and individuals suffer sensitive-information exposure and unauthorized access when attackers exploit the severe risks posed by SQL injection (SQLi) vulnerabilities. Static and heuristic defense methods remain the conventional tools for detecting SQL injection attacks. This study's foundation is a detection system developed using the Gated Recurrent Unit (GRU) network, which attem...</description>
        <dc:creator>Rahul Vadisetty, Purna Chandra Rao Chinta, Chethan Sriharsha Moore, Laxmana Murthy Karaka, Manikanth Sakuru, Varun Bodepudi, Srinivasa Rao Maka, Srikanth Reddy Vangala</dc:creator>
        <dc:date>2024-12-18</dc:date>
        <dc:type>Article</dc:type>
        <dc:identifier>10.31586/jaibd.2024.1333</dc:identifier>
        <pubDate>Wed, 18 Dec 2024 16:00:00 GMT</pubDate>
        <dc:subject>Web Application Security</dc:subject>
        <dc:subject>Cyberattacks</dc:subject>
        <dc:subject>SQL Injection Attacks (SQLIA)</dc:subject>
        <dc:subject>Machine Learning (ML)</dc:subject>
        <dc:subject>SQL Injection Dataset</dc:subject>
        <prism:volume>4</prism:volume>
        <prism:issue>2</prism:issue>
        <prism:startingPage>109</prism:startingPage>
        <prism:endingPage>119</prism:endingPage>
        <prism:doi>10.31586/jaibd.2024.1333</prism:doi>
        <dcterms:abstract>SQL Injection Attacks are the most prevalent technique behind security data breaches. Organizations and individuals suffer sensitive-information exposure and unauthorized access when attackers exploit the severe risks posed by SQL injection (SQLi) vulnerabilities. Static and heuristic defense methods remain the conventional tools for detecting SQL injection attacks. This study's foundation is a detection system developed using the Gated Recurrent Unit (GRU) network, which attempts to efficiently identify SQL Injection attacks (SQLIAs). The suggested Gated Recurrent Unit model was trained using an 80:20 train-test split, and the results showed that SQL injection attacks could be accurately identified with a precision of 97%, an accuracy of 96.65%, a recall of 92.5%, and an F1-score of 94%. The experimental results, together with the corresponding confusion matrix analysis and learning curves, demonstrate resilience and outstanding generalization ability. The GRU model outperforms conventional machine learning (ML) models, including K-Nearest Neighbors (KNN) and Support Vector Machine (SVM), in identifying sequential patterns in SQL query data. The recurrent neural architecture proves effective in detecting SQLi attacks and provides secure protection for contemporary web applications.</dcterms:abstract>
        <dcterms:issued>2024-12-18</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>Intelligent Detection of Injection Attacks via SQL Based on Supervised Machine Learning Models for Enhancing Web Security</h2>
    <p class="authors">Rahul Vadisetty, Purna Chandra Rao Chinta, Chethan Sriharsha Moore, Laxmana Murthy Karaka, Manikanth Sakuru, Varun Bodepudi, Srinivasa Rao Maka, Srikanth Reddy Vangala</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - December 18, 2024</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>SQL Injection Attacks are the most prevalent technique behind security data breaches. Organizations and individuals suffer sensitive-information exposure and unauthorized access when attackers exploit the severe risks posed by SQL injection (SQLi) vulnerabilities. Static and heuristic defense methods remain the conventional tools for detecting SQL injection attacks. This study's foundation is a detection system developed using the Gated Recurrent Unit (GRU) network, which attempts to efficiently identify SQL Injection attacks (SQLIAs). The suggested Gated Recurrent Unit model was trained using an 80:20 train-test split, and the results showed that SQL injection attacks could be accurately identified with a precision of 97%, an accuracy of 96.65%, a recall of 92.5%, and an F1-score of 94%. The experimental results, together with the corresponding confusion matrix analysis and learning curves, demonstrate resilience and outstanding generalization ability. The GRU model outperforms conventional machine learning (ML) models, including K-Nearest Neighbors (KNN) and Support Vector Machine (SVM), in identifying sequential patterns in SQL query data. The recurrent neural architecture proves effective in detecting SQLi attacks and provides secure protection for contemporary web applications.</p>
    </div>
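    <div class="code-sketch">
        <h3>Code Sketch</h3>
        <p>The abstract's GRU-based detector can be sketched minimally as below, assuming character-level encoding of SQL queries. The vocabulary size, sequence length, and layer widths are illustrative assumptions, not the paper's reported configuration.</p>
        <pre><code># Minimal sketch: GRU-based SQL injection classifier over
# character-level query encodings. Vocabulary size, sequence length,
# and layer widths are assumptions, not the paper's configuration.
import numpy as np
from tensorflow import keras

MAX_LEN = 200    # assumed maximum query length in characters
VOCAB = 128      # ASCII code points as the character vocabulary

def encode(query):
    codes = [min(ord(c), VOCAB - 1) for c in query[:MAX_LEN]]
    return codes + [0] * (MAX_LEN - len(codes))    # zero-pad

queries = ["SELECT name FROM users WHERE id = 1",
           "' OR '1'='1' --"]
labels = np.array([0, 1])    # 0 = benign, 1 = injection
X = np.array([encode(q) for q in queries])

model = keras.Sequential([
    keras.Input(shape=(MAX_LEN,)),
    keras.layers.Embedding(VOCAB, 32),
    keras.layers.GRU(64),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.fit(X, labels, epochs=3, verbose=0)  # toy fit; the paper used an 80:20 split</code></pre>
    </div>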
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/1333/851">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/an-analysis-of-performance-and-comparison-of-models-for-cardiovascular-disease-prediction-via-machine-learning-models-in-healthcare-1332">
        <title>An Analysis of Performance and Comparison of Models for Cardiovascular Disease Prediction via Machine Learning Models in Healthcare</title>
        <link>https://www.scipublications.com/journal/jaibd/article/an-analysis-of-performance-and-comparison-of-models-for-cardiovascular-disease-prediction-via-machine-learning-models-in-healthcare-1332</link>
        <description>Over the past few decades, cardiovascular disease and its related complications have surpassed all other conditions as the leading causes of death globally. They are currently the leading cause of mortality worldwide, including in India. Detecting cardiovascular problems early is essential so that patients receive better care and costs go down. This project utilizes the UCI Heart Disease Dataset to develop ML and DL models capable of detecting cardiac diseases. Heart illness was...</description>
        <dc:creator>Anand Polamarasetti, Krishna Madhav Jha, Vasu Velaga, Kishan Kumar Routhu, Gangadhar Sadaram, Suneel Babu Boppana, Srikanth Reddy Vangala</dc:creator>
        <dc:date>2024-12-16</dc:date>
        <dc:type>Article</dc:type>
        <dc:identifier>10.31586/jaibd.2024.1332</dc:identifier>
        <pubDate>Mon, 16 Dec 2024 16:00:00 GMT</pubDate>
        <dc:subject>Cardiovascular Disease</dc:subject>
        <dc:subject>Heart Disease Classification</dc:subject>
        <dc:subject>Convolutional Neural Network (CNN)</dc:subject>
        <dc:subject>UCI Heart Disease Dataset</dc:subject>
        <dc:subject>Sigmoid Function</dc:subject>
        <dc:subject>Machine Learning (ML)</dc:subject>
        <dc:subject>Deep Learning (DL)</dc:subject>
        <prism:volume>4</prism:volume>
        <prism:issue>2</prism:issue>
        <prism:startingPage>96</prism:startingPage>
        <prism:endingPage>108</prism:endingPage>
        <prism:doi>10.31586/jaibd.2024.1332</prism:doi>
        <dcterms:abstract>Over the past few decades, cardiovascular disease and its related complications have surpassed all other conditions as the leading causes of death globally. They are currently the leading cause of mortality worldwide, including in India. Detecting cardiovascular problems early is essential so that patients receive better care and costs go down. This project utilizes the UCI Heart Disease Dataset to develop ML and DL models capable of detecting cardiac diseases. Heart illness was categorized using Convolutional Neural Network (CNN) models, which are able to detect intricate patterns in the supplied data. A confusion matrix, F1-score, ROC curve, accuracy, precision, and recall were among the measures used to evaluate the model. It did much better than the Neural Network, Deep Neural Network (DNN), and Gradient Boosted Trees (GBT) models, with 91.71% accuracy, 88.88% precision, 82.75% recall, and an 85.70% F1-score. The comparative study showed that the CNN was the most accurate model, while the other models struck different balances between accuracy and recall. The experimental results show that the proposed CNN model is an effective way to identify cardiovascular disease, meaning it could be used in healthcare systems to detect disease earlier and treat patients better.</dcterms:abstract>
        <dcterms:issued>2024-12-16</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>An Analysis of Performance and Comparison of Models for Cardiovascular Disease Prediction via Machine Learning Models in Healthcare</h2>
    <p class="authors">Anand Polamarasetti, Krishna Madhav Jha, Vasu Velaga, Kishan Kumar Routhu, Gangadhar Sadaram, Suneel Babu Boppana, Srikanth Reddy Vangala</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - December 16, 2024</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>Over the past few decades, cardiovascular disease and its related complications have surpassed all other conditions as the leading causes of death globally. They are currently the leading cause of mortality worldwide, including in India. Detecting cardiovascular problems early is essential so that patients receive better care and costs go down. This project utilizes the UCI Heart Disease Dataset to develop ML and DL models capable of detecting cardiac diseases. Heart illness was categorized using Convolutional Neural Network (CNN) models, which are able to detect intricate patterns in the supplied data. A confusion matrix, F1-score, ROC curve, accuracy, precision, and recall were among the measures used to evaluate the model. It did much better than the Neural Network, Deep Neural Network (DNN), and Gradient Boosted Trees (GBT) models, with 91.71% accuracy, 88.88% precision, 82.75% recall, and an 85.70% F1-score. The comparative study showed that the CNN was the most accurate model, while the other models struck different balances between accuracy and recall. The experimental results show that the proposed CNN model is an effective way to identify cardiovascular disease, meaning it could be used in healthcare systems to detect disease earlier and treat patients better.</p>
    </div>
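    <div class="code-sketch">
        <h3>Code Sketch</h3>
        <p>A minimal sketch of a 1-D CNN over the UCI Heart Disease features follows. Treating the 13 tabular attributes as a short sequence is one common construction; the filter counts and kernel sizes are illustrative assumptions rather than the paper's architecture.</p>
        <pre><code># Minimal sketch: 1-D CNN over the UCI Heart Disease features,
# mirroring the model family the abstract compares. Filter counts and
# kernel sizes are illustrative assumptions.
import numpy as np
from tensorflow import keras

N_FEATURES = 13   # the standard UCI heart-disease attribute count

model = keras.Sequential([
    keras.Input(shape=(N_FEATURES, 1)),
    keras.layers.Conv1D(32, kernel_size=3, activation="relu"),
    keras.layers.MaxPooling1D(2),
    keras.layers.Conv1D(64, kernel_size=3, activation="relu"),
    keras.layers.GlobalAveragePooling1D(),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),   # disease yes/no
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy", keras.metrics.Precision(),
                       keras.metrics.Recall()])

# Random stand-in data; in practice load and scale the UCI dataset.
X = np.random.rand(303, N_FEATURES, 1).astype("float32")
y = np.random.randint(0, 2, size=(303,))
model.fit(X, y, epochs=5, batch_size=32, verbose=0)</code></pre>
    </div>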
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/1332/850">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/disaster-recovery-and-application-security-in-microservices:-exploring-kubernetes-application-gateways-and-cloud-solutions-for-high-availability-1209">
        <title>Disaster Recovery and Application Security in Microservices: Exploring Kubernetes, Application Gateways, and Cloud Solutions for High Availability</title>
        <link>https://www.scipublications.com/journal/jaibd/article/disaster-recovery-and-application-security-in-microservices:-exploring-kubernetes-application-gateways-and-cloud-solutions-for-high-availability-1209</link>
        <description>Unfortunately, it is not disaster recovery, high availability, or cloud technologies that are inherently difficult to understand, but rather the act of implementing them for software applications that is difficult. The unique method of implementation for a microservices architecture is explored. Regulatory compliance doesn’t stop just because an effective disaster recovery requirement is tough to satisfy for infrastructure unique to sleek microservices. The high-availability location transpar...</description>
        <dc:creator>Manogna Dolu Surabhi</dc:creator>
        <dc:date>2024-12-16</dc:date>
        <dc:type>Review Article</dc:type>
        <dc:identifier>10.31586/jaibd.2024.1209</dc:identifier>
        <pubDate>Mon, 16 Dec 2024 16:00:00 GMT</pubDate>
        <dc:subject>Microservices Architecture</dc:subject>
        <dc:subject>Disaster Recovery</dc:subject>
        <dc:subject>High Availability</dc:subject>
        <dc:subject>Regulatory Compliance</dc:subject>
        <dc:subject>RESTful Microservices</dc:subject>
        <dc:subject>Cloud Technologies</dc:subject>
        <dc:subject>Security Engineering</dc:subject>
        <dc:subject>Service Bus Relays</dc:subject>
        <dc:subject>Application Resilience</dc:subject>
        <dc:subject>Delivery Latency</dc:subject>
        <prism:volume>4</prism:volume>
        <prism:issue>2</prism:issue>
        <prism:startingPage>82</prism:startingPage>
        <prism:endingPage>95</prism:endingPage>
        <prism:doi>10.31586/jaibd.2024.1209</prism:doi>
        <dcterms:abstract>Unfortunately, it is not disaster recovery, high availability, or cloud technologies that are inherently difficult to understand, but rather the act of implementing them for software applications that is difficult. The unique method of implementation for a microservices architecture is explored. Regulatory compliance doesn’t stop just because an effective disaster recovery requirement is tough to satisfy for infrastructure unique to sleek microservices. The high-availability location transparency bliss offered by a cloud solution is appealing to a security engineering department. However, the headache starts when the technology presents a handful of undesirable surprises that leak RESTful microservices to the outside world. These are the challenges that post-SOA, cloud-resident, robustly scalable applications will need to address and overcome. The goal is to explore several popular methods of accomplishing these tough objectives so that engineers can further research the most practical solution. An innovative implementation that leverages Service Bus relays as an elegant disaster recovery solution while enforcing a strict subnet where RESTful microservices solely live will be discussed. The interest lies in the atypical experimentation beyond basic gateways and in how such simplicity still answers day-to-day software development infrastructure challenges for the applications we build. Resilient handling of full-service web proxy crashes and delivery-latency switchover driven by microservices pod health will also be discussed [1].</dcterms:abstract>
        <dcterms:issued>2024-12-16</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>Disaster Recovery and Application Security in Microservices: Exploring Kubernetes, Application Gateways, and Cloud Solutions for High Availability</h2>
    <p class="authors">Manogna Dolu Surabhi</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - December 16, 2024</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>Unfortunately, it is not disaster recovery, high availability, or cloud technologies that are inherently difficult to understand, but rather the act of implementing them for software applications that is difficult. The unique method of implementation for a microservices architecture is explored. Regulatory compliance doesn’t stop just because an effective disaster recovery requirement is tough to satisfy for infrastructure unique to sleek microservices. The high-availability location transparency bliss offered by a cloud solution is appealing to a security engineering department. However, the headache starts when the technology presents a handful of undesirable surprises that leak RESTful microservices to the outside world. These are the challenges that post-SOA, cloud-resident, robustly scalable applications will need to address and overcome. The goal is to explore several popular methods of accomplishing these tough objectives so that engineers can further research the most practical solution. An innovative implementation that leverages Service Bus relays as an elegant disaster recovery solution while enforcing a strict subnet where RESTful microservices solely live will be discussed. The interest lies in the atypical experimentation beyond basic gateways and in how such simplicity still answers day-to-day software development infrastructure challenges for the applications we build. Resilient handling of full-service web proxy crashes and delivery-latency switchover driven by microservices pod health will also be discussed [1].</p>
    </div>
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/1209/731">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/digital-therapeutics:-a-new-dimension-to-diabetes-mellitus-management-1090">
        <title>Digital Therapeutics: A New Dimension to Diabetes Mellitus Management</title>
        <link>https://www.scipublications.com/journal/jaibd/article/digital-therapeutics:-a-new-dimension-to-diabetes-mellitus-management-1090</link>
        <description>Digital therapeutics (DTx) play a transformative role in diabetes management by leveraging technology to provide personalized, data-driven medical interventions. These tools enhance self-management by offering continuous monitoring and real-time feedback on glucose levels, diet, and physical activity. This personalized approach helps patients adhere to treatment plans and make informed lifestyle changes, leading to improved clinical outcomes such as reduced HbA1c levels and better overall diabet...</description>
        <dc:creator>Raju Rhee, Rahul K Jaiswal, Gunjan Lath</dc:creator>
        <dc:date>2024-11-15</dc:date>
        <dc:type>Article</dc:type>
        <dc:identifier>10.31586/jaibd.2024.1090</dc:identifier>
        <pubDate>Fri, 15 Nov 2024 16:00:00 GMT</pubDate>
        <dc:subject>Digital Therapeutics</dc:subject>
        <dc:subject>Digital Health</dc:subject>
        <dc:subject>Software as a Medical Device (SaMD)</dc:subject>
        <dc:subject>Healthcare Digitization</dc:subject>
        <dc:subject>Technology in Healthcare</dc:subject>
        <prism:volume>4</prism:volume>
        <prism:issue>2</prism:issue>
        <prism:startingPage>74</prism:startingPage>
        <prism:endingPage>81</prism:endingPage>
        <prism:doi>10.31586/jaibd.2024.1090</prism:doi>
        <dcterms:abstract>Digital therapeutics (DTx) play a transformative role in diabetes management by leveraging technology to provide personalized, data-driven medical interventions. These tools enhance self-management by offering continuous monitoring and real-time feedback on glucose levels, diet, and physical activity. This personalized approach helps patients adhere to treatment plans and make informed lifestyle changes, leading to improved clinical outcomes such as reduced HbA1c levels and better overall diabetes control. The importance of DTx lies in their ability to make diabetes care more accessible and convenient. Mobile apps and telemedicine platforms enable patients to receive support and guidance from anywhere, reducing the need for frequent in-person visits. Additionally, DTx often include behavioral support features like reminders, educational content, and motivational tools, which are crucial for maintaining healthy habits and managing stress. Currently, the dynamics of DTx in diabetes are rapidly evolving, with increasing integration of artificial intelligence and machine learning to further personalize and optimize care. As the adoption of these technologies grows, they hold the potential to significantly improve patient outcomes and revolutionize diabetes management on a global scale. This article will focus on the benefits of novel digital therapeutics for prevention and management of type II diabetes that are currently available in the market.</dcterms:abstract>
        <dcterms:issued>2024-11-15</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>Digital Therapeutics: A New Dimension to Diabetes Mellitus Management</h2>
    <p class="authors">Raju Rhee, Rahul K Jaiswal, Gunjan Lath</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - November 15, 2024</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>Digital therapeutics (DTx) play a transformative role in diabetes management by leveraging technology to provide personalized, data-driven medical interventions. These tools enhance self-management by offering continuous monitoring and real-time feedback on glucose levels, diet, and physical activity. This personalized approach helps patients adhere to treatment plans and make informed lifestyle changes, leading to improved clinical outcomes such as reduced HbA1c levels and better overall diabetes control. The importance of DTx lies in their ability to make diabetes care more accessible and convenient. Mobile apps and telemedicine platforms enable patients to receive support and guidance from anywhere, reducing the need for frequent in-person visits. Additionally, DTx often include behavioral support features like reminders, educational content, and motivational tools, which are crucial for maintaining healthy habits and managing stress. Currently, the dynamics of DTx in diabetes are rapidly evolving, with increasing integration of artificial intelligence and machine learning to further personalize and optimize care. As the adoption of these technologies grows, they hold the potential to significantly improve patient outcomes and revolutionize diabetes management on a global scale. This article will focus on the benefits of novel digital therapeutics for prevention and management of type II diabetes that are currently available in the market.</p>
    </div>
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/1090/705">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/ai-powered-optimization-for-high-performance-computing-in-scientific-simulations-1695">
        <title>AI-Powered Optimization for High-Performance Computing in Scientific Simulations</title>
        <link>https://www.scipublications.com/journal/jaibd/article/ai-powered-optimization-for-high-performance-computing-in-scientific-simulations-1695</link>
        <description>High-Performance Computing (HPC) is indispensable for large-scale scientific simulations, but achieving optimal performance on modern supercomputers is increasingly challenging. As HPC systems scale toward exascale, they face escalating complexity in hardware, software, and workloads. Traditional optimization methods (manual tuning and heuristic algorithms) struggle to cope with dynamic workloads and intricate system behaviors. Artificial Intelligence (AI) techniques offer a promising approach t...</description>
        <dc:creator>Shubham Gupta</dc:creator>
        <dc:date>2024-07-29</dc:date>
        <dc:type>Review Article</dc:type>
        <dc:identifier>10.31586/jaibd.2024.1695</dc:identifier>
        <pubDate>Mon, 29 Jul 2024 16:00:00 GMT</pubDate>
        <dc:subject>High-Performance Computing (HPC)</dc:subject>
        <dc:subject>Artificial Intelligence</dc:subject>
        <dc:subject>Machine Learning</dc:subject>
        <dc:subject>Deep Reinforcement Learning</dc:subject>
        <dc:subject>Surrogate Modeling</dc:subject>
        <dc:subject>Bayesian Optimization</dc:subject>
        <dc:subject>Scientific Simulations</dc:subject>
        <dc:subject>Exascale Computing</dc:subject>
        <dc:subject>Job Scheduling</dc:subject>
        <dc:subject>Computational Fluid Dynamics</dc:subject>
        <dc:subject>Autotuning</dc:subject>
        <dc:subject>Neural Networks</dc:subject>
        <dc:subject>Resource Management</dc:subject>
        <dc:subject>Performance Optimization</dc:subject>
        <dc:subject>In-Situ Analytics</dc:subject>
        <prism:volume>4</prism:volume>
        <prism:issue>1</prism:issue>
        <prism:startingPage>1</prism:startingPage>
        <prism:endingPage>8</prism:endingPage>
        <prism:doi>10.31586/jaibd.2024.1695</prism:doi>
        <dcterms:abstract>High-Performance Computing (HPC) is indispensable for large-scale scientific simulations, but achieving optimal performance on modern supercomputers is increasingly challenging. As HPC systems scale toward exascale, they face escalating complexity in hardware, software, and workloads. Traditional optimization methods (manual tuning and heuristic algorithms) struggle to cope with dynamic workloads and intricate system behaviors. Artificial Intelligence (AI) techniques offer a promising approach to address these challenges. This article provides an overview of AI-powered optimization in HPC, focusing on how machine learning and related AI methods enhance performance, efficiency, and scalability of scientific simulations. We survey key AI techniques applied to HPC optimization, including machine learning for performance modeling, deep reinforcement learning for resource management, and AI-driven surrogate models for accelerating simulations, and we illustrate their impact through case studies in domains such as job scheduling and fluid dynamics. We discuss the practical applications of these techniques, highlighting reported performance gains (e.g., substantial reductions in simulation run time and improved resource utilization). We also examine the challenges in integrating AI with HPC (such as training overhead, data movement, and reliability concerns) and outline future directions for research. The convergence of AI and HPC is poised to produce “smart” simulation workflows that intelligently adapt and optimize in real time, pushing the frontiers of scientific computing.</dcterms:abstract>
        <dcterms:issued>2024-07-29</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>AI-Powered Optimization for High-Performance Computing in Scientific Simulations</h2>
    <p class="authors">Shubham Gupta</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - July 29, 2024</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>High-Performance Computing (HPC) is indispensable for large-scale scientific simulations, but achieving optimal performance on modern supercomputers is increasingly challenging. As HPC systems scale toward exascale, they face escalating complexity in hardware, software, and workloads. Traditional optimization methods (manual tuning and heuristic algorithms) struggle to cope with dynamic workloads and intricate system behaviors. Artificial Intelligence (AI) techniques offer a promising approach to address these challenges. This article provides an overview of AI-powered optimization in HPC, focusing on how machine learning and related AI methods enhance performance, efficiency, and scalability of scientific simulations. We survey key AI techniques applied to HPC optimization, including machine learning for performance modeling, deep reinforcement learning for resource management, and AI-driven surrogate models for accelerating simulations, and we illustrate their impact through case studies in domains such as job scheduling and fluid dynamics. We discuss the practical applications of these techniques, highlighting reported performance gains (e.g., substantial reductions in simulation run time and improved resource utilization). We also examine the challenges in integrating AI with HPC (such as training overhead, data movement, and reliability concerns) and outline future directions for research. The convergence of AI and HPC is poised to produce “smart” simulation workflows that intelligently adapt and optimize in real time, pushing the frontiers of scientific computing.</p>
    </div>
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/1695/970">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/nigeria-exchange-rate-volatility:-a-comparative-study-of-recurrent-neural-network-lstm-and-exponential-generalized-autoregressive-conditional-heteroskedasticity-models-983">
        <title>Nigeria Exchange Rate Volatility: A Comparative Study of Recurrent Neural Network LSTM and Exponential Generalized Autoregressive Conditional Heteroskedasticity Models</title>
        <link>https://www.scipublications.com/journal/jaibd/article/nigeria-exchange-rate-volatility:-a-comparative-study-of-recurrent-neural-network-lstm-and-exponential-generalized-autoregressive-conditional-heteroskedasticity-models-983</link>
        <description>Business merchants and investors in Nigeria are interested in the accuracy of foreign exchange volatility forecasting because they need information on how volatile the exchange rate will be in the future. In this paper, we compared an Exponential Generalized Autoregressive Conditional Heteroskedasticity model of order p = 1 and q = 1 (EGARCH(1,1)) with a Recurrent Neural Network (RNN) based on long short-term memory (LSTM) with p = 10 and q = 1 layers to model the volatili...</description>
        <dc:creator>Samuel Olorunfemi Adams, John Innocent Uchema</dc:creator>
        <dc:date>2024-06-27</dc:date>
        <dc:type>Article</dc:type>
        <dc:identifier>10.31586/jaibd.2024.983</dc:identifier>
        <pubDate>Thu, 27 Jun 2024 16:00:00 GMT</pubDate>
        <dc:subject>Exchange Rate Volatility</dc:subject>
        <dc:subject>EGARCH (1,1)</dc:subject>
        <dc:subject>LSTM</dc:subject>
        <dc:subject>MLP</dc:subject>
        <dc:subject>RNN</dc:subject>
        <prism:volume>4</prism:volume>
        <prism:issue>2</prism:issue>
        <prism:startingPage>61</prism:startingPage>
        <prism:endingPage>73</prism:endingPage>
        <prism:doi>10.31586/jaibd.2024.983</prism:doi>
        <dcterms:abstract>Business merchants and investors in Nigeria are interested in the accuracy of foreign exchange volatility forecasting because they need information on how volatile the exchange rate will be in the future. In this paper, we compared an Exponential Generalized Autoregressive Conditional Heteroskedasticity model of order p = 1 and q = 1 (EGARCH(1,1)) with a Recurrent Neural Network (RNN) based on long short-term memory (LSTM) with p = 10 and q = 1 layers to model the volatility of Nigerian exchange rates. Our goal is to determine the preferred model for predicting the volatility of Nigeria’s Naira exchange rate against the Euro, Pound Sterling, and US Dollar. The dataset of monthly exchange rates of the Nigerian Naira to the US Dollar, Euro, and Pound Sterling for the period December 2001 – August 2023 was extracted from the Central Bank of Nigeria Statistical Bulletin. Model efficiency and performance were measured with the Mean Squared Error (MSE) criterion. The results indicated that Nigerian exchange rate volatility is asymmetric, with leverage effects evident in the EGARCH(1,1) model. A steady increase in the Naira exchange rate against the Euro, Pound Sterling, and US Dollar was also observed from 2016 to its peak in 2023. The comparative analysis indicated that EGARCH(1,1) performed better than the LSTM model, providing smaller MSE values of 224.7, 231.3, and 138.5 for the Euro, Pound Sterling, and US Dollar, respectively.</dcterms:abstract>
        <dcterms:issued>2024-06-27</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>Nigeria Exchange Rate Volatility: A Comparative Study of Recurrent Neural Network LSTM and Exponential Generalized Autoregressive Conditional Heteroskedasticity Models</h2>
    <p class="authors">Samuel Olorunfemi Adams, John Innocent Uchema</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - June 27, 2024</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>Business merchants and investors in Nigeria are interested in the accuracy of foreign exchange volatility forecasting because they need information on how volatile the exchange rate will be in the future. In this paper, we compared an Exponential Generalized Autoregressive Conditional Heteroskedasticity model of order p = 1 and q = 1 (EGARCH(1,1)) with a Recurrent Neural Network (RNN) based on long short-term memory (LSTM) with p = 10 and q = 1 layers to model the volatility of Nigerian exchange rates. Our goal is to determine the preferred model for predicting the volatility of Nigeria’s Naira exchange rate against the Euro, Pound Sterling, and US Dollar. The dataset of monthly exchange rates of the Nigerian Naira to the US Dollar, Euro, and Pound Sterling for the period December 2001 – August 2023 was extracted from the Central Bank of Nigeria Statistical Bulletin. Model efficiency and performance were measured with the Mean Squared Error (MSE) criterion. The results indicated that Nigerian exchange rate volatility is asymmetric, with leverage effects evident in the EGARCH(1,1) model. A steady increase in the Naira exchange rate against the Euro, Pound Sterling, and US Dollar was also observed from 2016 to its peak in 2023. The comparative analysis indicated that EGARCH(1,1) performed better than the LSTM model, providing smaller MSE values of 224.7, 231.3, and 138.5 for the Euro, Pound Sterling, and US Dollar, respectively.</p>
    </div>
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/983/610">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/5v’s-of-big-data-shifted-to-suite-the-context-of-software-code:-big-code-for-big-software-projects-911">
        <title>5V’s of Big Data Shifted to Suite the Context of Software Code: Big Code for Big Software Projects</title>
        <link>https://www.scipublications.com/journal/jaibd/article/5v’s-of-big-data-shifted-to-suite-the-context-of-software-code:-big-code-for-big-software-projects-911</link>
        <description>Data is the collection of facts and observations about events; it is continuously growing, getting denser and more varied by the minute across different disciplines and fields. Hence, Big Data emerged and is evolving rapidly. The various types of data being processed are huge, but little thought has been given to where this data resides: it resides in software, and software codebases are growing accordingly in the size of their modules, functionalitie...</description>
        <dc:creator>Linda Susan Amos, Eng. Tirivangani Magadza</dc:creator>
        <dc:date>2024-04-10</dc:date>
        <dc:type>Article</dc:type>
        <dc:identifier>10.31586/jaibd.2024.911</dc:identifier>
        <pubDate>Wed, 10 Apr 2024 16:00:00 GMT</pubDate>
        <dc:subject>Big Code</dc:subject>
        <dc:subject>Big Software</dc:subject>
        <dc:subject>Big Data</dc:subject>
        <dc:subject>5V’s</dc:subject>
        <dc:subject>Code Optimization</dc:subject>
        <dc:subject>Scalability</dc:subject>
        <prism:volume>4</prism:volume>
        <prism:issue>1</prism:issue>
        <prism:startingPage>36</prism:startingPage>
        <prism:endingPage>47</prism:endingPage>
        <prism:doi>10.31586/jaibd.2024.911</prism:doi>
        <dcterms:abstract>Data is the collection of facts and observations about events; it is continuously growing, getting denser and more varied by the minute across different disciplines and fields. Hence, Big Data emerged and is evolving rapidly. The various types of data being processed are huge, but little thought has been given to where this data resides: it resides in software, and software codebases are growing accordingly in the size of their modules, functionalities, classes, and so on. Since data is growing so rapidly, software codebases are growing as well. Therefore, this paper discusses the 5V’s of big data in the context of software code and how to optimize and manage big code. When we talk of &quot;Big Code for Big Software,&quot; we are referring to the specific challenges and considerations involved in developing, managing, and maintaining code in large-scale software systems.</dcterms:abstract>
        <dcterms:issued>2024-04-10</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>5V’s of Big Data Shifted to Suite the Context of Software Code: Big Code for Big Software Projects</h2>
    <p class="authors">Linda Susan Amos, Eng. Tirivangani Magadza</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - April 10, 2024</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>Data is the collection of facts and observations about events; it is continuously growing, getting denser and more varied by the minute across different disciplines and fields. Hence, Big Data emerged and is evolving rapidly. The various types of data being processed are huge, but little thought has been given to where this data resides: it resides in software, and software codebases are growing accordingly in the size of their modules, functionalities, classes, and so on. Since data is growing so rapidly, software codebases are growing as well. Therefore, this paper discusses the 5V’s of big data in the context of software code and how to optimize and manage big code. When we talk of &quot;Big Code for Big Software,&quot; we are referring to the specific challenges and considerations involved in developing, managing, and maintaining code in large-scale software systems.</p>
    </div>
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/911/580">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/the-use-of-contemporary-enterprise-resource-planning-(erp)-technologies-for-digital-transformation-881">
        <title>The use of contemporary Enterprise Resource Planning (ERP) technologies for digital transformation</title>
        <link>https://www.scipublications.com/journal/jaibd/article/the-use-of-contemporary-enterprise-resource-planning-(erp)-technologies-for-digital-transformation-881</link>
        <description>Our lives are becoming more and more digital, and this has an impact on how we work, study, communicate, and interact. Businesses are currently digitally altering their information systems, procedures, culture, and strategy. Existing businesses and economies are severely disrupted by the digital revolution. The Internet of Things, microservices, and mobile services are examples of IT systems with numerous, dispersed, and very small structures that are made possible by digitization. Utilizing the...</description>
        <dc:creator>Hariprasad Mandava</dc:creator>
        <dc:date>2024-02-18</dc:date>
        <dc:type>Review Article</dc:type>
        <dc:identifier>10.31586/jaibd.2024.881</dc:identifier>
        <pubDate>Sun, 18 Feb 2024 16:00:00 GMT</pubDate>
        <dc:subject>Digital Technologies</dc:subject>
        <dc:subject>Enterprise Business Application</dc:subject>
        <dc:subject>Enterprise Resource Planning</dc:subject>
        <dc:subject>Internet of Things</dc:subject>
        <dc:subject>Big Data</dc:subject>
        <dc:subject>S/4 Hana</dc:subject>
        <prism:volume>4</prism:volume>
        <prism:issue>1</prism:issue>
        <prism:startingPage>31</prism:startingPage>
        <prism:endingPage>35</prism:endingPage>
        <prism:doi>10.31586/jaibd.2024.881</prism:doi>
        <dcterms:abstract>Our lives are becoming more and more digital, and this has an impact on how we work, study, communicate, and interact. Businesses are currently digitally altering their information systems, procedures, culture, and strategy. Existing businesses and economies are severely disrupted by the digital revolution. The Internet of Things, microservices, and mobile services are examples of IT systems with numerous, dispersed, and very small structures that are made possible by digitization. Utilizing the possibilities of cloud computing, mobile systems, big data and analytics, services computing, Internet of Things, collaborative networks, and decision support, numerous new business prospects have emerged throughout the years. The logical basis for robust and self-optimizing run-time environments for intelligent business services and adaptable distributed information systems with service-oriented enterprise architectures comes from biological metaphors of living, dynamic ecosystems. This has a significant effect on how digital services and products are designed from a value- and service-oriented perspective. The evolution of enterprise architectures and the shift from a closed-world modeling environment to a more flexible open-world composition establish the dynamic framework for highly distributed and adaptive systems, which are crucial for enabling the digital transformation. This study examines how enterprise architecture has changed over time, taking into account newly established, value-based relationships between digital business models, digital strategies, and enhanced enterprise architecture.</dcterms:abstract>
        <dcterms:issued>2024-02-18</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>The use of contemporary Enterprise Resource Planning (ERP) technologies for digital transformation</h2>
    <p class="authors">Hariprasad Mandava</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - February 18, 2024</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>Our lives are becoming more and more digital, and this has an impact on how we work, study, communicate, and interact. Businesses are currently digitally altering their information systems, procedures, culture, and strategy. Existing businesses and economies are severely disrupted by the digital revolution. The Internet of Things, microservices, and mobile services are examples of IT systems with numerous, dispersed, and very small structures that are made possible by digitization. Utilizing the possibilities of cloud computing, mobile systems, big data and analytics, services computing, Internet of Things, collaborative networks, and decision support, numerous new business prospects have emerged throughout the years. The logical basis for robust and self-optimizing run-time environments for intelligent business services and adaptable distributed information systems with service-oriented enterprise architectures comes from biological metaphors of living, dynamic ecosystems. This has a significant effect on how digital services and products are designed from a value- and service-oriented perspective. The evolution of enterprise architectures and the shift from a closed-world modeling environment to a more flexible open-world composition establish the dynamic framework for highly distributed and adaptive systems, which are crucial for enabling the digital transformation. This study examines how enterprise architecture has changed over time, taking into account newly established, value-based relationships between digital business models, digital strategies, and enhanced enterprise architecture.</p>
    </div>
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/881/561">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/an-appraisal-of-challenges-in-developing-information-literacy-skills-in-the-colleges-of-education-of-ghana-878">
        <title>An Appraisal of Challenges in Developing Information Literacy Skills in the Colleges of Education of Ghana</title>
        <link>https://www.scipublications.com/journal/jaibd/article/an-appraisal-of-challenges-in-developing-information-literacy-skills-in-the-colleges-of-education-of-ghana-878</link>
        <description>The purpose of this study was to examine the challenges faced by students of Colleges of Education (CoEs) in developing their Information Literacy skills. The study adopted the post-positivism paradigm and used a descriptive survey research design. The population for this study comprised all Level 200 students at Wiawso CoE, Enchi CoE, and Bia Lamplighter CoE in the Western North Region. Purposive, stratified, and convenience sampling techniques were used to select the colleges of ed...</description>
        <dc:creator>Martha Baidoo, William Jones</dc:creator>
        <dc:date>2024-02-17</dc:date>
        <dc:type>Article</dc:type>
        <dc:identifier>10.31586/jaibd.2024.878</dc:identifier>
        <pubDate>Sat, 17 Feb 2024 16:00:00 GMT</pubDate>
        <dc:subject>Challenges</dc:subject>
        <dc:subject>Information Literacy</dc:subject>
        <dc:subject>Skills</dc:subject>
        <dc:subject>Colleges of Education of Ghana</dc:subject>
        <prism:volume>4</prism:volume>
        <prism:issue>1</prism:issue>
        <prism:startingPage>19</prism:startingPage>
        <prism:endingPage>30</prism:endingPage>
        <prism:doi>10.31586/jaibd.2024.878</prism:doi>
        <dcterms:abstract>The purpose of this study was to examine the challenges faced by students of Colleges of Education (CoEs) in developing their Information Literacy skills. The study adopted the post-positivism paradigm and used a descriptive survey research design. The population for this study comprised all Level 200 students at Wiawso CoE, Enchi CoE, and Bia Lamplighter CoE in the Western North Region. Purposive, stratified, and convenience sampling techniques were used to select the colleges of education and the Level 200 students: the three (3) colleges of education were stratified and purposively selected, while 256 Level 200 students were stratified and conveniently sampled. The study employed questionnaires (open- and closed-ended questions) to collect data from the sampled students, focusing on the challenges they faced in developing their Information Literacy (IL) skills. The quantitative data was captured, analysed, and presented using descriptive statistics such as percentages and frequency tables to address the objective of the study. It is recommended that, to improve digital literacy and academic pursuits, college management improve access to desktop computers and the Internet in the library and computer centre. It is also recommended that the management and librarians of the Colleges of Education ensure that students have access to these devices at the library and can use them to develop their IL skills and manage their references more effectively.</dcterms:abstract>
        <dcterms:issued>2024-02-17</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>An Appraisal of Challenges in Developing Information Literacy Skills in the Colleges of Education of Ghana</h2>
    <p class="authors">Martha Baidoo, William Jones</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - February 17, 2024</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>The purpose of this study was to examine the challenges faced by students of Colleges of Education (CoEs) in developing their Information Literacy skills. The study adopted the post-positivism paradigm and used a descriptive survey research design. The population for this study comprised all Level 200 students at Wiawso CoE, Enchi CoE, and Bia Lamplighter CoE in the Western North Region. Purposive, stratified, and convenience sampling techniques were used to select the colleges of education and the Level 200 students: the three (3) colleges of education were stratified and purposively selected, while 256 Level 200 students were stratified and conveniently sampled. The study employed questionnaires (open- and closed-ended questions) to collect data from the sampled students, focusing on the challenges they faced in developing their Information Literacy (IL) skills. The quantitative data was captured, analysed, and presented using descriptive statistics such as percentages and frequency tables to address the objective of the study. It is recommended that, to improve digital literacy and academic pursuits, college management improve access to desktop computers and the Internet in the library and computer centre. It is also recommended that the management and librarians of the Colleges of Education ensure that students have access to these devices at the library and can use them to develop their IL skills and manage their references more effectively.</p>
    </div>
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/878/559">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/universal-evaluation-of-sap-s/4-hana-erp-cloud-system-882">
        <title>Universal Evaluation of SAP S/4 Hana ERP Cloud System</title>
        <link>https://www.scipublications.com/journal/jaibd/article/universal-evaluation-of-sap-s/4-hana-erp-cloud-system-882</link>
        <description>Regardless of their traditional ERP Systems, it is essential for every business to acquire a universal advantage in the contemporary international market. When everything is considered, end users in these kinds of businesses have to deal with poorly designed interfaces and unusable technologies. Despite the claims of significant benefits from using S4 Hana cloud ERP software, the potential for maximum productivity is not fully realized. One of the causes of this reality is the underfu...</description>
        <dc:creator>Venkata Pavan Kumar Juturi</dc:creator>
        <dc:date>2024-02-16</dc:date>
        <dc:type>Review Article</dc:type>
        <dc:identifier>10.31586/jaibd.2024.882</dc:identifier>
        <pubDate>Fri, 16 Feb 2024 16:00:00 GMT</pubDate>
        <dc:subject>Enterprise Resource Planning</dc:subject>
        <dc:subject>Cloud ERP</dc:subject>
        <dc:subject>Artificial Intelligence</dc:subject>
        <dc:subject>Big Data</dc:subject>
        <dc:subject>Business Intelligence System</dc:subject>
        <dc:subject>S/4 HANA</dc:subject>
        <prism:volume>4</prism:volume>
        <prism:issue>1</prism:issue>
        <prism:startingPage>14</prism:startingPage>
        <prism:endingPage>18</prism:endingPage>
        <prism:doi>10.31586/jaibd.2024.882</prism:doi>
        <dcterms:abstract>Regardless of their traditional ERP Systems, it is essential for every business to acquire a universal advantage in the contemporary international market. When everything is considered, end users in these kinds of businesses have to deal with poorly designed interfaces and unusable technologies. Despite the claims of significant benefits from using S4 Hana cloud ERP software, the potential for maximum productivity is not fully realized. One of the causes of this reality is the underfunding of ergonomic measures and the newest technologies. Through the design of S4 Hana cloud ERP software applications, we will demonstrate how important ergonomic research is for minimizing the financial and human costs that enterprises currently face.</dcterms:abstract>
        <dcterms:issued>2024-02-16</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>Universal Evaluation of SAP S/4 Hana ERP Cloud System</h2>
    <p class="authors">Venkata Pavan Kumar Juturi</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - February 16, 2024</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>Regardless of their traditional ERP Systems, it is essential for every business to acquire a universal advantage in the contemporary international market. When everything is considered, end users in these kinds of businesses have to deal with poorly designed interfaces and unusable technologies. Despite the claims of significant benefits from using S4 Hana cloud ERP software, the potential for maximum productivity is not fully realized. One of the causes of this reality is the underfunding of ergonomic measures and the newest technologies. Through the design of S4 Hana cloud ERP software applications, we will demonstrate how important ergonomic research is for minimizing the financial and human costs that enterprises currently face.</p>
    </div>
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/882/557">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/stock-closing-price-and-trend-prediction-with-lstm-rnn-877">
        <title>Stock Closing Price and Trend Prediction with LSTM-RNN</title>
        <link>https://www.scipublications.com/journal/jaibd/article/stock-closing-price-and-trend-prediction-with-lstm-rnn-877</link>
        <description>The stock market is very volatile and hard to predict accurately due to the uncertainties affecting stock prices. Nevertheless, investors and stock traders can benefit from predictive models by making informed decisions about buying, holding, or investing in stocks. Also, financial institutions can use such models to manage risk and optimize their customers&apos; investment portfolios. In this paper, we use a Long Short-Term Memory Recurrent Neural Network (LSTM-RNN) to predict the daily closing p...</description>
        <dc:creator>Vivek Varadharajan, Nathan Smith, Dinesh Kalla, Ganesh R Kumar, Fnu Samaah, Kiran Polimetla</dc:creator>
        <dc:date>2024-02-14</dc:date>
        <dc:type>Article</dc:type>
        <dc:identifier>10.31586/jaibd.2024.877</dc:identifier>
        <pubDate>Wed, 14 Feb 2024 16:00:00 GMT</pubDate>
        <dc:subject>Stock Price Prediction</dc:subject>
        <dc:subject>Artificial Intelligence</dc:subject>
        <dc:subject>Machine Learning</dc:subject>
        <dc:subject>LSTM-RNN</dc:subject>
        <prism:volume>4</prism:volume>
        <prism:issue>1</prism:issue>
        <prism:startingPage>1</prism:startingPage>
        <prism:endingPage>13</prism:endingPage>
        <prism:doi>10.31586/jaibd.2024.877</prism:doi>
        <dcterms:abstract>The stock market is very volatile and hard to predict accurately due to the uncertainties affecting stock prices. Nevertheless, investors and stock traders can benefit from predictive models by making informed decisions about buying, holding, or investing in stocks. Also, financial institutions can use such models to manage risk and optimize their customers&apos; investment portfolios. In this paper, we use a Long Short-Term Memory Recurrent Neural Network (LSTM-RNN) to predict the daily closing price of Amazon Inc. stock (ticker symbol: AMZN). We study the influence of various hyperparameters in the model to see which factors affect its predictive power. The root mean squared error (RMSE) on the training set was 2.51, with a mean absolute percentage error (MAPE) of 1.84%.</dcterms:abstract>
        <dcterms:issued>2024-02-14</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>Stock Closing Price and Trend Prediction with LSTM-RNN</h2>
    <p class="authors">Vivek Varadharajan, Nathan Smith, Dinesh Kalla, Ganesh R Kumar, Fnu Samaah, Kiran Polimetla</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - February 14, 2024</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>The stock market is very volatile and hard to predict accurately due to the uncertainties affecting stock prices. Nevertheless, investors and stock traders can benefit from predictive models by making informed decisions about buying, holding, or investing in stocks. Also, financial institutions can use such models to manage risk and optimize their customers&apos; investment portfolios. In this paper, we use a Long Short-Term Memory Recurrent Neural Network (LSTM-RNN) to predict the daily closing price of Amazon Inc. stock (ticker symbol: AMZN). We study the influence of various hyperparameters in the model to see which factors affect its predictive power. The root mean squared error (RMSE) on the training set was 2.51, with a mean absolute percentage error (MAPE) of 1.84%.</p>
    </div>
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/877/555">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/understanding-the-fundamentals-of-digital-transformation-in-financial-services:-drivers-and-strategic-insights-1216">
        <title>Understanding the Fundamentals of Digital Transformation in Financial Services: Drivers and Strategic Insights</title>
        <link>https://www.scipublications.com/journal/jaibd/article/understanding-the-fundamentals-of-digital-transformation-in-financial-services:-drivers-and-strategic-insights-1216</link>
        <description>The financial services sector is undergoing considerable changes in its operations due to developments in technology and the embrace of digital platforms. This evolution is changing the established concepts of business, consumers, and channels of service delivery. Financial services firms are changing the way they work through digital transformation, driven by developments in technology, changes in customer needs, and an increased emphasis on sustainability. Understanding the opportunities,...</description>
        <dc:creator>Gangadhar Sadaram, Manikanth Sakuru, Krishna Madhav Jha, Varun Bodepudi, Niharika Katnapally, Srinivasa Rao Maka, Laxmana Murthy Karaka</dc:creator>
        <dc:date>2023-12-26</dc:date>
        <dc:type>Review Article</dc:type>
        <dc:identifier>10.31586/jaibd.2023.1216</dc:identifier>
        <pubDate>Tue, 26 Dec 2023 16:00:00 GMT</pubDate>
        <dc:subject>Finance</dc:subject>
        <dc:subject>Digital Transformation</dc:subject>
        <dc:subject>Financial Services</dc:subject>
        <dc:subject>FinTech</dc:subject>
        <dc:subject>Digital Banking</dc:subject>
        <dc:subject>Technological Innovation</dc:subject>
        <prism:volume>3</prism:volume>
        <prism:issue>1</prism:issue>
        <prism:startingPage>72</prism:startingPage>
        <prism:endingPage>83</prism:endingPage>
        <prism:doi>10.31586/jaibd.2023.1216</prism:doi>
        <dcterms:abstract>The financial services sector is undergoing considerable changes in its operations due to developments in technology and the embrace of digital platforms. This evolution is changing the established concepts of business, consumers, and channels of service delivery. Financial services firms are changing the way they work through digital transformation, driven by developments in technology, changes in customer needs, and an increased emphasis on sustainability. Understanding the opportunities, risks, and new trends in digital transformation is the focus of this paper. Opportunities include efficient real-time decision-making processes, increased transparency, and better process controls, which are balanced against the threats of change management, dubious organisation-technology fit, and high implementation costs. The study also examines recent advancements, including the application of machine learning and artificial intelligence, developments in mobile and online banking, integration of blockchain, and an increasing focus on security and personalised banking. A literature review yields findings from different studies on rural financial services, the evolution of the blockchain, drivers of digital transformation, cloud-based learning approaches, and emerging sustainability practices. All of these results suggest that more strategic planning, stronger analytics, and greater focus on ensuring that organisational objectives are met by transformations should be pursued. Hence, these findings add to the existing literature on how innovative digital technologies are likely to transform the financial services sector and advance sustainability.</dcterms:abstract>
        <dcterms:issued>2023-12-26</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>Understanding the Fundamentals of Digital Transformation in Financial Services: Drivers and Strategic Insights</h2>
    <p class="authors">Gangadhar Sadaram, Manikanth Sakuru, Krishna Madhav Jha, Varun Bodepudi, Niharika Katnapally, Srinivasa Rao Maka, Laxmana Murthy Karaka</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - December 26, 2023</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>The financial services sector is undergoing considerable changes in its operations due to developments in technology and the embrace of digital platforms. This evolution is changing the established concepts of business, consumers, and channels of service delivery. Financial services firms are changing the way they work through digital transformation, driven by developments in technology, changes in customer needs, and an increased emphasis on sustainability. Understanding the opportunities, risks, and new trends in digital transformation is the focus of this paper. Opportunities include efficient real-time decision-making processes, increased transparency, and better process controls, which are balanced against the threats of change management, dubious organisation-technology fit, and high implementation costs. The study also examines recent advancements, including the application of machine learning and artificial intelligence, developments in mobile and online banking, integration of blockchain, and an increasing focus on security and personalised banking. A literature review yields findings from different studies on rural financial services, the evolution of the blockchain, drivers of digital transformation, cloud-based learning approaches, and emerging sustainability practices. All of these results suggest that more strategic planning, stronger analytics, and greater focus on ensuring that organisational objectives are met by transformations should be pursued. Hence, these findings add to the existing literature on how innovative digital technologies are likely to transform the financial services sector and advance sustainability.</p>
    </div>
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/1216/735">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
    
    <item rdf:about="https://www.scipublications.com/journal/jaibd/article/networking-solutions-for-large-scale-iot-deployments:-architectures-challenges-and-trends-1382">
        <title>Networking Solutions for Large-Scale IoT Deployments: Architectures, Challenges, and Trends</title>
        <link>https://www.scipublications.com/journal/jaibd/article/networking-solutions-for-large-scale-iot-deployments:-architectures-challenges-and-trends-1382</link>
        <description>The Internet of Things (IoT) is a network of linked items that can gather, exchange, and act on data thanks to sensors, software, and communication technologies. Networking solutions for large-scale IoT deployments have become a cornerstone in advancing smart applications across industries, cities, and healthcare ecosystems. With billions of connected devices generating vast amounts of data, efficient networking infrastructures are required to ensure seamless communication, reliability, and scal...</description>
        <dc:creator>Ajay Dasari, Venkata Kishore Chilakapati, Srikanth Reddy Keshireddy, Venkata Teja Nagumotu, Harsha Vardhan Reddy Kavuluri, Akhil Kumar Pathani</dc:creator>
        <dc:date>2023-12-26</dc:date>
        <dc:type>Review Article</dc:type>
        <dc:identifier>10.31586/jaibd.2023.1382</dc:identifier>
        <pubDate>Tue, 26 Dec 2023 16:00:00 GMT</pubDate>
        <dc:subject>IoT Networking</dc:subject>
        <dc:subject>Large-Scale Deployments</dc:subject>
        <dc:subject>5G</dc:subject>
        <dc:subject>Communication Technologies</dc:subject>
        <dc:subject>AI-Driven IoT</dc:subject>
        <dc:subject>SDN/NFV</dc:subject>
        <dc:subject>Blockchain in IoT</dc:subject>
        <prism:volume>3</prism:volume>
        <prism:issue>1</prism:issue>
        <prism:startingPage>102</prism:startingPage>
        <prism:endingPage>112</prism:endingPage>
        <prism:doi>10.31586/jaibd.2023.1382</prism:doi>
        <dcterms:abstract>The Internet of Things (IoT) is a network of linked items that can gather, exchange, and act on data thanks to sensors, software, and communication technologies. Networking solutions for large-scale IoT deployments have become a cornerstone in advancing smart applications across industries, cities, and healthcare ecosystems. With billions of connected devices generating vast amounts of data, efficient networking infrastructures are required to ensure seamless communication, reliability, and scalability. IoT architectures, ranging from layered models to advanced cloud–edge integration, offer structured approaches for managing heterogeneous devices and diverse communication protocols. Recent innovations, such as 5G connectivity, software-defined networking (SDN), and network function virtualization (NFV), provide flexibility and programmability, enabling dynamic adaptation to evolving requirements. Edge and fog computing further enhance responsiveness by processing data closer to devices, while AI-driven approaches contribute to intelligent routing, predictive analytics, and self-optimizing network management. Blockchain-based frameworks add transparency and trust, securing data exchange in decentralized environments. Together, these developments offer a perspective on how networking continues to transform the capabilities of IoT, setting the foundation for sustainable growth and future adoption across multiple domains.</dcterms:abstract>
        <dcterms:issued>2023-12-26</dcterms:issued>
        <dcterms:language>en</dcterms:language>
        <content:encoded><![CDATA[<div class="article">
    <h2>Networking Solutions for Large-Scale IoT Deployments: Architectures, Challenges, and Trends</h2>
    <p class="authors">Ajay Dasari, Venkata Kishore Chilakapati, Srikanth Reddy Keshireddy, Venkata Teja Nagumotu, Harsha Vardhan Reddy Kavuluri, Akhil Kumar Pathani</p>
    <p class="journal">Journal of Artificial Intelligence and Big Data - December 26, 2023</p>
    <div class="abstract">
        <h3>Abstract</h3>
        <p>The Internet of Things (IoT) is a network of linked items that can gather, exchange, and act on data thanks to sensors, software, and communication technologies. Networking solutions for large-scale IoT deployments have become a cornerstone in advancing smart applications across industries, cities, and healthcare ecosystems. With billions of connected devices generating vast amounts of data, efficient networking infrastructures are required to ensure seamless communication, reliability, and scalability. IoT architectures, ranging from layered models to advanced cloud–edge integration, offer structured approaches for managing heterogeneous devices and diverse communication protocols. Recent innovations, such as 5G connectivity, software-defined networking (SDN), and network function virtualization (NFV), provide flexibility and programmability, enabling dynamic adaptation to evolving requirements. Edge and fog computing further enhance responsiveness by processing data closer to devices, while AI-driven approaches contribute to intelligent routing, predictive analytics, and self-optimizing network management. Blockchain-based frameworks add transparency and trust, securing data exchange in decentralized environments. Together, these developments offer a perspective on how networking continues to transform the capabilities of IoT, setting the foundation for sustainable growth and future adoption across multiple domains.</p>
    </div>
    <div class="pdf-link">
        <a href="https://www.scipublications.com/journal/index.php/JAIBD/article/download/1382/965">Download PDF</a>
    </div>
</div>]]></content:encoded>
    </item>
</rdf:RDF>