Open Access January 10, 2025

Artificial Immune Systems: A Bio-Inspired Paradigm for Computational Intelligence

Abstract
Artificial Immune Systems (AIS) are bio-inspired computational frameworks that emulate the adaptive mechanisms of the human immune system, such as self/non-self discrimination, clonal selection, and immune memory. These systems have demonstrated significant potential in addressing complex challenges across optimization, anomaly detection, and adaptive system control. This paper provides a comprehensive exploration of AIS applications in domains such as cybersecurity, resource allocation, and autonomous systems, highlighting the growing importance of hybrid AIS models. Recent advancements, including integrations with machine learning, quantum computing, and bioinformatics, are discussed as solutions to scalability, high-dimensional data processing, and efficiency challenges. Core algorithms, such as the Negative Selection Algorithm (NSA) and Clonal Selection Algorithm (CSA), are examined, along with limitations in interpretability and compatibility with emerging AI paradigms. The paper concludes by proposing future research directions, emphasizing scalable hybrid frameworks, quantum-inspired approaches, and real-time adaptive systems, underscoring AIS's transformative potential across diverse computational fields.
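To illustrate the core idea behind the Negative Selection Algorithm (NSA) named above, the following toy sketch generates random bit-string detectors, discards any that match a "self" set, and uses the survivors to flag anomalies. This is an invented illustration, not code from the paper: the detector length, Hamming-style matching rule, and threshold are all assumptions made for the example.

```python
import random

BITS = 8  # length of each pattern; an illustrative choice

def matches(detector, sample, threshold=6):
    """A detector matches a sample if enough bit positions agree."""
    agree = sum(d == s for d, s in zip(detector, sample))
    return agree >= threshold

def train(self_set, n_detectors=50, seed=0):
    """Negative selection: keep only detectors that match no 'self' pattern."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        cand = tuple(rng.randint(0, 1) for _ in range(BITS))
        if not any(matches(cand, s) for s in self_set):
            detectors.append(cand)
    return detectors

def is_anomalous(detectors, sample):
    """Non-self (anomalous) if any surviving detector matches the sample."""
    return any(matches(d, sample) for d in detectors)

self_set = [(0, 0, 0, 0, 0, 0, 0, 0), (0, 0, 0, 0, 0, 0, 0, 1)]
detectors = train(self_set)
print(is_anomalous(detectors, (1, 1, 1, 1, 1, 1, 1, 0)))  # a far-from-self sample
print(is_anomalous(detectors, self_set[0]))  # False: detectors never match self
```

Because detectors are filtered against the self set during training, a self sample can never be flagged; detection coverage of non-self space then depends on the number of detectors and the matching threshold.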
Article
Open Access December 16, 2022

A Framework for the Application of Optimization Techniques in the Achievement of Global Emission Targets in the Housing Sector

Abstract
The building construction industry plays a crucial role in reducing greenhouse gas (GHG) emissions globally. Emission targets may not be achieved without a defined strategic plan for meeting the targets set for the various sectors of the economy. Recognizing the enormous potential of the building industry to contribute to global GHG emission reduction, this study describes a framework for using optimization techniques as a guide toward emission reduction targets in the housing sector, illustrated with the onsite and offsite building construction industry. Given that some GHGs are also sources of air pollution, the study discusses how efforts to address air pollution can be used to build consensus toward addressing GHG emissions. It presents simplified procedures that municipalities around the globe can use to estimate and report GHG emissions from the building construction industry, presents a unifying strategy for emission management, and demonstrates how programming methods can be applied to GHG emissions management. The approach used in this study is transferable to other industries. The study recommends a unifying strategy for managing and controlling emissions in the building construction industry, together with a coordinated global effort to share best practices for emission control and management. To reduce global emissions, further studies of this kind, and their expansion to all sectors of the global economy, are recommended, followed by concrete efforts to implement sustainable emission reduction targets globally.
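One simple form the programming methods described above could take is a marginal-abatement selection: rank candidate measures by cost per tonne of CO2e avoided and pick the cheapest until a target is met. The sketch below is an invented illustration only; the measure names and figures are assumptions for the example, not data from the study.

```python
# Each candidate measure: (name, abatement in tonnes CO2e, total cost).
# Values are invented for illustration.
measures = [
    ("offsite prefabrication", 120.0, 2400.0),
    ("low-carbon cement",      200.0, 6000.0),
    ("onsite electrification",  80.0, 1200.0),
    ("material reuse",          50.0, 2000.0),
]

def plan(measures, target_tonnes):
    """Greedily choose measures in order of cost per tonne until the target is met."""
    ranked = sorted(measures, key=lambda m: m[2] / m[1])
    chosen, abated, cost = [], 0.0, 0.0
    for name, tonnes, c in ranked:
        if abated >= target_tonnes:
            break
        chosen.append(name)
        abated += tonnes
        cost += c
    return chosen, abated, cost

chosen, abated, cost = plan(measures, target_tonnes=250.0)
print(chosen, abated, cost)
```

A full treatment would use mathematical programming (e.g. a linear program with sector-level constraints) rather than this greedy heuristic, but the ranking-by-abatement-cost idea is the same.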
Article
Open Access November 29, 2022

The Application of Machine Learning in the Corona Era, With an Emphasis on Economic Concepts and Sustainable Development Goals

Abstract
The aim of this article is to examine the impacts of Coronavirus Disease 2019 (Covid-19) vaccines on economic conditions and the sustainable development goals; in other words, to study economic conditions during Covid-19. We have studied the economic costs of the pandemic; the benefits in terms of gross domestic product (GDP), public finances, and employment; worldwide investment in vaccines and its progress; the overall economic impacts of vaccines; and the impacts of emerging markets (EM) on achieving the sustainable development goals (SDGs), including no poverty, good health and well-being, zero hunger, and reduced inequality. The importance of emerging economies in reducing the harmful effects of the coronavirus has also been noted. As a case study, we forecast daily new death cases in Iran from February 2020 to August 2021 using an Artificial Neural Network (ANN) with the Beetle Antennae Search (BAS) algorithm, alongside econometric models and regression analysis. The findings show that Covid-19 has had devastating economic and health effects on the world, and that vaccination can be very helpful in eliminating these effects, especially in the long term. We observed inequality in the distribution of Covid-19 vaccines between rich and poor countries, a gap that EM can help to narrow. The results show that the two approaches (Artificial Intelligence (AI) and econometric models) yield almost the same results, but AI optimization can make the model and its predictions more robust. The main contribution of this article is that we survey the impacts of vaccination from a socio-economic viewpoint rather than merely reporting facts: we examine the impacts of vaccines on the sustainable development goals and the role of EM in achieving the SDGs. In addition to the theoretical framework, we also use quantitative and empirical results that have rarely been seen in other articles.
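As a minimal sketch of the econometric/regression side of such a forecast (the paper's actual models pair an ANN tuned by the Beetle Antennae Search algorithm with econometric analysis; the daily series below is invented, not the Iranian case-study data):

```python
def ols_fit(xs, ys):
    """Closed-form ordinary least squares for y = a + b * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

days = list(range(10))                                     # day index
deaths = [100, 108, 115, 124, 130, 141, 149, 155, 163, 172]  # toy counts

a, b = ols_fit(days, deaths)           # fit a linear trend
forecast_day_12 = a + b * 12           # extrapolate two days ahead
print(round(forecast_day_12, 1))
```

A trend extrapolation like this is only a baseline; the nonlinear ANN component exists precisely because epidemic curves are not linear over long horizons.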
Article
Open Access August 31, 2022

Extended Rule of Five and Prediction of Biological Activity of Peptidic HIV-1-PR Inhibitors

Abstract
In this research work, we have applied Lipinski's Rule of Five (RO5) to pharmacokinetic (PK) study and to predict the activity of peptidic HIV-1 protease inhibitors. Peptidic HIV-1-PRIs were taken from the literature together with their observed biological activities (OBAs) in terms of IC50; the logarithm of the inverse of IC50 (log 1/C) was used as the biological endpoint in the study. For the calculation of physicochemical parameters, molecular modeling and geometry optimization of all the derivatives were carried out with CAChe Pro software using the semiempirical PM3 method. Prediction of the biological activity of the inhibitors has shown that the best QSAR model is constructed from pharmacokinetic properties, molecular weight, and hydrogen bond acceptor count, confirming that these properties play an important role in describing the PKs of the drugs. On the basis of the derived models, one can build a theoretical basis to assess the biological activity of compounds of the same series.
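A hedged sketch of the kind of MLR-based QSAR model described, predicting log(1/C) from molecular weight and hydrogen-bond acceptor count. All descriptor and activity values below are invented for illustration; the paper's own data came from CAChe/PM3 calculations, not from this table.

```python
import numpy as np

# Invented descriptor values for five hypothetical inhibitors.
MW  = np.array([500.0, 560.0, 610.0, 650.0, 700.0])   # molecular weight
HBA = np.array([6, 7, 7, 8, 9], dtype=float)          # H-bond acceptor count
log_inv_C = np.array([7.1, 7.6, 7.9, 8.4, 9.0])       # log(1/IC50), toy values

# Design matrix: intercept column plus the two descriptors.
X = np.column_stack([np.ones_like(MW), MW, HBA])
coef, *_ = np.linalg.lstsq(X, log_inv_C, rcond=None)

# Goodness of fit (R^2) of the fitted MLR model.
pred = X @ coef
r2 = 1 - np.sum((log_inv_C - pred) ** 2) / np.sum((log_inv_C - log_inv_C.mean()) ** 2)
print(coef, round(float(r2), 3))
```

In practice such models are also validated with adjusted R², standard error, and cross-validated q² before being used to predict activities of untested analogues.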
Article
Open Access May 20, 2021

Bioconcentration Factor of Polychlorinated Biphenyls and Its Correlation with UV- and IR-Spectroscopic data: A DFT based Study

Abstract
Polychlorinated biphenyls (PCBs) are an important class of persistent organic pollutants that were used as components of paints (especially in printing), as plasticizers in plastics, as insulating materials in transformers and capacitors, as heat transfer fluids, and as additives in hydraulic fluids in vacuum and turbine pumps. There is a standing need for reliable procedures to predict the bioconcentration potential of chemicals from knowledge of their molecular structure or from readily measurable properties of the substance. Hence, correlation and prediction of bioconcentration factors (BCFs) based on λmax and the vibrational frequencies of various bonds, viz. υ(C-H) and υ(C=C), of biphenyl and its fifty-seven derivatives have been made. For the study, molecular modeling and geometry optimization of the PCBs were performed in the Workspace program of Fujitsu's CAChe Pro 5.04 software using the DFT method. UV-visible spectra for each compound were generated from electronic transitions between molecular orbitals, as electromagnetic radiation in the visible and ultraviolet (UV-visible) region is absorbed by the molecule; the energies of the excited electronic states were computed quantum mechanically. IR spectra were generated from the coordinated motions of the atoms as electromagnetic radiation in the infrared region is absorbed by the molecule; the force necessary to distort the molecule from its equilibrium geometry was computed quantum mechanically, and the frequencies of the vibrational transitions were thereby predicted. The Project Leader program associated with CAChe was used for multiple linear regression (MLR) analysis, with the above spectroscopic data as independent variables and the BCFs of the PCBs as dependent variables. The reliability of the correlations and the predictive ability of the MLR equations (models) were judged by R², R²adj, se, q²LOO, and F values.
This study reflected clearly that UV and IR spectroscopic data can be used to predict BCFs of a large number of related compounds within limited time without any difficulty.
Editorial Article
Open Access May 12, 2021

Into the Secrets of Jazz Arranging: Chromatic Scale in Different Harmonic Contexts

Abstract
Herein we introduce a reliable and effective method allowing any musician, regardless of their theoretical background, to carry out a 4-way jazz harmonization of any melodic progression almost instantly, with few exceptions. Many jazz students experience deep frustration in dealing with the harmonization of non-diatonic notes. Moreover, a coherent harmonization of such notes can sometimes turn out to be a very challenging task even for extremely skilled professionals. In this paper, the harmonization of the chromatic scale in different harmonic contexts is discussed in detail by resorting to the well-known concepts of harmonic functions, tonicization, chromatic and diatonic parallelism, and auxiliary chords. All the chords are labelled so as to allow the reader to immediately understand their role in the particular harmonic context. Consequently, the procedure essentially translates into an optimization of the "harmonic flow".
Editorial Article
Open Access September 04, 2025

Evidence-Based Protocols for the Prevention and Treatment of Prosthetic Joint Infection in Total Hip Arthroplasty: A Systematic Review

Abstract
Objective: This systematic review aimed to identify, synthesize, and critically analyze the available evidence on clinical protocols used for the prevention and treatment of prosthetic joint infection (PJI) in total hip arthroplasty (THA), based on studies published between 2000 and 2025. Methods: The review was conducted according to PRISMA guidelines. Electronic searches were performed in PubMed (MEDLINE), Scopus, Web of Science, and Embase between January and April 2025. Eligible studies included clinical trials, cohort studies, case-control studies, systematic reviews, and meta-analyses published in English that addressed either preventive or therapeutic strategies for PJI in THA. Study selection, data extraction, and quality assessment were carried out independently by two reviewers. Due to the heterogeneity of the included studies, a qualitative synthesis was performed. Results: A total of 32 studies were included. Preventive measures identified in the literature comprised combined antibiotic prophylaxis (cefazolin and gentamicin), multimodal perioperative protocols such as ACERTO, nasal decolonization for Staphylococcus aureus, silver-impregnated dressings, and structured post-discharge surveillance. Treatment strategies included DAIR (Debridement, Antibiotics, and Implant Retention), the DAPRI technique, one-stage and two-stage revision surgeries, muscle flap reconstructions, and protocols without spacers. These interventions were associated with significantly reduced infection rates and improved clinical outcomes when applied appropriately and in accordance with patient-specific factors. Conclusion: Effective prevention and treatment of PJI in total hip arthroplasty require a systematic and evidence-based approach. Integrated protocols—spanning preoperative optimization, meticulous intraoperative techniques, and rigorous postoperative monitoring—have proven effective in reducing infection incidence. 
In cases of established infection, surgical management must be tailored to the timing of infection, microbial profile, and host conditions. Two-stage revision remains the gold standard for complex infections, while one-stage revision and emerging techniques like DAPRI offer promising results in selected cases. This review contributes to the standardization of clinical practice and supports improved patient outcomes.
Systematic Review
Open Access April 10, 2025

Advancements in Pharmaceutical IT: Transforming the Industry with ERP Systems

Abstract
The pharmaceutical industry is undergoing a profound transformation driven by advancements in Information Technology (IT), with Enterprise Resource Planning (ERP) systems playing a pivotal role in reshaping operations. These systems offer integrated solutions that streamline key business processes, such as production, inventory management, supply chain optimization, regulatory compliance, and data integration, contributing significantly to operational efficiency and organizational agility. This paper explores the evolution and impact of ERP systems within the pharmaceutical sector, highlighting their contributions to overcoming the industry’s inherent challenges, including complex regulatory requirements, the need for accurate and real-time data, and the demand for supply chain resilience. The integration of cloud-based ERP solutions, the incorporation of emerging technologies like Artificial Intelligence (AI), Machine Learning (ML), and the Internet of Things (IoT), and enhanced data analytics capabilities have revolutionized pharmaceutical IT. These advancements not only reduce operational costs, improve forecasting accuracy, and enhance collaboration but also ensure compliance with stringent global regulations, such as Good Manufacturing Practices (GMP) and FDA guidelines. Moreover, ERP systems have been instrumental in managing the pharmaceutical supply chain, ensuring product traceability, and improving inventory control and order fulfillment processes. This manuscript examines how ERP systems enable pharmaceutical companies to maintain high standards of product quality, improve decision-making, and ensure the safety and efficacy of drugs through robust tracking and auditing mechanisms. A case study of a pharmaceutical company that implemented an ERP system demonstrates the tangible benefits, including increased operational efficiency, improved compliance rates, and enhanced customer satisfaction. 
However, despite the clear advantages, challenges such as customization complexities, data integration issues, and resistance to change remain. As the pharmaceutical industry continues to evolve, ERP systems will remain a cornerstone of digital transformation, facilitating smarter decision-making, better resource management, and enhanced collaboration across global operations. This paper also identifies future trends, including the potential of AI and blockchain technologies in further strengthening ERP systems and transforming the pharmaceutical landscape.
Review Article
Open Access January 09, 2025

Advances in the Synthesis and Optimization of Pharmaceutical APIs: Trends and Techniques

Abstract
The synthesis and optimization of Active Pharmaceutical Ingredients (APIs) is fundamental to pharmaceutical drug development, directly influencing drug efficacy, safety, and cost-effectiveness. Over recent years, significant advancements in synthetic methodologies and manufacturing technologies have transformed API production. This manuscript provides an overview of the latest innovations in API synthesis, focusing on key techniques such as green chemistry, continuous flow chemistry, biocatalysis, and automation. Green chemistry principles, including solvent substitution and catalytic reactions, have enhanced sustainability by reducing waste and energy consumption. Continuous flow chemistry offers improved reaction control, scalability, and safety, while biocatalysis provides an eco-friendly alternative for synthesizing complex and chiral APIs. Additionally, the integration of automation and advanced process control using machine learning and real-time monitoring has optimized production efficiency and consistency. The manuscript also discusses the challenges associated with regulatory compliance and quality assurance, highlighting the role of advanced analytical techniques such as HPLC, NMR, and mass spectrometry in ensuring API purity. Looking ahead, personalized medicine and smart manufacturing technologies, including blockchain for traceability, are expected to drive further innovation in API production. This review concludes by emphasizing the need for continued advancements in sustainability, efficiency, and scalability to meet the evolving demands of the pharmaceutical industry, ultimately enabling the development of safer, more effective, and environmentally responsible medicines.
Review Article
Open Access November 07, 2024

Optimizing Pharmaceutical Supply Chain: Key Challenges and Strategic Solutions

Abstract
Pharmaceutical supply chains are critical to ensuring the availability of safe and effective medications, yet they face numerous challenges that can jeopardize public health. This article provides a comprehensive analysis of the key issues impacting pharmaceutical supply chains, including regulatory compliance, demand forecasting, supply chain visibility, quality assurance, and geopolitical risks. Regulatory compliance remains a significant concern due to the stringent guidelines imposed by authorities such as the FDA and EMA, which can lead to increased operational costs and time delays. Additionally, traditional demand forecasting methods often fail to accurately predict fluctuations in drug demand, resulting in stockouts or excess inventory. Limited supply chain visibility further complicates these challenges, hindering timely decision-making and operational efficiency. Quality assurance is paramount, as maintaining the integrity of pharmaceutical products throughout the supply chain is crucial to preventing costly recalls and ensuring patient safety. Moreover, the globalization of supply chains introduces vulnerabilities to geopolitical risks, trade disputes, and natural disasters. In response to these issues, this article outlines strategic recommendations for optimizing pharmaceutical supply chains. These include leveraging advanced analytics and IoT technologies to enhance demand forecasting and visibility, strengthening compliance through automated systems and training, fostering collaboration among stakeholders, implementing robust risk management frameworks, and investing in quality management systems. By adopting these strategies, pharmaceutical companies can enhance the efficiency and resilience of their supply chains, ultimately ensuring the continuous availability of essential medications for patients worldwide. 
This analysis serves as a critical resource for industry professionals seeking to navigate the complexities of pharmaceutical supply chains in an increasingly dynamic global environment.
Review Article
Open Access November 01, 2024

Impacts of Drug Shortages in the Pharmaceutical Supply Chain

Abstract
Drug shortages represent a significant and growing challenge within the pharmaceutical supply chain, with profound implications for patient care, public health, and healthcare costs. This manuscript provides a comprehensive examination of the causes and impacts of drug shortages, highlighting the multifaceted nature of this issue. Key factors contributing to shortages include manufacturing complications, limited availability of active pharmaceutical ingredients (APIs), market dynamics that discourage the production of less profitable medications, and regulatory challenges that slow down the approval process for new manufacturing capacities. The consequences of these shortages are far-reaching. Patients often face treatment delays, which can lead to adverse health outcomes, increased hospitalization rates, and even mortality. Healthcare providers experience heightened operational costs as they seek alternative therapies and manage complications resulting from inadequate treatment. Furthermore, the frequent occurrence of drug shortages erodes public trust in both the healthcare system and the pharmaceutical industry, leading to decreased patient adherence to prescribed therapies. To mitigate the impacts of drug shortages, this manuscript proposes several strategic solutions, including enhanced communication among stakeholders, diversification of supply sources, increased regulatory flexibility, and collaborative approaches between public and private sectors. Additionally, raising awareness among healthcare providers and patients regarding the causes and potential alternatives can empower stakeholders to navigate shortages effectively. Ultimately, addressing drug shortages necessitates a proactive and coordinated effort from all participants in the pharmaceutical supply chain. By implementing these strategies, stakeholders can enhance the resilience of the supply chain, ensuring that essential medications remain accessible and that patient care is not compromised. 
The findings of this manuscript underscore the urgent need for ongoing vigilance and collaborative action to tackle the challenges posed by drug shortages, safeguarding public health and improving healthcare outcomes globally.
Review Article
Open Access March 30, 2024

Essence Control of Active Pharmaceutical Ingredients

Abstract
Active Pharmaceutical Ingredients (APIs) form the backbone of pharmaceutical formulations, influencing their efficacy, safety, and stability. Essence control of APIs involves stringent regulation and optimization of their chemical, physical, and biological properties to ensure consistent quality and therapeutic outcomes. This manuscript explores the critical aspects of essence control in APIs, including synthesis, characterization, quality assessment, and regulatory considerations. The synthesis of Active Pharmaceutical Ingredients is a pivotal stage in pharmaceutical manufacturing, where precise control over chemical reactions and process conditions is paramount to achieving high-quality, safe, and effective medicines. Advances in synthetic methodologies, optimization strategies, sustainability practices, and the implementation of PAT technologies continue to drive innovation in API synthesis, supporting the development of novel therapeutic agents and enhancing pharmaceutical manufacturing efficiency.
Review Article
Open Access July 16, 2024

Management of Saltwater Intrusion in Coastal Aquifers: A Review and Case Studies from Egypt

Abstract
Groundwater is undeniably crucial to people's lives, particularly in coastal regions. Therefore, it is imperative to address this vital water source strategically and implement a management plan to maintain its optimal state. The salinization of groundwater poses a significant challenge for coastal communities, stemming from factors like excessive groundwater extraction from coastal aquifers, reduced recharge, rising sea levels, climate change, and other causes. Saltwater intrusion (SWI) is a prevalent issue that needs attention, as it significantly threatens groundwater quantity and quality. SWI happens when saline water infiltrates coastal aquifers, contaminating freshwater supplies. This review article aims to define SWI, explore its causes and influencing factors, and discuss various monitoring techniques. Additionally, it examines different modeling methods and management tools, including remote sensing, field surveys, modeling approaches, and optimization techniques. To mitigate the adverse effects of SWI, several control measures are outlined, along with their pros and cons. The final section reviews previous SWI studies and case studies from the Nile Delta, Sinai Peninsula, and North-West coast in Egypt. These studies offer suggestions, adaptations, and mitigation measures for future research.
Review Article
Open Access April 24, 2024

Optimization of Delirium Care in Adult Patients with Cancer: A Comprehensive and Integrative Review of Efficacy and Patient Outcomes

Abstract
Delirium is a major complication most commonly observed in patients with advanced cancer. However, despite its prevalence, the early diagnosis, management, and prevention of this condition have not seen significant progress. The aim of this research is to provide insights into the prevalence of delirium, the optimization of interventions for managing delirium symptoms, their effectiveness, and the impact of underlying factors on the reversibility of delirium in advanced cancer patients receiving palliative care. The review involved systematic searches of relevant databases, including MEDLINE, CINAHL, ProQuest Nursing and Allied Health, and PsycINFO, using refined search terms. Eight publications out of the 614 studies originally retrieved were selected and critically reviewed. Their quality was assessed using the Joanna Briggs Institute's Critical Appraisal Tool for Case Series. Data abstraction and content analysis were performed to synthesize the findings. Delirium is prevalent among advanced cancer patients in palliative care, with rates ranging from 10.3% to 24.1%. Pharmacotherapy and non-pharmacological interventions showed effectiveness in reducing delirium symptoms. Delirium was found to be reversible through palliative care interventions, antipsychotic medications, and exercise therapy. Effective delirium management is crucial to improving the quality of life of cancer patients. This review emphasizes the importance of subtype-specific treatments, standardized guidelines, and long-term follow-up studies. Implementing evidence-based, individualized approaches to delirium management can optimize treatment efficacy and clinical outcomes as well as improve the quality of care. Tailored interventions, standardized protocols, and further research are hereby recommended.
Review Article
Open Access April 11, 2024

5V’s of Big Data Shifted to Suite the Context of Software Code: Big Code for Big Software Projects

Abstract Data is the collection of facts and observations about events; it is continuously growing, becoming denser and more varied by the minute across different disciplines and fields. Hence, Big Data emerged and is evolving rapidly. The various types of data being processed are huge, but little thought has been given to where this data resides: it resides in software, and the [...] Read more.
Data is the collection of facts and observations about events; it is continuously growing, becoming denser and more varied by the minute across different disciplines and fields. Hence, Big Data emerged and is evolving rapidly. The various types of data being processed are huge, but little thought has been given to where this data resides: it resides in software, and the codebases of that software are growing correspondingly in module size, functionality, class size, and so on. Since data is growing so rapidly, the codebases of software are growing as well. This paper therefore discusses the 5V's of big data in the context of software code and how to optimize or manage big code. When we speak of "Big Code for Big Software," we refer to the specific challenges and considerations involved in developing, managing, and maintaining code in large-scale software systems.
Article
Open Access September 04, 2022

Drug-Receptor Interaction of Peptidic HIV-1 Protease: The Hydrophobic Effect-I

Abstract When a drug interacts with its receptor, the nonpolar substituents of the drug and the receptor protein attract each other through the hydrophobic effect. X-ray structure studies show that the S2/S2' pockets in the HIV-1 protease enzyme are essentially hydrophobic. The residues that make up these pockets are Val-32, Ile-47, Ile-50, and Ile-84 in each monomeric [...] Read more.
When a drug interacts with its receptor, the nonpolar substituents of the drug and the receptor protein attract each other through the hydrophobic effect. X-ray structure studies show that the S2/S2' pockets in the HIV-1 protease enzyme are essentially hydrophobic. The residues that make up these pockets are Val-32, Ile-47, Ile-50, and Ile-84 in each monomeric polypeptide unit of the protease enzyme. Δπdr and ΔSASAdr have been used to measure the extent of hydrophobic interaction between peptidic protease inhibitors and receptor residues (binding site: valine‒isoleucine; catalytic site: glycine‒aspartic acid‒threonine) on the HIV-1 protease enzyme. To measure hydrophobic interaction, molecular modeling and geometry optimization of all the inhibitors and the receptor amino acids were carried out with CAChe Pro software using semiempirical PM3 methods. Log P was calculated using the atom-typing scheme of Ghose and Crippen, and solvent-accessible surface area by the conductor-like screening model. πd, πr, SASAd, and SASAr describe the hydrophobicities of the substituents well and play an effective role in site selectivity for interaction of the drug with the receptor. A comparative study of the Δπdr and ΔSASAdr values gives the order of hydrophobic interaction with respect to the amino acids as Asp > Thr > Val > Ile and Thr > Val > Asp > Ile, respectively. Further, comparison of the (ΣΔπdr)binding-site, (ΣΔπdr)catalytic-site, (ΣΔSASAdr)binding-site, and (ΣΔSASAdr)catalytic-site values shows that peptidic HIV-1 PRIs interact with the binding site rather than the catalytic site, as the binding site has lower ΣΔπdr and ΣΔSASAdr values. Within the binding site, Val shows stronger interaction than Ile, as it has lower Δπdr and ΔSASAdr values.
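The residue ranking reported above can be illustrated with a minimal sketch. Following the abstract's convention that a lower Δ value indicates a stronger hydrophobic interaction, residues are sorted in ascending order of Δ; the numeric Δπdr and ΔSASAdr values below are hypothetical placeholders, since the abstract reports only the orderings:

```python
# Hedged sketch: ranking receptor residues by hydrophobic-interaction
# descriptors. Per the abstract, a LOWER delta value indicates a STRONGER
# interaction, so residues are sorted in ascending order of delta.
# The numeric values below are hypothetical placeholders.

def interaction_order(deltas):
    """Residue names sorted from strongest to weakest interaction."""
    return [name for name, _ in sorted(deltas.items(), key=lambda kv: kv[1])]

delta_pi = {"Asp": 0.18, "Thr": 0.40, "Val": 0.75, "Ile": 0.92}    # hypothetical
delta_sasa = {"Thr": 12.8, "Val": 27.2, "Asp": 33.5, "Ile": 41.0}  # hypothetical

print(interaction_order(delta_pi))    # Asp > Thr > Val > Ile, as reported
print(interaction_order(delta_sasa))  # Thr > Val > Asp > Ile, as reported
```

The same comparison logic extends to the site-level ΣΔ sums: the site with the lower sum (here, the binding site) is the preferred interaction site.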
Article
Open Access July 22, 2022

DFT-Based Prediction of Anti-Leishmanial Activity of Carboxylates and Their Antimony(III) Complexes Against Five Leishmanial Strains

Abstract Carboxylates and their antimony(III) complexes were experimentally screened earlier for anti-leishmanial activity (IC50) against five leishmanial strains, viz. L. major, L. major (Pak), L. tropica, L. mex mex, and L. donovani. These activities have been theoretically predicted by the DFT method along with a quantitative structure-activity relationship (QSAR) study. Molecular modeling and geometry optimization of all eight compounds were performed in the Workspace program of Fujitsu's CAChe Pro software using the B88-PW91 (Becke '88; Perdew & Wang '91) GGA (generalized-gradient approximation) energy functional with the DZVP (double-zeta valence polarized) basis set in DFT (density functional theory). For QSAR, multiple linear regression (MLR) analysis was performed in the Project Leader program associated with CAChe. The correlations between experimental and predicted activities are r2 = 0.826, r2CV = 0.426 (L. major); r2 = 0.905, r2CV = 0.507 (L. major (Pak)); r2 = 0.980, r2CV = 0.932 (L. tropica); r2 = 0.781, r2CV = 0.580 (L. mex mex) and r2 = 0.634, r2CV = 0.376 (L. donovani [...] Read more.
Carboxylates and their antimony(III) complexes were experimentally screened earlier for anti-leishmanial activity (IC50) against five leishmanial strains, viz. L. major, L. major (Pak), L. tropica, L. mex mex, and L. donovani. These activities have been theoretically predicted by the DFT method along with a quantitative structure-activity relationship (QSAR) study. Molecular modeling and geometry optimization of all eight compounds were performed in the Workspace program of Fujitsu's CAChe Pro software using the B88-PW91 (Becke '88; Perdew & Wang '91) GGA (generalized-gradient approximation) energy functional with the DZVP (double-zeta valence polarized) basis set in DFT (density functional theory). For QSAR, multiple linear regression (MLR) analysis was performed in the Project Leader program associated with CAChe. The correlations between experimental and predicted activities are r2 = 0.826, r2CV = 0.426 (L. major); r2 = 0.905, r2CV = 0.507 (L. major (Pak)); r2 = 0.980, r2CV = 0.932 (L. tropica); r2 = 0.781, r2CV = 0.580 (L. mex mex); and r2 = 0.634, r2CV = 0.376 (L. donovani). A comparison of the experimental values and the theoretically calculated values is presented pictorially and shows close agreement.
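The MLR step of such a QSAR workflow can be sketched as follows; the descriptor matrix and activity values are hypothetical placeholders (the actual analysis used CAChe's Project Leader program), and the sketch only illustrates how an r2 value is obtained from a least-squares fit of activities against molecular descriptors:

```python
# Hedged sketch of the MLR step of a QSAR workflow: fit activity from
# molecular descriptors and report r^2. All numeric data are hypothetical.
import numpy as np

def mlr_r2(X, y):
    """Least-squares MLR with intercept; returns coefficients and r^2."""
    A = np.column_stack([np.ones(len(y)), X])     # prepend intercept column
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)  # ordinary least squares
    y_hat = A @ coef
    ss_res = float(np.sum((y - y_hat) ** 2))
    ss_tot = float(np.sum((y - y.mean()) ** 2))
    return coef, 1.0 - ss_res / ss_tot

# Hypothetical descriptors (e.g. log P, dipole moment) for 8 compounds
X = np.array([[1.2, 3.1], [0.8, 2.9], [1.9, 3.5], [2.2, 4.0],
              [0.5, 2.2], [1.5, 3.3], [2.8, 4.4], [0.9, 2.6]])
y = np.array([5.0, 4.1, 6.5, 7.4, 3.1, 5.7, 8.8, 4.2])  # hypothetical activities

coef, r2 = mlr_r2(X, y)
print(round(r2, 3))
```

Cross-validated r2CV values, as reported above, would additionally require a leave-one-out loop around the same fit.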
Article
Open Access September 30, 2021

Synthesis, Characterization and Catalytic Application of Magnetic Iron Nanoparticles (Fe3O4) in Biodiesel Production from Mahogany (Khaya senegalensis) Seed Oil

Abstract Magnetic iron nanoparticles (Fe3O4) were synthesized and characterized using Fourier Transform Infrared (FT-IR) spectroscopy, UV-Visible spectrophotometry, Scanning Electron Microscopy (SEM) equipped with an Energy Dispersive X-ray spectrometer (EDX), and X-ray Diffraction (XRD). The synthesized nanocatalyst was used in the transesterification of mahogany seed oil with methanol. The [...] Read more.
Magnetic iron nanoparticles (Fe3O4) were synthesized and characterized using Fourier Transform Infrared (FT-IR) spectroscopy, UV-Visible spectrophotometry, Scanning Electron Microscopy (SEM) equipped with an Energy Dispersive X-ray spectrometer (EDX), and X-ray Diffraction (XRD). The synthesized nanocatalyst was used in the transesterification of mahogany seed oil with methanol. The optimized reaction conditions gave a reaction yield of 88% at a catalyst concentration of 1.5 wt%, a methanol-to-oil volume ratio of 5:1, a reaction temperature of 60 °C, and a reaction time of 120 minutes. The Fe3O4 nanoparticles were recovered from the mixture and reused for several cycles under the optimum conditions obtained in the present study. The results showed that the biodiesel yield decreased as the number of cycles increased when the regenerated catalyst was used. However, good conversion (81.9%) was obtained up to the 5th cycle. Elemental analysis of the synthesized magnetic iron nanoparticles (Fe3O4) revealed the highest proportion of iron, with atomic and weight concentrations of 64.37% and 74.40% respectively, followed by oxygen with 34.27% and 24.50% respectively. It can be concluded that the synthesized nanocatalyst would serve as an excellent catalyst for the transesterification of vegetable oils.
Article
Open Access September 25, 2021

Performance Analysis of KPI's of a 4G Network in a Selected Area of Port Harcourt, Nigeria

Abstract 4G LTE communication technology was designed to meet the increasing demand by users for high-quality multimedia services, data communication speed, and improved quality of service (QoS). With an ever-increasing subscriber base, it is essential to assess and analyze the network performance. To perform this task, there is a need to use the [...] Read more.
4G LTE communication technology was designed to meet the increasing demand by users for high-quality multimedia services, data communication speed, and improved quality of service (QoS). With an ever-increasing subscriber base, it is essential to assess and analyze the network performance. To perform this task, there is a need to use key performance indicators (KPIs). This study evaluates KPIs gathered from field measurements, using a statistical approach to establish the performance and present condition of the quality of service offered by a 4G LTE network in Port Harcourt, Nigeria. A drive-test approach was adopted to measure the KPIs, and analysis was performed with TEMS Discovery software using a statistical approach. The measured KPIs ranged as follows (minimum to maximum): RSSI (-90 to -49.7 dBm), RSRP (-117.7 to -68.6 dBm), and RSRQ (-22.8 to -14.2 dB). The probability distributions of the various KPIs showed that the best signal ranges accounted for 38.21%, 69.63%, and 65.63% of samples for RSSI, RSRP, and RSRQ respectively. The KPI parameters were within the acceptable range, though they require optimization to provide better service for a larger population.
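The binning step of such a drive-test analysis can be sketched as follows. The RSRP quality thresholds are common drive-test rules of thumb, not values taken from the paper, and the sample measurements are hypothetical:

```python
# Hedged sketch of drive-test post-processing: bin RSRP samples into
# quality classes and report the share of samples per class.
# Thresholds are illustrative rules of thumb; sample data are hypothetical.
from collections import Counter

def classify_rsrp(dbm):
    """Map an RSRP reading (dBm) to a coarse quality class."""
    if dbm >= -80:
        return "excellent"
    if dbm >= -90:
        return "good"
    if dbm >= -100:
        return "fair"
    return "poor"

samples = [-72.5, -85.0, -95.3, -110.2, -78.9, -88.1, -102.7, -69.4]

counts = Counter(classify_rsrp(s) for s in samples)
share = {k: 100.0 * v / len(samples) for k, v in counts.items()}
print(share)
```

The same pattern applies to RSSI and RSRQ with their own threshold tables; the "best signal range" percentages quoted above are exactly this kind of per-class share.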
Article
Open Access September 23, 2021

Distributed Generation and Optimization of Smart Grid Systems: Case Study of Kumba in Cameroon

Abstract The traditional electric grid of the City of Kumba has been experiencing constant failures, which lead inhabitants to experience constant blackouts. These blackouts persist for long periods due to the lack of communication between equipment, consumer, and supplier. Whenever there is a fault, repair crews walk along the feeder to find it. This manual fault-finding [...] Read more.
The traditional electric grid of the City of Kumba has been experiencing constant failures, which lead inhabitants to experience constant blackouts. These blackouts persist for long periods due to the lack of communication between equipment, consumer, and supplier. Whenever there is a fault, repair crews walk along the feeder to find it; this manual fault-finding increases the restoration time and thus lengthens the blackout period. The factors responsible for line failures are complex to control. It is therefore necessary to reduce restoration time by introducing Information and Communication Technologies (ICT) and sensing systems into the grid, making it smart. In this smart grid, ICT, sensors, and smart meters provide two-way communication between the supplier and the consumer. They send real-time information that is processed at the control center to optimize the entire grid. Distributed generation is also introduced into the system for two purposes: to cover the shortfall in the grid's power demand, and to take over supply when the main feeder is faulty. A study of various distributed generation sources led to the choice of solar power plants, thanks to their low greenhouse gas (GHG) emissions and the availability of solar resources in the city. A model has been proposed for the distributed generation and optimization of the smart grid. The system indexes obtained without distributed generation differ from those obtained with it; this difference shows that the grid has been optimized. Moreover, the reliability of the grid is enhanced after the introduction of distributed generation, indicating that with distributed generation the population of Kumba has a reliable, continuous power supply.
Article
Open Access August 14, 2021

Complex Energy Conversion System Analysis: An Overview

Abstract This article describes the optimization models recently applied to the design and operation of power systems towards forming smart grids and identifies trends, barriers, and possible gaps in this area. Models are described to optimize the design and operation of power systems considering renewable energies, distributed generation, microgrids, demand management, and energy storage systems. It was [...] Read more.
This article describes the optimization models recently applied to the design and operation of power systems towards forming smart grids and identifies trends, barriers, and possible gaps in this area. Models are described to optimize the design and operation of power systems considering renewable energies, distributed generation, microgrids, demand management, and energy storage systems. It was concluded that many of the recently formulated operational optimization models need to be validated through tests with real data and at large scale. Furthermore, demand management and microgrids are areas in which optimal power flow models still need to be developed. Finally, stochastic variables need to be predicted with greater precision so that these models reflect the real behavior of the system.
Article
Open Access August 09, 2021

Investigation of the Optimal Model for the Development of Renewable Energy in Iran using a Robust Optimization Approach

Abstract Due to its geographical location, Iran has substantial renewable energy capacity, which has put renewable energy development on the authorities' agenda. This underscores the need for an optimal model for developing renewable energy. Therefore, the main purpose of this study was to provide an optimal renewable energy model. In line with this goal, by choosing the cost [...] Read more.
Due to its geographical location, Iran has substantial renewable energy capacity, which has put renewable energy development on the authorities' agenda. This underscores the need for an optimal model for developing renewable energy. Therefore, the main purpose of this study was to provide an optimal renewable energy model. In line with this goal, by choosing the cost function as the objective function and considering the potential constraints of renewable energy (resource constraints), the amount of electricity consumption in each of the 16 electricity regions (demand constraints), and the limits on renewable energy production coefficients (technical constraints), the optimal model of renewable energy use was designed and solved as a robust programming model in LINGO software. The optimal model results show 15.19% small hydropower, 24.30% wind energy, 5.52% biomass energy, 6.13% geothermal energy, 4.79% tidal energy, and 44.07% solar energy. The optimum portfolio of renewable energy is thus estimated using the robust optimization approach, and the results show which renewable technologies have the greatest potential to take a larger share of the energy portfolio. These findings help policymakers choose the most suitable renewable technologies to support.
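The structure of such a cost-minimizing model can be illustrated with a deliberately simplified sketch. With a single demand constraint and per-technology capacity bounds, a cheapest-first allocation is optimal; the technology names, unit costs, and capacities below are hypothetical, and the sketch omits the robustness machinery and the regional demand constraints of the actual LINGO model:

```python
# Hedged sketch of the model's structure: minimize total cost of meeting
# a demand subject to per-technology capacity limits. With one demand
# constraint and simple bounds, cheapest-first (greedy) allocation is
# optimal. Costs and capacities are hypothetical, not the paper's data.

def allocate(demand, techs):
    """techs: {name: (unit_cost, capacity)}; returns {name: allocation}."""
    alloc, remaining = {}, demand
    # take from the cheapest technology first
    for name, (cost, cap) in sorted(techs.items(), key=lambda kv: kv[1][0]):
        take = min(cap, remaining)
        alloc[name] = take
        remaining -= take
        if remaining <= 0:
            break
    if remaining > 0:
        raise ValueError("capacity cannot meet demand")
    return alloc

techs = {  # hypothetical unit costs ($/MWh) and capacities (GWh)
    "solar": (45.0, 50.0), "wind": (40.0, 30.0),
    "small_hydro": (35.0, 20.0), "biomass": (60.0, 10.0),
}
print(allocate(80.0, techs))
```

Dividing each technology's allocation by total demand yields a portfolio share, analogous to the percentage mix reported above.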
Article
Open Access August 09, 2021

Optimization and Prediction of Biodiesel Yield from Moringa Seed Oil and Characterization

Abstract In this study, oil was extracted from Moringa seed using mechanical and solvent methods. To transesterify the oil into biodiesel, a 2⁴ factorial design of experiments was used to obtain different factor combinations at different levels of reaction temperature, catalyst amount, reaction time, and alcohol-to-oil ratio, giving rise to 48 experimental runs. The oil sample was transesterified [...] Read more.
In this study, oil was extracted from Moringa seed using mechanical and solvent methods. To transesterify the oil into biodiesel, a 2⁴ factorial design of experiments was used to obtain different factor combinations at different levels of reaction temperature, catalyst amount, reaction time, and alcohol-to-oil ratio, giving rise to 48 experimental runs. The oil sample was transesterified in 48 experimental runs, and in each case the biodiesel yield was recorded as a percentage. The biodiesel was then characterized according to ASTM test protocols. A factorial design model was developed using Design Expert 7.0; the model gave an R of 0.987 and a Mean Square Error (MSE) of 5.0453 and was used to predict and optimize biodiesel yield. An Artificial Neural Network (ANN) model was developed in MATLAB R2016a using 4 input variables and 30 runs; the remaining 18 runs were used to test the ANN model and compare its predicted biodiesel yield with the experimental yield, and the model gave an R value of 0.99687 and an MSE of 3.50804. It was found that the solvent method yielded more oil than the mechanical method and that the biodiesel has good thermo-physical properties. An optimum biodiesel yield of 91.45% was obtained at a 5:1 alcohol/oil molar ratio, 18.89 wt% catalyst amount, 45 minutes reaction time, and 45 °C reaction temperature. Experimental validation yielded 88.33% biodiesel. The ANN model adequately predicted the remaining 18 runs with an R2 value of 0.99649 and an MSE of 4.914243. Both models proved adequate for predicting biodiesel yield, but the ANN model performed better.
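The experimental layout can be reproduced with a short sketch: a 2⁴ full factorial design gives 16 factor combinations, and the 48 runs reported are consistent with 3 replicates per combination (an assumption here, not stated in the abstract); the low/high levels below are illustrative placeholders, not the paper's actual settings:

```python
# Hedged sketch of a 2^4 full factorial layout: 16 low/high combinations
# of 4 factors; replicating each 3 times gives 48 runs, matching the
# abstract. Factor levels are hypothetical placeholders.
from itertools import product

factors = {  # (low, high) levels - hypothetical
    "temperature_C":   (45, 60),
    "catalyst_wt_pct": (1.0, 2.0),
    "time_min":        (30, 60),
    "alcohol_oil":     (5, 7),
}

combinations = list(product(*factors.values()))             # 2**4 = 16
runs = [combo for combo in combinations for _ in range(3)]  # 3 replicates each

print(len(combinations), len(runs))  # 16 48
```

Each run's recorded yield would then feed both the Design Expert factorial model and the ANN training/test split described above.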
Article
Open Access July 17, 2021

DFT-Based Study of Physical, Chemical and Electronic Behavior of Liquid Crystals of Azoxybenzene Group: p-azoxyanisole, p-azoxyphenetole, ethyl-p-azoxybenzoate, ethyl-p-azoxycinnamate and n-octyl-p-azoxycinnamate

Abstract The present work describes the geometry and electronic structures of liquid crystals of azoxybenzene group and their reactivity with respect to molecular properties: total energy, ionization potential, electron affinity, HOMO energy, LUMO energy, electronegativity, hardness and dipole moment. Literature shows that mesomorphism depends particularly on the nature of terminal groups and their [...] Read more.
The present work describes the geometry and electronic structures of liquid crystals of the azoxybenzene group and their reactivity with respect to molecular properties: total energy, ionization potential, electron affinity, HOMO energy, LUMO energy, electronegativity, hardness, and dipole moment. The literature shows that mesomorphism depends particularly on the nature of the terminal groups and their linkages with the parent molecule; thus, substitution of terminal groups can help fine-tune liquid-crystal behavior and its applications. In this work, the effect of four terminal groups of the same and diverse nature has been studied. For the study, molecular modeling and geometry optimization of the compounds were performed in the Workspace program of Fujitsu's CAChe Pro 5.04 software using the DFT method.
Article
Open Access August 29, 2022

From Deterministic to Data-Driven: AI and Machine Learning for Next-Generation Production Line Optimization

Abstract The advancement of modern manufacturing is synonymous with the growth of automation. Automation replaces human operators, improves productivity and quality, and reduces costs. However, the initial financial cost and knowledge requirements can be barriers to embracing automation. Manufacturers are now seeking smart manufacturing, known as the fourth industrial revolution. Smart manufacturing goes [...] Read more.
The advancement of modern manufacturing is synonymous with the growth of automation. Automation replaces human operators, improves productivity and quality, and reduces costs. However, the initial financial cost and knowledge requirements can be barriers to embracing automation. Manufacturers are now pursuing smart manufacturing, a hallmark of the fourth industrial revolution. Smart manufacturing goes beyond automation and utilizes IoT, AI, and big data for optimized production. In a smart factory, production can be linked and controlled in innovative ways, leading to increased performance, agility, and reduced costs.
Review Article
Open Access December 27, 2021

Leveraging AI in Urban Traffic Management: Addressing Congestion and Traffic Flow with Intelligent Systems

Abstract Traffic congestion across the globe is a multimodal problem, intertwining vehicular, pedestrian, and bicycle traffic. The relationship between the multimodal traffic flow is a key factor in understanding urban traffic dynamics. The impact of excessive congestion extends to the excessive cost spent on traffic maintenance, as well as the inherent transportation inefficiency and delayed travel times. [...] Read more.
Traffic congestion across the globe is a multimodal problem, intertwining vehicular, pedestrian, and bicycle traffic. The relationship between the multimodal traffic flows is a key factor in understanding urban traffic dynamics. The impact of excessive congestion extends to the high cost of traffic maintenance, as well as inherent transportation inefficiency and delayed travel times. From an urban transportation standpoint, an immediate consideration is, on one hand, monitoring traffic conditions and demand cycles and, on the other hand, inducing flow modifications that benefit the traffic network and mitigate congestion. The embedded and centralized control systems that characterize modern traffic management extract traffic conditions specific to their regions but lack communication between networks. Moreover, innovative methods are required to provide more accurate, up-to-date traffic forecasts that characterize real-world traffic dynamics and facilitate optimal traffic management decisions. In this chapter, we briefly outline the main difficulties and complexities in modeling, managing, and forecasting traffic dynamics. We also compare various conventional and modern intelligent transportation strategies in terms of accuracy, applicability, and performance, and in terms of their potential for optimizing multimodal traffic flow and reducing congestion. This chapter introduces data-driven models and tools proposed for traffic flow prediction and management, investigating specific strategies' strengths, weaknesses, and benefits in addressing real-world traffic management problems. We show that the design phase of dependable Intelligent Transportation Systems imposes unique requirements on the robustness, safety, and response times of their components and the encompassing system model.
Furthermore, this architectural blueprint shares similarities with distributed coordinated search and collective adaptive systems. Models that are independent of town size can deliver systemic performance improvements through reconfigurable embedded functionality, and anytime planning techniques can ensure near-optimal performance as model complexity varies. Sustainable models minimize congestion during peaks, flooding, and emergencies while adhering to area-specific regulations. Security-aware and fail-safe traffic management systems must offer reasonable assurances of persistent operation under varied environmental settings, including metropolitan areas and complex traffic junctions. The chapter concludes by outlining challenges, research questions, and future research paths in the field of transportation management.
Review Article
Open Access December 27, 2021

Financial Implications of Predictive Analytics in Vehicle Manufacturing: Insights for Budget Optimization and Resource Allocation

Abstract Factory owners and vehicle manufacturers increasingly opt for predictive analytics to inform their decisions. While predictive analytics have been proven to provide insights into the initiation of maintenance measures before a machine actually fails, the right models and features could have a significant impact on the budget spent and resources allocated. This means that financially oriented [...] Read more.
Factory owners and vehicle manufacturers increasingly opt for predictive analytics to inform their decisions. While predictive analytics has been shown to provide insights that allow maintenance measures to be initiated before a machine actually fails, the choice of models and features can have a significant impact on the budget spent and resources allocated. This means that financially oriented questions need to at least partially guide decisions in the planning phase of data science projects. Data-driven approaches will play an increasingly important role, yet only a few firms have reported applying logistic regression models for predictive maintenance, and to the best of the available knowledge, data-driven classification models connecting vehicle component failures to the occurrence of delays at the assembly line have not been published. This paper takes a real-world, data-driven approach using classification models in predictive analytics for vehicle manufacturers, and thereby links the financial implications of such data science projects to their results. We expand the existing literature on predictive maintenance using a unique dataset from a newly launched vehicle series, presented as-is. Our research context is of interest to researchers and practitioners in the automotive industry who manage and plan final vehicle assembly with just-in-time principles, factoring in the consequences of component failures on the assembly process. A key finding is that while minor tweaking of the models is possible, their potential contribution to decision-making processes for budget optimization is limited.
Review Article
Open Access November 05, 2022

Application of Neural Networks in Optimizing Health Outcomes in Medicare Advantage and Supplement Plans

Abstract The growing complexity and variability in healthcare delivery and costs within Medicare Advantage (MA) and Medicare Supplement (Medigap) plans present significant challenges for improving health outcomes and managing expenditures. Neural networks, a subset of artificial intelligence (AI), have shown considerable promise in optimizing healthcare processes, particularly in predictive modeling, [...] Read more.
The growing complexity and variability in healthcare delivery and costs within Medicare Advantage (MA) and Medicare Supplement (Medigap) plans present significant challenges for improving health outcomes and managing expenditures. Neural networks, a subset of artificial intelligence (AI), have shown considerable promise in optimizing healthcare processes, particularly in predictive modeling, personalized treatment recommendations, and risk stratification. This paper explores the application of neural networks in enhancing health outcomes within the context of Medicare Advantage and Supplement plans. We review how deep learning models can be leveraged to predict patient risk, optimize resource allocation, and identify at-risk populations for preventive interventions. Additionally, we discuss the potential for neural networks to improve claims processing, reduce fraud, and streamline administrative burdens. By integrating various data sources, including medical records, claims data, and demographic information, neural networks enable more accurate and efficient decision-making processes. Ultimately, this approach can lead to better patient care, reduced healthcare costs, and improved satisfaction for beneficiaries of these programs. The paper concludes by highlighting the current limitations, ethical considerations, and future directions for AI adoption in the Medicare Advantage and Supplement sectors.
Review Article
Open Access December 27, 2020

Enhancing Pharmaceutical Supply Chain Efficiency with Deep Learning-Driven Insights

Abstract The growing complexity of the operating environment urges pharmaceutical innovation. This essay addresses the need for the integration of advanced technologies in the pharmaceutical supply chain. It justifies the value proposition and presents a concrete use case for the integration of deep learning insights to make data-driven decisions. The supply chain has always been a priority for the [...] Read more.
The growing complexity of the operating environment urges pharmaceutical innovation. This essay addresses the need for the integration of advanced technologies in the pharmaceutical supply chain. It justifies the value proposition and presents a concrete use case for the integration of deep learning insights to make data-driven decisions. The supply chain has always been a priority for the pharmaceutical industry; the research and development literature notes companies' increasing investment in big data strategies, with continued growth expected in big data tool adoption. The work presented herein has a preliminary, explorative character, aiming to recover and integrate evidence from partly overlooked practical experience and know-how. The practical relevance of the essay is directed toward practitioners in pharmaceutical production, supply chain management, logistics, and regulatory agencies. The literature has shown a long-term concern for enhanced performance in the pharmaceutical supply chain network. This essay demonstrates the application of deep learning-driven insights to reveal non-evident flow dependencies. The main aim is to present a comprehensive view of deep learning-driven decision support. The supply chain is portrayed holistically, seeking end-to-end visibility. Implications for public policy are discussed, such as data equity: many countries are protecting their populations and economic growth by building resilience and efficiency to ensure the capacity to move goods across supply chains. The implementation strategy is also covered. The combined reduction of variability and system noise (dampened through the inclusion of all stakeholders), together with improved efficiency and a more reliable understanding of stochastic flows through deep learning and data, results in increased responsiveness of supply chains for pharmaceutical products.
Future work involves the integration of external data, closing the loop between planning and its application in reality.
Review Article
Open Access December 27, 2021

Predictive Analytics and Deep Learning for Logistics Optimization in Supply Chain Management

Abstract Managing supply chains efficiently has become a major concern for organizations. One of the important factors to optimize in supply chain management is logistics. The advent of technology and the increase in data availability allow for the enhancement of the efficiency of logistics in a supply chain. This discussion focuses on the blending of analytics with innovation in logistics to improve the [...] Read more.
Managing supply chains efficiently has become a major concern for organizations, and logistics is one of the most important factors to optimize in supply chain management. The advent of technology and the increase in data availability allow the efficiency of logistics in a supply chain to be enhanced. This discussion focuses on blending analytics with innovation in logistics to improve the operations of a supply chain. An approach is presented showing how predictive analytics can be used to improve logistics operations. To analyze big data in logistics effectively, an artificial intelligence technique, specifically deep learning, is employed. Two case studies demonstrate the practical applicability of the proposed technique. The discussion reveals the power and potential of predictive analytics in logistics to project future KPI values from current operational data; sheds light on the innovative technique of deep learning-based predictive analytics in logistics; and suggests incorporating techniques like deep learning with predictive analytics to develop accurate forecasting, optimize operations, and prevent disruption in the supply chain. Supply chain networks have become more complex, necessitating the latest technological advancements. The sectors that have gained the most attention for applying technology to optimize their operations are manufacturing, healthcare, aerospace, and the automotive industry. Less attention has been paid to the logistics sector, although many have described how analytics and artificial intelligence can be used there to achieve higher optimization. Significant research has already been done on optimizing logistics operations.
Nevertheless, with the explosive volume of historical data being produced by the logistics operations of an organization, there is a great opportunity to learn valuable insights from the data accumulated over time for long-term strategic planning. To develop the logistics operations in an organization, the use of historical data is essential for understanding trends in the operations. For example, regular maintenance planning and resource allocation based on trends are long-term activities that will not affect logistics operations immediately but can affect the business's strategic planning in the long run. A predictive analysis technique applied to historical logistics data can draw conclusions about future trends in logistics operations. Thus, the technique can be used to prevent disruption of the supply chain.
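The KPI-projection idea described above can be sketched in a few lines. The snippet below is a minimal illustration using a simple linear trend rather than the deep learning models the discussion actually employs, and the KPI series is hypothetical:

```python
def fit_trend(history):
    """Ordinary least-squares line through (period, KPI) points."""
    n = len(history)
    t_mean = (n - 1) / 2
    y_mean = sum(history) / n
    slope = (sum((t - t_mean) * (y - y_mean) for t, y in enumerate(history))
             / sum((t - t_mean) ** 2 for t in range(n)))
    return slope, y_mean - slope * t_mean

def forecast_kpi(history, horizon):
    """Project a historical KPI series forward along its fitted trend."""
    slope, intercept = fit_trend(history)
    n = len(history)
    return [slope * t + intercept for t in range(n, n + horizon)]

# Hypothetical monthly on-time-delivery KPI values (percent).
history = [91.0, 91.5, 92.1, 92.4, 93.0, 93.6]
print([round(v, 2) for v in forecast_kpi(history, horizon=3)])  # [94.05, 94.56, 95.06]
```

A deep learning model would replace the linear fit with a network trained on many such series, but the input/output shape of the task is the same.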
Review Article
Open Access February 22, 2023

Navigating the Pharmaceutical Supply Chain: Key Strategies for Balancing Demand and Supply

Abstract
The pharmaceutical industry is fundamental to global healthcare, providing essential medicines that improve health outcomes and quality of life. However, the demand and supply dynamics within this sector are highly complex, shaped by various factors including demographic changes, evolving disease burdens, technological advancements, regulatory challenges, and economic pressures. This manuscript explores the intricate relationship between pharmaceutical medicine demand and supply, focusing on key strategies that can help companies effectively navigate these challenges. The demand for pharmaceutical products is driven by several factors, such as population growth, the aging population, the rise of chronic diseases, and the emergence of new health threats. Additionally, healthcare accessibility, affordability, and policy changes significantly impact the consumption of medicines, while innovations in medical technologies and therapies create new treatment needs. On the supply side, pharmaceutical companies face challenges related to manufacturing capacity, raw material availability, distribution logistics, and compliance with ever-evolving global regulatory frameworks. To address these challenges, the manuscript discusses strategic approaches to managing both demand and supply in the pharmaceutical sector. Key strategies include advanced demand forecasting through data analytics, optimizing supply chains for efficiency and resilience, implementing just-in-time inventory models, and investing in flexible manufacturing systems. Furthermore, global collaboration and partnerships, as well as effective risk management practices, are highlighted as essential to ensuring the availability of medicines, particularly in times of crisis or global health emergencies. This manuscript also delves into the role of policy advocacy and regulatory harmonization in stabilizing the pharmaceutical market, ensuring that medicines are accessible to all populations. 
In conclusion, the pharmaceutical industry must continually adapt to meet the evolving challenges of demand and supply, embracing innovation and collaboration while maintaining a focus on patient access and global healthcare equity. Through strategic planning and adaptive solutions, the pharmaceutical sector can ensure the continuous availability of critical medicines worldwide, meeting both current and future health needs.
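The advanced demand forecasting mentioned above can be illustrated with a deliberately simple baseline. The sketch below uses single exponential smoothing (a standard textbook technique, not one prescribed by the manuscript), with hypothetical demand figures:

```python
def exponential_smoothing(demand, alpha=0.3):
    """Single exponential smoothing: each forecast blends the latest
    observation with the previous forecast, weighted by alpha."""
    forecast = demand[0]
    for d in demand[1:]:
        forecast = alpha * d + (1 - alpha) * forecast
    return forecast

# Hypothetical monthly unit demand for a chronic-disease medicine.
demand = [1200, 1260, 1245, 1310, 1380, 1350]
print(round(exponential_smoothing(demand), 1))  # 1307.9
```

A higher alpha reacts faster to demand shifts (useful during health emergencies) at the cost of amplifying noise; production-grade forecasts would layer seasonality and external signals on top of this.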
Case Report
Open Access July 16, 2023

Pharmaceutical Supply Chain Distribution: Mitigating the Risk of Counterfeit Drugs

Abstract
The global pharmaceutical supply chain plays a crucial role in ensuring the timely and safe delivery of medicines to patients worldwide. However, the increasing presence of counterfeit drugs within this supply chain poses a significant and growing risk to public health, patient safety, and the integrity of the pharmaceutical industry. Counterfeit drugs—medications that are fraudulently manufactured, mislabeled, or contain incorrect or harmful ingredients—are a major concern as they can lead to ineffective treatments, adverse health effects, and even death. Despite stringent regulatory frameworks and advanced technological solutions, counterfeit drugs continue to infiltrate legitimate supply chains due to factors such as the complexity of the distribution system, global trade practices, and inadequate enforcement in certain regions. This manuscript explores the primary causes behind the proliferation of counterfeit drugs in pharmaceutical distribution, the associated risks, and the multifaceted approaches required to address this growing threat. It discusses the importance of regulatory measures, including international cooperation and stronger compliance frameworks, as well as the role of emerging technologies like serialization, blockchain, and RFID in ensuring traceability and product authenticity. By focusing on the integration of these technologies, the paper also highlights the potential of innovative solutions to enhance transparency, reduce vulnerabilities, and protect the integrity of pharmaceutical supply chains. Additionally, it emphasizes the importance of public awareness campaigns and collaboration between key stakeholders, including pharmaceutical manufacturers, distributors, regulators, and healthcare providers, in creating a more secure and trustworthy pharmaceutical distribution ecosystem. 
Through a comprehensive exploration of these strategies, this manuscript aims to provide a roadmap for mitigating the risks posed by counterfeit drugs and ensuring the safety and efficacy of medicines for consumers worldwide.
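The serialization-and-traceability idea can be sketched as a hash chain of custody events, which is the core mechanism blockchain-based track-and-trace systems rely on. This is an illustrative toy, not a description of any deployed system; the serial numbers and actors are invented:

```python
import hashlib
import json

def add_event(chain, event):
    """Append a custody event whose hash covers the previous link,
    so any later tampering breaks every subsequent link."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"event": event, "prev": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)

def verify(chain):
    """Recompute every hash and check the links line up."""
    prev = "0" * 64
    for record in chain:
        body = {"event": record["event"], "prev": record["prev"]}
        body_hash = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if record["prev"] != prev or record["hash"] != body_hash:
            return False
        prev = record["hash"]
    return True

chain = []
add_event(chain, {"serial": "PKG-0001", "actor": "manufacturer"})
add_event(chain, {"serial": "PKG-0001", "actor": "distributor"})
print(verify(chain))                      # True for an untampered chain
chain[0]["event"]["actor"] = "counterfeiter"
print(verify(chain))                      # False after tampering
```

Real serialization schemes add signatures and a shared ledger so no single party can rewrite the chain, but the tamper-evidence principle is the same.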
Review Article
Open Access January 10, 2022

Composable Infrastructure: Towards Dynamic Resource Allocation in Multi-Cloud Environments

Abstract
To ensure maximum flexibility, service providers offer a variety of computing options with regard to CPU, memory capacity, and network bandwidth. At the same time, the efficient operation of current cloud applications requires an infrastructure that can adjust its configuration continuously across multiple dimensions, which are generally not statically predefined. Our research shows that these requirements are rarely met by today's typical public cloud and management approaches. To provide such a highly dynamic and flexible execution environment, we propose the application-driven autonomic management of data center resources as the core vision for the development of a future cloud infrastructure. As part of this vision and the required gradual progress toward it, we present the concept of composable infrastructure and its impact on resource allocation in multi-cloud environments. We introduce relevant techniques for optimizing resource allocation strategies and indicate future research opportunities [1]. Many cloud service providers offer computing instances that can be configured with arbitrary capacity, depending on the availability of certain hardware resources. This level of configurability provides customers with the desired flexibility for executing their applications. Because of the large number of such configurable instances, often with varying characteristics, service consumers must invest considerable effort to set up or reconfigure elaborate resource provisioning systems. Most importantly, they must differentiate the loads to be distributed between jobs that need to be executed and placeholder jobs, i.e., jobs that trigger the automatic elasticity functionality responsible for resource allocator reconfiguration. Operations research shows that optimizing resource allocator reconfiguration strategies is a fundamentally difficult problem due to its NP-hardness.
Despite these challenges, dynamic resource allocation in multi-cloud environments is becoming increasingly important, since modern Internet-based service settings are dispersed across multiple providers [2].
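Because exact reconfiguration is NP-hard, practical allocators fall back on heuristics. The sketch below applies first-fit decreasing, a classic bin-packing heuristic, to placing jobs on instances; the job demands and instance capacity are hypothetical:

```python
def first_fit_decreasing(job_demands, instance_capacity):
    """Greedy heuristic for an NP-hard allocation problem: place each job
    (largest demand first) on the first instance with enough spare capacity,
    provisioning a new instance only when none fits."""
    free = []        # remaining capacity per provisioned instance
    assignment = {}  # job -> instance index
    for job, demand in sorted(job_demands.items(), key=lambda kv: -kv[1]):
        for i, spare in enumerate(free):
            if demand <= spare:
                free[i] -= demand
                assignment[job] = i
                break
        else:
            free.append(instance_capacity - demand)
            assignment[job] = len(free) - 1
    return assignment, len(free)

jobs = {"j1": 6, "j2": 5, "j3": 4, "j4": 3, "j5": 2}  # hypothetical vCPU demands
assignment, used = first_fit_decreasing(jobs, instance_capacity=8)
print(used)  # 3
```

First-fit decreasing is not optimal in general, but it never uses more than roughly 11/9 of the optimal number of bins, which is the kind of guarantee allocators trade for tractability.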
Review Article
Open Access December 27, 2019

Predictive Analytics in Biologics: Improving Production Outcomes Using Big Data

Abstract
Biopharmaceuticals, or biologics, are a burgeoning sector in the pharmaceutical industry, predicted to reach $239.4 billion by 2025. This unparalleled growth is often attributed to the enhanced specificity offered by large molecules over small molecules. The large size of the constituent proteins necessitates the continuous implementation of big data predictive analytics to elucidate the most effective candidates in the lead optimization process. These same methodologies can also be applied to the augmentation and optimization of the downstream production processes that comprise the majority of the development cost of any biologic, and with the advent of machine learning and automated predictive analytics, this is becoming an increasingly facile task. In this work, big data from cell line generation, product and process design, and large-scale lead validation studies have been used to compare the applicability of simple statistical models against black-box approaches for the rapid acceleration of enzymes to the pilot plant stage. This research can be expanded to exploit the big datasets generated as biologics progress through the development pipeline, further optimizing production outcomes. Over the coming months, data from the project will be used to probe which approaches suit which processes and which are, as a result, better suited to various economic simulations. The computed optimization objective for a hit must include the cost of acquiring, storing, and analyzing the data used to construct these predictive models, alongside the expected commercial reward of choosing an optimally ranked candidate. In this vein, perspective must be taken on the probable future price, capability, and ownership issues of increasingly sophisticated data analysis software as such superstructures become more frequent.
It is frequently stated that decisions made to reduce production costs are data-driven, but that is often not the case when more economically or energetically costly experiments or production methods are involved; to truly evaluate production steps, dynamic energy and economic models need to become more commonplace. Converting process quality approaches from large questionnaires, risk analysis, and expert-opinion-driven methods to statistical, and thus more reliable, approaches is an area of future research for the analytics used herein.
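The optimization objective described above (expected commercial reward minus data and experiment costs) can be written down directly. The candidates, probabilities, and costs below are purely hypothetical:

```python
def expected_net_value(p_success, reward, data_cost, experiment_cost):
    """Objective sketch: expected commercial reward of advancing a
    candidate, net of the cost of acquiring/storing/analyzing the data
    used to rank it and the cost of the experiments themselves."""
    return p_success * reward - data_cost - experiment_cost

# Hypothetical candidates: (success probability, reward $M, data cost $M, experiment cost $M)
candidates = {
    "enzyme_A": (0.30, 50.0, 1.5, 4.0),   # likely but modest payoff
    "enzyme_B": (0.15, 120.0, 2.0, 4.0),  # riskier but larger payoff
}
best = max(candidates, key=lambda c: expected_net_value(*candidates[c]))
print(best)  # enzyme_B
```

Note that the ranking flips if data costs rise faster than the model's predictive lift, which is exactly the trade-off the abstract says the objective must capture.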
Review Article
Open Access December 27, 2019

Data Engineering Frameworks for Optimizing Community Health Surveillance Systems

Abstract
A changing world demands optimized health surveillance systems, and data engineering can help. There is a growing urgency to manage public health and emergency response practices effectively today, in light of complex and emerging health threats. Fortunately, a host of new tools has emerged, including big and streaming data sources, methods such as machine learning, technologies like blockchain and secure enclaves, and new means of data storage and retrieval. But with these innovations comes a grand challenge: how to blend them with, and adapt them to, traditional public health practices. The long-standing infrastructures and protocols that protect and ensure the welfare of communities are in need of change, or at least an update, so they can continue to improve the health outcomes and community wellbeing they were designed to fortify. It is in this vein that this essay is written. The essay investigates what the effects and influences might be of the emerging wealth of new data engineering frameworks that are either being developed specifically for health surveillance and wellness, or are available to be co-opted from devices and services already thriving in the current market and research milieu. Knowing what these effects may be could aid in shaping their uptake and spread, ensuring their beneficial impacts on the communities that stand to gain the most. The essay is divided into several key segments. After this introduction, section two details the research methods. The section that follows reviews the maximum health-outcome potential of these novel frameworks. Part four takes a more critical approach, addressing how the success of these methods may be hindered and outlining future research avenues.
Lastly, the conclusion suggests actions to support the implementation of these frameworks and offers some thoughts for further research after the completion of these inquiries [1].
Case Report
Open Access December 27, 2021

Advanced Computational Technologies in Vehicle Production, Digital Connectivity, and Sustainable Transportation: Innovations in Intelligent Systems, Eco-Friendly Manufacturing, and Financial Optimization

Abstract
This paper examines the impacts of the Internet of Things (IoT), Big Data, and other emerging technologies on the vehicle production sector, digital connectivity, and sustainable transport systems. Automated and intelligent transportation for safe, efficient, and sustainable transport systems is stressed. Key factors that promote automated or connected vehicles, including a connected environment, integration of all transport modes, advanced cooperative systems, and policy enforcement, are discussed. The paper presents an Axiomatic Categorisation Framework (ACF) for the dynamic alignment of a collection of disparate functions in cyber-physical systems (CPS). The developed system is designed to handle rule-breaking by autonomous vehicles (AVs), recognizing that human injury may be unavoidable when a vehicle cannot obey every rule. Especially in complicated traffic situations, many of the constraints are mutually exclusive, and there is no way to obey all of them at once. Nor is there a way to ensure that the self-driving vehicle has priority in all situations [1]. Public distrust in AV systems has increased, and investment in this technology has slowed down. Instead, a human driver should remain partially responsible for operation. The development of a driver-behavior assistant (DBA) system is proposed, which should be able to break rules safely during this period of gradual development. It is intended to be effective in non-deterministic situations while maintaining the safety of the AV and those involved in an event. The driver's actions would then be not only acceptable as a driving strategy but also predictable, so other road users could react unambiguously.
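Mutually exclusive constraints of the kind described above are often handled by weighted rule relaxation: when no action satisfies every rule, choose the action whose violated rules carry the least total weight. The sketch below is one plausible mechanism, not the paper's ACF; the rules, weights, and actions are invented:

```python
def choose_action(actions, rules):
    """Pick the action minimizing the total weight of violated rules.
    `rules` maps a predicate over an action to the penalty for breaking it."""
    def violation_cost(action):
        return sum(weight for rule, weight in rules.items() if not rule(action))
    return min(actions, key=violation_cost)

# Hypothetical rules with safety-oriented weights (higher = more important).
rules = {
    (lambda a: a["keeps_lane"]): 1.0,           # stay in lane
    (lambda a: not a["hits_obstacle"]): 100.0,  # avoid collision
}
actions = [
    {"name": "brake_in_lane", "keeps_lane": True,  "hits_obstacle": True},
    {"name": "swerve",        "keeps_lane": False, "hits_obstacle": False},
]
print(choose_action(actions, rules)["name"])  # swerve: breaking the lane rule is cheaper
```

The weights encode the priority ordering among conflicting constraints, which is exactly the decision the abstract says cannot be avoided in complicated traffic situations.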
Review Article
Open Access December 27, 2021

Revolutionizing Risk Assessment and Financial Ecosystems with Smart Automation, Secure Digital Solutions, and Advanced Analytical Frameworks

Abstract
For years, risk assessment and financial calculations have been based on mathematical, statistical, and actuarial studies of existing and historical data. The manual process of building datasets, processing data, deriving trends, identifying periodicities, and analyzing diagnostics is extremely expensive and time-consuming. With the automation and evolution of data science technologies, organizations are now bringing in niche data, such as unstructured data, which contain more disruptive and precise signals for decision-making, thereby making predictions and derivative valuations more robust. This discussion highlights how investment decision-making and financial ecosystem activities are set to be transformed by the power of technical automation, data, and artificial intelligence. A noted trend in the financial investment sector is that financial valuations are highly predictive and highly non-linear over long-term horizons. To understand these robust evolving signals and execute profitable strategies upon them, the investment management process needs to be very dynamic, open, smart, and technically deep. However, with current manual processes, reaching high-end asset prediction still seems like a shot in the dark. In parallel, open and democratically developed financial ecosystems seek relatively low-risk premium opportunities in high-finance valuation and perception. Evolving financial ecosystems, together with automated tools and data that open unique frontiers, could make high-yield profit opportunities considerably safer. Financial economic theories and realistic approximation models support this.
Review Article
Open Access December 27, 2020

Designing Self-Learning Agentic Systems for Dynamic Retail Supply Networks

Abstract
The evolution of supply chains (SC) from a linear to a network structure created an opportunity for new processes, product/service offerings, and provider-business relationships. Rising customer service expectations have led to the need for innovative SC designs to develop and sustain competitive performance globally. Firms are forced to respond and adapt accordingly, leading to design, network, operational, and performance dynamics. Traditionally, SCs are treated as static structures, focusing solely on design and/or operational optimization. Such perspectives are not viable for SC domains, as they address only a portion of the dynamic problem space, rest on deterministic assumptions about dominant design variables, capitalize on past data to predict future decisions, and offer pre-classified forecasting options complemented with a limited comprehension of systemic SC elasticity. Novel self-learning agentic systems are proposed that blend the science of SC decisions and dynamics. The designs guide firms seeking to build adaptive SCs using operational decision processes. The designs address the agentic nature of SCs, embedding computational interaction models of firm SC networks. The designs contrast stochastic action-taking, and thereby performance outcomes, discovering opportunities for adaptive operational designs of SC tasks. Fine-tuning and meta-learning are new design capabilities that adapt to evolving dynamic environments. Frameworks for behavioral customization and systematic exploration of the design space are provided as user guides. Exemplar designs are also provided to serve as translation templates for users to express operational models of their own contexts. To account for the dynamics of supply chains, agent-based models are increasingly adopted. Such models exhibit SC structure and/or formulation dynamics.
Though existing efforts address only adjacent structural changes, dynamism with respect to tasks is crucial for SC design and operational strategy development. A process modeling library and workflow are proposed for discovering intricate designs of adaptive agentic systems. The library covers dataflow and structure, encapsulating the sequencing and context designs of processes. Prompted specifications describe and enact the designs. Applications in SC formulation discovery are provided.
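A minimal agent-based supply-chain sketch, assuming a single retailer agent with an (s, S) replenishment policy; this is far simpler than the self-learning designs proposed above, and all parameters and demand figures are hypothetical:

```python
class RetailerAgent:
    """Minimal supply-chain agent following an (s, S) inventory policy:
    when stock falls below the reorder point s, order up to level S."""
    def __init__(self, reorder_point, order_up_to):
        self.s, self.S = reorder_point, order_up_to
        self.inventory = order_up_to
        self.stockouts = 0

    def step(self, demand):
        if demand > self.inventory:
            self.stockouts += 1          # demand exceeded stock this period
        self.inventory = max(0, self.inventory - demand)
        if self.inventory < self.s:      # replenish (instant delivery assumed)
            self.inventory = self.S

agent = RetailerAgent(reorder_point=20, order_up_to=60)
for demand in [25, 30, 10, 28, 30]:      # hypothetical per-period demand
    agent.step(demand)
print(agent.stockouts, agent.inventory)  # 1 60
```

An adaptive, self-learning variant would tune s and S from observed outcomes (e.g., via meta-learning over demand regimes) instead of fixing them, and would embed many such agents in a network so their orders become each other's demand.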
Review Article
Open Access December 29, 2020

Enhancing Government Fiscal Impact Analysis with Integrated Big Data and Cloud-Based Analytics Platforms

Abstract
While several application domains are exploiting the added value of analytics over various datasets to obtain actionable insights and drive decision making, the public policy management domain has not yet taken advantage of the full potential of such analytics and data models. To this end, in this paper the authors present an overall architecture of a cloud-based environment that facilitates data retrieval and analytics, as well as policy modelling, creation and optimization. The environment enables data collection from heterogeneous sources, linking and aggregation, complemented with data cleaning and interoperability techniques. An innovative approach for analytics as a service is introduced and linked with a policy development toolkit, an integrated web-based environment built to fulfil the requirements of public policy ecosystem stakeholders [1]. Large information databases on various public issues exist, but their usage for public policy formulation and impact analysis has been limited so far, as no cloud-based service ecosystem exists to facilitate their efficient exploitation. With the increasing availability and importance of both big and traditional public data, the need to extract, link and utilize such information efficiently has arisen. Current data-driven web technologies and models are not aligned with the needs of this domain, and therefore potential candidates for big data, cloud-based and service-oriented public policy analysis solutions should be investigated, piloted and demonstrated [2]. This paper presents the conceptual architecture of such an ecosystem based on the capabilities of state-of-the-art cloud and web technologies, as well as the requirements of its users.
Review Article
Open Access December 21, 2021

Optimizing Data Warehousing for Large Scale Policy Management Using Advanced ETL Frameworks

Abstract
Data warehousing is a technique for collecting, managing, and presenting data to help people analyze and use that data effectively. It involves a large database designed to support management-level staff by providing all the relevant historical data for analysis. This chapter begins with a definition of data warehousing, followed by an overview of large-scale policy management to highlight the need for data warehousing. Next, an overview of an ETL framework is presented, along with a discussion of advanced ETL techniques. The chapter concludes with an outline of performance optimization techniques for data warehousing. Data warehousing is considered a key enabler for efficient reporting and analysis, with implementation choices ranging from cost-effective desktop systems to large-scale, mission-critical data marts and warehouses containing petabytes of data. Extract, transform, and load (ETL) systems remain one of the largest cost and effort areas within data warehouse development projects, requiring significant planning and resources to build, manage, and monitor the flow of data from source systems into the data warehouse. The technology and techniques used for ETL can greatly influence the success or failure of a data warehouse. Complex business requirements for data cleansing, loading, transformation, and integration have intensified, while operational plans for real-time and near-real-time reporting add additional challenges. Parallel loading mechanisms, incremental data loading, and runtime update and insert strategies not only improve ETL performance but also optimize data warehousing performance, particularly for large-scale policy management.
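The incremental loading and upsert strategies mentioned above can be sketched with a high-water-mark pattern: each run pulls only rows changed since the previous run and upserts them by business key. The table and field names below are hypothetical:

```python
def incremental_upsert(warehouse, source_rows, last_load_ts):
    """Incremental ETL load: take only rows updated since the previous
    high-water mark and upsert them (update if the business key exists,
    insert otherwise). Returns the new high-water mark."""
    max_ts = last_load_ts
    for row in source_rows:
        if row["updated_at"] > last_load_ts:
            warehouse[row["policy_id"]] = row   # upsert by business key
            max_ts = max(max_ts, row["updated_at"])
    return max_ts

warehouse = {}
batch = [
    {"policy_id": "P-1", "premium": 100, "updated_at": 5},
    {"policy_id": "P-2", "premium": 200, "updated_at": 7},
]
hwm = incremental_upsert(warehouse, batch, last_load_ts=0)
# Later run: only P-1 changed since the high-water mark, so P-2 is skipped.
batch2 = [
    {"policy_id": "P-1", "premium": 110, "updated_at": 9},
    {"policy_id": "P-2", "premium": 200, "updated_at": 7},
]
hwm = incremental_upsert(warehouse, batch2, last_load_ts=hwm)
print(len(warehouse), warehouse["P-1"]["premium"], hwm)  # 2 110 9
```

In a real ETL framework, the same logic runs as a `MERGE` statement or a partitioned parallel load, but the high-water mark is what keeps each run proportional to the change volume rather than to the full policy table.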
Article
Open Access December 22, 2020

Cloud Migration Strategies for High-Volume Financial Messaging Systems

Abstract
Key business objectives for digital infrastructure cloud adoption are often framed in terms of reducing cost, improving fault tolerance and resilience, simplifying scale, and enabling innovation. Given the critical nature of the financial sector, however, where timeliness and price can significantly determine an outcome, cloud migration in delivery environments demands greater throughput on the critical path and, in many enterprise-scale settings, forgoes hybrid complexity and multi-cloud risks. Nevertheless, slack in system designs does exist: financial institutions enable market functionality (trading, clearing, best execution) despite potentially being able to meet such requirements with lower service levels than other verticals. A cloud multi-account structure for sensitive data, for example, naturally limits exposure when combined with observed risk. Fulfilling predictions of elasticity during periods of high demand usually requires support from a dedicated environment (or environments) located nearer to the operations. Components can consequently be allocated on a per-account basis or maintained as shared sink systems to which the dedicated streams write. The automation code can similarly be targeted at dedicated accounts, avoiding the resource constraints that beset such operations during industry events like emergency triage/contact desking.
Review Article
Open Access July 20, 2021

Quality of Experience (QoE) and Network Performance Modelling for Multimedia Traffic

Abstract
This research explores the complex relationship between user-perceived Quality of Experience (QoE) and underlying network performance for multimedia traffic. As video streaming, online gaming, and interactive media dominate modern networks, ensuring consistent QoE has become a key challenge. The study develops a network performance model that integrates objective Quality of Service (QoS) parameters—such as delay, jitter, packet loss, and throughput—with subjective QoE metrics like Mean Opinion Score (MOS) and perceptual quality indices. Using simulation-based and analytical approaches, the paper evaluates how network conditions affect multimedia traffic behavior and user satisfaction. The results highlight critical thresholds for QoE degradation, enabling predictive modeling for adaptive multimedia delivery and real-time optimization. This work contributes to designing intelligent, user-centered network management systems capable of balancing resource efficiency and end-user satisfaction.
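A toy version of the QoS-to-QoE mapping described above might look as follows. The thresholds and penalty weights are invented for illustration and do not correspond to any standardized model such as the ITU-T E-model:

```python
import math

def estimate_mos(delay_ms, jitter_ms, loss_pct):
    """Illustrative (non-standard) mapping from QoS metrics to a 1-5
    Mean Opinion Score: each impairment subtracts from a perfect score."""
    effective_delay = delay_ms + 2 * jitter_ms            # jitter as added delay
    delay_penalty = 0.01 * max(0.0, effective_delay - 150)  # threshold at 150 ms
    loss_penalty = 0.5 * math.log1p(loss_pct)             # diminishing loss impact
    return max(1.0, min(5.0, 5.0 - delay_penalty - loss_penalty))

print(round(estimate_mos(delay_ms=80, jitter_ms=10, loss_pct=0.0), 2))   # 5.0
print(round(estimate_mos(delay_ms=250, jitter_ms=30, loss_pct=2.0), 2))  # degraded score
```

The useful property this toy shares with real models is the existence of degradation thresholds: below them, extra network resources buy no perceptible QoE, which is what makes predictive, threshold-aware resource management worthwhile.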
Review Article
Open Access June 28, 2016

Scalable Task Scheduling in Cloud Computing Environments Using Swarm Intelligence-Based Optimization Algorithms

Abstract
Effective task scheduling in cloud computing is crucial for optimizing system performance and resource utilization. Traditional scheduling methods often struggle to adapt to the dynamic and complex nature of cloud environments, where workloads, resource availability, and task requirements constantly change. Swarm intelligence-based optimization algorithms, such as Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), and Artificial Bee Colony (ABC), offer a promising solution by mimicking natural processes to explore large search spaces efficiently. These algorithms are effective in balancing multiple objectives, including minimizing execution time, reducing energy consumption, and ensuring fairness in resource allocation. They also enhance system scalability, which is vital for modern cloud infrastructures. However, challenges remain, including slow convergence speeds, complex parameter tuning, and integration with existing cloud frameworks. Addressing these issues will be essential for the practical implementation of swarm intelligence in cloud task scheduling, helping to improve resource management and overall system performance.
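A minimal PSO sketch for task-to-machine scheduling, minimizing makespan. The continuous-to-discrete decoding and all parameter values are illustrative choices, not taken from any specific study:

```python
import random

def makespan(assignment, task_times, n_machines):
    """Completion time of the busiest machine under an assignment."""
    loads = [0.0] * n_machines
    for task, machine in enumerate(assignment):
        loads[machine] += task_times[task]
    return max(loads)

def pso_schedule(task_times, n_machines, n_particles=20, iters=100, seed=1):
    """Minimal PSO: continuous positions are rounded to machine indices;
    particles are pulled toward their personal best and the global best."""
    rng = random.Random(seed)
    n = len(task_times)
    def decode(pos):
        return [min(n_machines - 1, max(0, int(round(x)))) for x in pos]
    swarm = [[rng.uniform(0, n_machines - 1) for _ in range(n)]
             for _ in range(n_particles)]
    vel = [[0.0] * n for _ in range(n_particles)]
    pbest = [p[:] for p in swarm]
    pbest_val = [makespan(decode(p), task_times, n_machines) for p in swarm]
    g = pbest[min(range(n_particles), key=lambda i: pbest_val[i])][:]
    g_val = min(pbest_val)
    for _ in range(iters):
        for i, p in enumerate(swarm):
            for d in range(n):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]                   # inertia
                             + 1.5 * r1 * (pbest[i][d] - p[d])  # cognitive pull
                             + 1.5 * r2 * (g[d] - p[d]))        # social pull
                p[d] += vel[i][d]
            val = makespan(decode(p), task_times, n_machines)
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = p[:], val
                if val < g_val:
                    g, g_val = p[:], val
    return decode(g), g_val

tasks = [4, 7, 2, 5, 3, 6, 1]  # hypothetical task execution times
schedule, best = pso_schedule(tasks, n_machines=2)
print(best)
```

The rounding-based decoding is one of the simplest ways to apply a continuous optimizer to a discrete scheduling problem; ACO and ABC variants differ mainly in how they generate and share candidate assignments, and all of them face the parameter-tuning and convergence issues the abstract notes.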