Open Access February 06, 2026

Predictive Modeling of Public Sentiment Using Social Media Data and Natural Language Processing Techniques

Abstract
Social media platforms like X (formerly Twitter) generate vast volumes of user-generated content that provide real-time insights into public sentiment. Despite the widespread use of traditional machine learning methods, their limitations in capturing contextual nuances in noisy social media text remain a challenge. This study leverages the Sentiment140 dataset, comprising 1.6 million labeled tweets, and develops predictive models for binary sentiment classification using Naive Bayes, Logistic Regression, and the transformer-based BERT model. Experiments were conducted on a balanced subset of 12,000 tweets after comprehensive NLP preprocessing. Evaluation using accuracy, F1-score, and confusion matrices revealed that BERT significantly outperforms traditional models, achieving an accuracy of 89.5% and an F1-score of 0.89 by effectively modeling contextual and semantic nuances. In contrast, Naive Bayes and Logistic Regression demonstrated reasonable but consistently lower performance. To support practical deployment, we introduce SentiFeel, an interactive tool enabling real-time sentiment analysis. While resource constraints limited the dataset size and training epochs, future work will explore full corpus utilization and the inclusion of neutral sentiment classes. These findings underscore the potential of transformer models for enhanced public opinion monitoring, marketing analytics, and policy forecasting.
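The kind of traditional baseline the study compares against (Naive Bayes over a bag of words) can be sketched in a few lines. This is a minimal illustration with hypothetical toy tweets, not the paper's Sentiment140 pipeline or its preprocessing.

```python
import math
from collections import Counter

def train_nb(docs):
    """Fit a word-level multinomial Naive Bayes model with add-one smoothing.
    docs: list of (text, label) pairs with label in {"pos", "neg"}."""
    counts = {"pos": Counter(), "neg": Counter()}
    class_totals = Counter()
    for text, label in docs:
        counts[label].update(text.lower().split())
        class_totals[label] += 1
    vocab = set(counts["pos"]) | set(counts["neg"])
    return counts, class_totals, vocab

def predict_nb(model, text):
    counts, class_totals, vocab = model
    n_docs = sum(class_totals.values())
    scores = {}
    for label in ("pos", "neg"):
        # log prior + sum of smoothed log likelihoods
        score = math.log(class_totals[label] / n_docs)
        total = sum(counts[label].values())
        for tok in text.lower().split():
            score += math.log((counts[label][tok] + 1) / (total + len(vocab)))
        scores[label] = score
    return max(scores, key=scores.get)

# Toy illustration (hypothetical tweets, not Sentiment140 data)
model = train_nb([
    ("i love this great phone", "pos"),
    ("what a wonderful happy day", "pos"),
    ("i hate this terrible phone", "neg"),
    ("what an awful sad day", "neg"),
])
print(predict_nb(model, "love this wonderful day"))  # -> pos
```

A bag-of-words model like this ignores word order and context, which is exactly the limitation that lets the contextual BERT model pull ahead on noisy tweets.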
Article
Open Access January 11, 2025

Exploring LiDAR Applications for Urban Feature Detection: Leveraging AI for Enhanced Feature Extraction from LiDAR Data

Abstract
The integration of LiDAR and Artificial Intelligence (AI) has revolutionized feature detection in urban environments. LiDAR systems, which utilize pulsed laser emissions and reflection measurements, produce detailed 3D maps of urban landscapes. When combined with AI, this data enables accurate identification of urban features such as buildings, green spaces, and infrastructure. This synergy is crucial for enhancing urban development, environmental monitoring, and advancing smart city governance. LiDAR, known for its high-resolution 3D data capture capabilities, paired with AI, particularly deep learning algorithms, facilitates advanced analysis and interpretation of urban areas. This combination supports precise mapping, real-time monitoring, and predictive modeling of urban growth and infrastructure. For instance, AI can process LiDAR data to identify patterns and anomalies, aiding in traffic management, environmental oversight, and infrastructure maintenance. These advancements not only improve urban living conditions but also contribute to sustainable development by optimizing resource use and reducing environmental impacts. Furthermore, AI-enhanced LiDAR is pivotal in advancing autonomous navigation and sophisticated spatial analysis, marking a significant step forward in urban management and evaluation. The reviewed paper highlights the geometric properties of LiDAR data, derived from spatial point positioning, and underscores the effectiveness of machine learning algorithms in object extraction from point clouds. The study also covers concepts related to LiDAR imaging, feature selection methods, and the identification of outliers in LiDAR point clouds. Findings demonstrate that AI algorithms, especially deep learning models, excel in analyzing high-resolution 3D LiDAR data for accurate urban feature identification and classification. 
These models leverage extensive datasets to detect patterns and anomalies, improving the detection of buildings, roads, vegetation, and other elements. Automating feature extraction with AI minimizes the need for manual analysis, thereby enhancing urban planning and management efficiency. Additionally, AI methods continually improve with more data, leading to increasingly precise feature detection. The results indicate that the pulse emitted by continuous wave LiDAR sensors changes when encountering obstacles, causing discrepancies in measured physical parameters.
Article
Open Access January 10, 2025

Artificial Immune Systems: A Bio-Inspired Paradigm for Computational Intelligence

Abstract
Artificial Immune Systems (AIS) are bio-inspired computational frameworks that emulate the adaptive mechanisms of the human immune system, such as self/non-self discrimination, clonal selection, and immune memory. These systems have demonstrated significant potential in addressing complex challenges across optimization, anomaly detection, and adaptive system control. This paper provides a comprehensive exploration of AIS applications in domains such as cybersecurity, resource allocation, and autonomous systems, highlighting the growing importance of hybrid AIS models. Recent advancements, including integrations with machine learning, quantum computing, and bioinformatics, are discussed as solutions to scalability, high-dimensional data processing, and efficiency challenges. Core algorithms, such as the Negative Selection Algorithm (NSA) and Clonal Selection Algorithm (CSA), are examined, along with limitations in interpretability and compatibility with emerging AI paradigms. The paper concludes by proposing future research directions, emphasizing scalable hybrid frameworks, quantum-inspired approaches, and real-time adaptive systems, underscoring AIS's transformative potential across diverse computational fields.
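The Negative Selection Algorithm named above can be sketched briefly: random detectors are generated and any that match "self" samples are discarded, so the survivors react only to non-self (anomalous) input. The bit-string representation and Hamming-distance matching rule here are illustrative assumptions, not a specific published variant.

```python
import random

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def generate_detectors(self_set, n_detectors, length, radius, seed=0):
    """Keep only random bit-string detectors that match NO self sample
    (match = Hamming distance <= radius)."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        cand = tuple(rng.randint(0, 1) for _ in range(length))
        if all(hamming(cand, s) > radius for s in self_set):
            detectors.append(cand)
    return detectors

def is_anomalous(sample, detectors, radius):
    # A sample is flagged if any surviving detector matches it.
    return any(hamming(sample, d) <= radius for d in detectors)

self_set = [(0, 0, 0, 0, 0, 0), (0, 0, 0, 0, 0, 1)]   # "normal" patterns
dets = generate_detectors(self_set, n_detectors=20, length=6, radius=1)
print(is_anomalous((0, 0, 0, 0, 0, 0), dets, radius=1))  # self is never flagged
print(is_anomalous((1, 1, 1, 1, 1, 1), dets, radius=1))  # non-self may be flagged
```

Because every detector is censored against the self set at generation time, false alarms on known-normal data are impossible by construction; coverage of the non-self space depends on the number of detectors.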
Article
Open Access February 17, 2024

An Overview of Short- and Long-Term Adverse Outcomes and Complications of Perinatal Depression on Mother and Offspring

Abstract
According to the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition (DSM-5), an antenatal or postpartum major depressive episode (MDE) is defined as daily sustained sad mood or loss of enjoyment or desire for a minimum of two weeks, plus four associated manifestations (only three if both major symptoms are present), beginning during pregnancy or within the first four weeks postpartum, respectively: 1) unintentional, notable weight loss or gain; 2) hypersomnia or insomnia; 3) sensation of tiredness; 4) sensation of guilt or futility; 5) declined concentration capacity; 6) frequent suicidal thoughts; 7) psychomotor excitation or delay. Perinatal depression carries serious adverse consequences for the mother's psychosocial life, for pregnancy and delivery outcomes, and for her relationships, particularly with the newborn, whose overall health is poorer; its negative influence on the offspring extends from intrauterine life through a complicated delivery and a hard, unstable childhood into unhealthy adolescence and adulthood. These negative consequences demand close attention to prevention, screening, and prompt treatment of antenatal and postnatal depression to avert such disastrous effects.
Brief Review
Open Access November 30, 2022

A Review of Application of LiDAR and Geospatial Modeling for Detection of Buildings Using Artificial Intelligence Approaches

Abstract
Today, three-dimensional modeling of real-world features is very important and widely used, and it has attracted researchers in various fields, including surveying and spatial information systems, as well as those interested in the three-dimensional reconstruction of buildings. Buildings are the key element of information in a three-dimensional city model, so extracting and modeling buildings from remote sensing data is an important step in constructing a digital model of a city. LiDAR technology, with its ability to map in one-, two-, and three-dimensional modes, is a suitable solution for providing hyperspectral, comprehensive images of buildings in an urban environment. This review article comprehensively surveys the methods used to identify buildings from the past to the present and discusses appropriate directions for the future.
Review Article
Open Access November 29, 2022

The Application of Machine Learning in the Corona Era, With an Emphasis on Economic Concepts and Sustainable Development Goals

Abstract
The aim of this article is to examine the impacts of Coronavirus Disease 2019 (COVID-19) vaccines on economic conditions and the sustainable development goals; in other words, we study the economic situation during COVID-19. We examine the economic costs of the pandemic; the benefits in terms of gross domestic product (GDP), public finances, and employment; worldwide investment in vaccines and its progress; the overall economic impacts of vaccines; and the impact of emerging markets (EM) on achieving the sustainable development goals (SDGs), including no poverty, good health and well-being, zero hunger, and reduced inequality. The importance of emerging economies in reducing the harmful effects of the coronavirus is also noted. As a case study, we forecast daily new death cases in Iran from February 2020 to August 2021 using an Artificial Neural Network (ANN) with the Beetle Antennae Search (BAS) algorithm, alongside econometric models and regression analysis. The findings show that COVID-19 has had devastating economic and health effects on the world, and that vaccination can be very helpful in eliminating these effects, especially in the long term. We observe inequality in the distribution of vaccines between rich and poor countries, a gap that emerging markets can help narrow. The results show that the two approaches (i.e., artificial intelligence (AI) and econometric models) yield broadly similar results, but AI optimization can make the model and its predictions more robust. The main contribution of this article is that we survey the impacts of vaccination from a socio-economic viewpoint rather than merely reporting facts: we examine the impacts of vaccines on the sustainable development goals and the role of EMs in achieving the SDGs. In addition to the theoretical framework, we also use quantitative and empirical results that have rarely appeared in other articles.
Article
Open Access June 28, 2025

Development of a Hemodialysis Data Collection and Clinical Information System and Establishment of an Intradialytic Blood Pressure/Pulse Rate Predictive Model

Abstract
This research is a cross-disciplinary initiative in the field of the Artificial Intelligence of Things (AIoT) within the medical informatics domain, carried out in collaboration between a university team, a partnering corporation, and a hemodialysis clinic. The research has two objectives: (1) the development of an Internet of Things (IoT)-based information system customized for the hemodialysis machines at the clinic, including transmission bridges, a dedicated web/app for clinical personnel, and a backend server; the system has been deployed at the clinic and is now officially operational; (2) the use of de-identified, anonymous data (collected by the officially operational system) to train, evaluate, and compare deep learning-based intradialytic blood pressure (BP)/pulse rate (PR) predictive models, with subsequent suggestions provided. Both objectives were executed under the supervision of the Institutional Review Board (IRB) at Mackay Memorial Hospital in Taiwan. The system completed for objective one introduced three significant services to the clinic: automated hemodialysis data collection, digitized data storage, and an information-rich human-machine interface with graphical data displays, replacing traditional paper-based clinical administrative operations and thereby enhancing healthcare efficiency. The graphical data presented through the web and app interfaces aids real-time, intuitive comprehension of patients' conditions during hemodialysis. Moreover, the data stored in the backend database is available for physicians to conduct relevant analyses, unearth insights into medical practices, and provide precise medical care for individual patients.
The training and evaluation of the predictive models for objective two, along with related comparisons, analyses, and recommendations, suggest that in situations with limited computational resources and data, an Artificial Neural Network (ANN) model with six hidden layers, SELU activation function, and a focus on artery-related features can be employed for hourly intradialytic BP/PR prediction tasks. It is believed that this contributes to the collaborating clinic and relevant research communities.
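The SELU activation recommended above is the self-normalizing variant of the exponential linear unit: with fixed constants lambda and alpha it pushes layer activations toward zero mean and unit variance. A direct sketch of the function itself (standard constants; independent of the paper's model):

```python
import math

# Standard SELU constants
LAMBDA = 1.0507009873554805
ALPHA = 1.6732632423543772

def selu(x):
    """Scaled Exponential Linear Unit:
    lambda * x                      for x > 0
    lambda * alpha * (exp(x) - 1)   otherwise"""
    return LAMBDA * x if x > 0 else LAMBDA * ALPHA * (math.exp(x) - 1.0)

print(round(selu(1.0), 4))   # -> 1.0507
print(round(selu(0.0), 4))   # -> 0.0
print(round(selu(-1.0), 4))  # -> -1.1113
```

Unlike ReLU, SELU is negative and bounded below for negative inputs, which is what allows deep fully connected stacks such as the six-hidden-layer ANN described here to keep activations normalized without explicit batch normalization.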
Article
Open Access April 10, 2025

Advancements in Pharmaceutical IT: Transforming the Industry with ERP Systems

Abstract
The pharmaceutical industry is undergoing a profound transformation driven by advancements in Information Technology (IT), with Enterprise Resource Planning (ERP) systems playing a pivotal role in reshaping operations. These systems offer integrated solutions that streamline key business processes, such as production, inventory management, supply chain optimization, regulatory compliance, and data integration, contributing significantly to operational efficiency and organizational agility. This paper explores the evolution and impact of ERP systems within the pharmaceutical sector, highlighting their contributions to overcoming the industry’s inherent challenges, including complex regulatory requirements, the need for accurate and real-time data, and the demand for supply chain resilience. The integration of cloud-based ERP solutions, the incorporation of emerging technologies like Artificial Intelligence (AI), Machine Learning (ML), and the Internet of Things (IoT), and enhanced data analytics capabilities have revolutionized pharmaceutical IT. These advancements not only reduce operational costs, improve forecasting accuracy, and enhance collaboration but also ensure compliance with stringent global regulations, such as Good Manufacturing Practices (GMP) and FDA guidelines. Moreover, ERP systems have been instrumental in managing the pharmaceutical supply chain, ensuring product traceability, and improving inventory control and order fulfillment processes. This manuscript examines how ERP systems enable pharmaceutical companies to maintain high standards of product quality, improve decision-making, and ensure the safety and efficacy of drugs through robust tracking and auditing mechanisms. A case study of a pharmaceutical company that implemented an ERP system demonstrates the tangible benefits, including increased operational efficiency, improved compliance rates, and enhanced customer satisfaction. 
However, despite the clear advantages, challenges such as customization complexities, data integration issues, and resistance to change remain. As the pharmaceutical industry continues to evolve, ERP systems will remain a cornerstone of digital transformation, facilitating smarter decision-making, better resource management, and enhanced collaboration across global operations. This paper also identifies future trends, including the potential of AI and blockchain technologies in further strengthening ERP systems and transforming the pharmaceutical landscape.
Review Article
Open Access March 22, 2025

Enhancing Scalability and Performance in Analytics Data Acquisition through Spark Parallelism

Abstract
Data acquisition serves as a critical component of modern data architecture, with REST API integration emerging as one of the most common approaches for sourcing external data. This study evaluates the efficiency of various methodologies for collecting data via REST APIs and benchmarks their performance. It explores how the Spark distributed computing platform can optimize large-scale REST API calls, enabling enhanced scalability and improved processing speeds to meet the demands of high-volume data workflows.
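Spark itself is not reproduced here, but the core idea of fanning a large list of REST calls out across workers instead of issuing them sequentially can be sketched with Python's thread pool. The fetch function below is a stand-in for a real HTTP client, and the URLs are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    """Stand-in for an HTTP GET (a real version would use urllib or requests)."""
    return {"url": url, "status": 200}

def fetch_all(urls, max_workers=8):
    # Each worker pulls the next URL, so a slow endpoint does not block the
    # rest; pool.map() still returns results in input order.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return list(pool.map(fetch, urls))

urls = [f"https://api.example.com/items?page={i}" for i in range(1, 6)]
results = fetch_all(urls)
print(len(results))          # -> 5
print(results[0]["status"])  # -> 200
```

In Spark the same pattern scales beyond one machine: the URL list is distributed as an RDD or DataFrame and the call is mapped over partitions, so each executor handles its own slice of the API workload.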
Review Article
Open Access February 26, 2025

Innovations and Challenges in Pharmaceutical Supply Chain, Serialization and Regulatory Landscape

Abstract
The pharmaceutical supply chain has become increasingly complex and vulnerable to various risks, including counterfeit drugs, diversion, and fraud. As these challenges threaten patient safety and the integrity of global healthcare systems, serialization has emerged as a pivotal innovation in pharmaceutical logistics and regulatory compliance. Serialization involves assigning unique identifiers to individual drug packages, enabling precise tracking and authentication at every stage of the supply chain. This process provides unprecedented transparency, enhances product security, and facilitates real-time monitoring of pharmaceutical products as they move from manufacturers to end consumers. Despite its potential to revolutionize pharmaceutical traceability, the integration of serialization technologies faces numerous obstacles. These include high implementation costs, regulatory inconsistencies across regions, and the technological challenges of managing vast amounts of data. Moreover, the complex, multi-tiered nature of the global supply chain introduces additional risks related to data integrity, cybersecurity, and interoperability between systems. As pharmaceutical companies seek to navigate these challenges, innovations in serialization technology—such as blockchain, artificial intelligence (AI), the Internet of Things (IoT), and radio frequency identification (RFID)—are providing promising solutions to enhance efficiency, reduce fraud, and increase visibility. This manuscript explores both the innovative advancements and the key challenges associated with the integration of serialization in the pharmaceutical supply chain. It delves into the evolving regulatory landscape, highlighting the need for global harmonization of serialization standards, and examines the impact of serialization on securing pharmaceutical distribution networks. 
Additionally, the paper emphasizes the importance of collaboration among manufacturers, technology providers, and regulatory bodies in overcoming implementation barriers and realizing the full potential of serialization. As the pharmaceutical industry moves towards a more interconnected and data-driven future, serialization promises to play a central role in shaping the next generation of drug safety and supply chain management. By addressing the hurdles to adoption and leveraging emerging technologies, the pharmaceutical sector can create a more secure, transparent, and efficient supply chain that better serves public health and fosters greater trust among consumers and healthcare professionals alike.
Review Article
Open Access February 09, 2025

The Future of Longevity Medicine from the Lens of Digital Therapeutics

Abstract
Digital therapeutics (DTx) are emerging as a pivotal tool in promoting longevity by addressing non-communicable diseases (NCDs) such as diabetes, cardiovascular diseases, and mental health disorders. These software-driven interventions offer personalized, evidence-based treatments that can be accessed via digital devices, making healthcare more accessible and scalable. One of the key advancements in DTx is the integration of artificial intelligence (AI) and machine learning (ML) to tailor interventions based on individual health data. This personalization enhances the effectiveness of treatments and supports preventive care by identifying risk factors early. The need for digital therapeutics is underscored by the rising prevalence of NCDs, which are responsible for a significant portion of global mortality and healthcare costs. Traditional healthcare systems often struggle to provide timely and personalized care, especially in low-resource settings. DTx can bridge this gap by offering cost-effective solutions that are easily scalable. Moreover, digital therapeutics can address health inequities by providing low-cost interventions to underserved populations, thereby reducing the burden of NCDs and improving overall health outcomes. As technology continues to evolve, the potential for DTx to enhance longevity and quality of life becomes increasingly promising. Recent advancements in longevity medicine and technology have focused on extending both lifespan and healthspan, ensuring that people not only live longer but also maintain good health throughout their extended years. This review article highlights these advancements that are contributing to this compelling subject of Longevity.
Review Article
Open Access January 22, 2025

Tech Transformations: Modern Solutions for Obstructive Sleep Apnea

Abstract
Recent advancements in the screening, diagnosis, and management of obstructive sleep apnea (OSA) have significantly improved patient outcomes. For screening, the use of home sleep apnea testing (HSAT) has become more prevalent, offering a convenient and cost-effective alternative to traditional in-lab polysomnography. HSAT devices have shown good specificity and sensitivity, particularly in patients with a high pre-test probability of OSA. In terms of diagnosis, advancements in wearable technology and mobile health applications have enabled continuous monitoring of sleep patterns and respiratory parameters. These tools provide valuable data that can be used to identify OSA more accurately and promptly. Additionally, machine learning algorithms are being integrated into diagnostic processes to enhance the accuracy of OSA detection by analyzing large datasets and identifying patterns indicative of the condition. Management of OSA has also seen significant progress. Continuous positive airway pressure (CPAP) therapy remains the gold standard, but new developments include auto-adjusting CPAP devices that optimize pressure settings based on real-time feedback. Mandibular advancement devices and hypoglossal nerve stimulation are emerging as effective alternatives for patients who are CPAP-intolerant. Furthermore, lifestyle interventions such as weight management, positional therapy, and exercise have been shown to complement medical treatments, leading to better overall outcomes. This review article highlights these advancements that collectively contribute to improved patient adherence, reduced symptoms, and enhanced quality of life for individuals with OSA.
Review Article
Open Access November 16, 2024

Digital Therapeutics: A New Dimension to Diabetes Mellitus Management

Abstract
Digital therapeutics (DTx) play a transformative role in diabetes management by leveraging technology to provide personalized, data-driven medical interventions. These tools enhance self-management by offering continuous monitoring and real-time feedback on glucose levels, diet, and physical activity. This personalized approach helps patients adhere to treatment plans and make informed lifestyle changes, leading to improved clinical outcomes such as reduced HbA1c levels and better overall diabetes control. The importance of DTx lies in their ability to make diabetes care more accessible and convenient. Mobile apps and telemedicine platforms enable patients to receive support and guidance from anywhere, reducing the need for frequent in-person visits. Additionally, DTx often include behavioral support features like reminders, educational content, and motivational tools, which are crucial for maintaining healthy habits and managing stress. Currently, the dynamics of DTx in diabetes are rapidly evolving, with increasing integration of artificial intelligence and machine learning to further personalize and optimize care. As the adoption of these technologies grows, they hold the potential to significantly improve patient outcomes and revolutionize diabetes management on a global scale. This article will focus on the benefits of novel digital therapeutics for prevention and management of type II diabetes that are currently available in the market.
Article
Open Access July 21, 2024

Securing Pharmaceutical Supply chain to Combat Active Pharmaceutical Ingredient Counterfeiting

Abstract
Pharmaceutical product serialization aims to assign distinct serial numbers to items within a pharmaceutical supply chain. However, this process faces several security challenges. Valid serial numbers may be stolen and used to label counterfeit products, so it is essential that the uniqueness of a serial number can be verified at any point in the product's lifecycle within the supply chain. Malicious nodes along the distribution network could corrupt planned changes of custody for products, so compliance with these changes must be verifiable. Manufacturers and consumers also need assurance that perishable goods with expired shelf lives are appropriately discarded. In this paper, we review a product serialization method leveraging blockchain technology to address these security concerns within a multi-party perishable goods supply chain. Blockchains offer potential solutions by providing a secure platform for data sharing in multi-party environments, enhancing security and transparency. Within the blockchain, each distribution partner is registered to uphold transparency regarding drug information. The system facilitates real-time transfer of ownership changes, recording them as blocks with date and time stamps; this ensures visibility to all partners in real time and maintains the authenticity of drugs. This article outlines how blockchain technology benefits the pharmaceutical industry by enhancing the traceability and trackability of drugs throughout the entire pharmaceutical supply chain.
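The block-per-custody-change idea can be illustrated with a minimal hash chain: each record embeds the hash of its predecessor, so any later tampering breaks verification. This is a simplified single-process sketch with hypothetical serial numbers, not a distributed blockchain with consensus.

```python
import hashlib
import json

def make_block(prev_hash, record):
    """Append-only block: the custody record plus the previous block's hash."""
    body = json.dumps({"prev": prev_hash, "record": record}, sort_keys=True)
    return {"prev": prev_hash, "record": record,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def verify(chain):
    # Recompute every hash and check each block points at its predecessor.
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        body = json.dumps({"prev": block["prev"], "record": block["record"]},
                          sort_keys=True)
        if hashlib.sha256(body.encode()).hexdigest() != block["hash"]:
            return False
        prev = block["hash"]
    return True

# Record a hypothetical package's chain of custody
chain = []
prev = "0" * 64
for record in [{"serial": "SN-001", "holder": "manufacturer"},
               {"serial": "SN-001", "holder": "distributor"},
               {"serial": "SN-001", "holder": "pharmacy"}]:
    block = make_block(prev, record)
    chain.append(block)
    prev = block["hash"]

print(verify(chain))                       # -> True
chain[1]["record"]["holder"] = "attacker"  # tamper with a custody record
print(verify(chain))                       # -> False
```

Real deployments add timestamps, digital signatures from each registered partner, and replication across nodes; the hash-linking shown here is the mechanism that makes a recorded change of custody tamper-evident.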
Review Article
Open Access June 28, 2024

Nigeria Exchange Rate Volatility: A Comparative Study of Recurrent Neural Network LSTM and Exponential Generalized Autoregressive Conditional Heteroskedasticity Models

Abstract
Business merchants and investors in Nigeria are interested in the accuracy of foreign exchange volatility forecasts because they need information on how volatile the exchange rate will be in the future. In this paper, we compared an Exponential Generalized Autoregressive Conditional Heteroskedasticity model of order p = 1 and q = 1 (EGARCH(1,1)) with a Recurrent Neural Network (RNN) based on long short-term memory (LSTM), configured with a combination of p = 10 and q = 1 layers, to model the volatility of Nigerian exchange rates. Our goal is to determine the preferred model for predicting the volatility of Nigeria's Naira exchange rate against the Euro, Pound Sterling, and US Dollar. The dataset of monthly exchange rates of the Nigerian Naira to the US Dollar, Euro, and Pound Sterling for the period December 2001 to August 2023 was extracted from the Central Bank of Nigeria Statistical Bulletin. Model efficiency and performance were measured with the Mean Squared Error (MSE) criterion. The results indicated that Nigeria's exchange rate volatility is asymmetric, with leverage effects evident in the results of the EGARCH(1,1) model. We also observed a steady increase in the Naira exchange rate against the Euro, Pound Sterling, and US Dollar from 2016 to its peak in 2023. The comparative analysis indicated that EGARCH(1,1) performed better than the LSTM model, providing smaller MSE values of 224.7, 231.3, and 138.5 for the Euro, Pound Sterling, and US Dollar, respectively.
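The EGARCH(1,1) recursion the abstract fits can be sketched directly. The parameter values below are illustrative placeholders, not the paper's estimates, and the simulated returns stand in for the actual Naira series.

```python
import math
import random

def egarch_volatility(returns, omega=-0.1, alpha=0.2, gamma=-0.05, beta=0.95):
    """EGARCH(1,1) conditional-variance recursion (illustrative parameters):

        ln(sigma2_t) = omega + beta*ln(sigma2_{t-1})
                     + alpha*(|z_{t-1}| - sqrt(2/pi)) + gamma*z_{t-1}

    where z = r/sigma. The gamma term captures the asymmetry (leverage
    effect) that the abstract reports for the Nigerian exchange rate."""
    expected_abs_z = math.sqrt(2.0 / math.pi)  # E|z| for standard normal z
    log_var = omega / (1.0 - beta)             # start at the unconditional level
    variances = []
    for r in returns:
        variances.append(math.exp(log_var))
        z = r / math.sqrt(variances[-1])       # standardized residual
        log_var = (omega + beta * log_var
                   + alpha * (abs(z) - expected_abs_z) + gamma * z)
    return variances

random.seed(0)
simulated_returns = [random.gauss(0.0, 1.5) for _ in range(24)]  # monthly returns
vols = egarch_volatility(simulated_returns)
print(all(v > 0 for v in vols))  # True: the log formulation keeps variance positive
```

Modeling the log of the variance is what lets EGARCH capture asymmetric (leverage) effects without any positivity constraints on the coefficients.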
Article
Open Access May 14, 2024

A Review of Reliability Techniques for the Evaluation of Programmable Logic Controllers

Abstract
PLCs, or programmable logic controllers, are essential parts of contemporary industrial automation systems, responsible for controlling and monitoring a variety of operations. PLC reliability is critical to maintaining the continuous and secure operation of industrial systems. A wide range of reliability strategies has been used to improve the reliability of programmable logic controllers, and this article methodically reviews them. After thoroughly reviewing the body of literature, the evaluation classified PLC reliability techniques into Root Cause Analysis (RCA), Reliability Centered Maintenance (RCM), Hazard Analysis (HA), Reliability Block Diagram (RBD), Fault Tree Analysis (FTA), Physics of Failure (PoF), and FMEA/FMECA. The proportion of reviewed papers using each technique showed that RCA, which appears in 20% of the publications reviewed, has been used the most to increase the reliability of PLCs, followed by HA, RCM, FMEA/FMECA, RBD, FTA, and PoF, which account for 17%, 16%, 16%, 13%, 10%, and 8% of the articles reviewed, respectively. The paper also discusses new developments and trends in PLC reliability, such as the application of machine learning (ML) and artificial intelligence (AI) to fault detection and predictive maintenance.
Review Article
Open Access April 16, 2024

Revolutionizing Automotive Supply Chain: Enhancing Inventory Management with AI and Machine Learning

Abstract
Consumer behavior is evolving, demanding a wide range of products with fast shipping and reliable service. The automotive aftermarket industry, worth billions, requires efficient distribution systems to stay competitive. Manufacturers strive to balance growth with product and service excellence. Distributors and retailers face the challenge of maintaining competitive pricing while keeping inventory levels low. An adequate supply chain and accurate product data are crucial for product availability and reducing stock issues. This ultimately increases profits and customer satisfaction.
Article
Open Access November 15, 2023

Predictive Failure Analytics in Critical Automotive Applications: Enhancing Reliability and Safety through Advanced AI Techniques

Abstract
Failure prediction can be achieved through prognostics, which provides timely warnings before failure. Failure prediction is crucial in an effective prognostic system, allowing preventive maintenance actions to avoid downtime. The prognostics problem involves estimating the remaining useful life (RUL) of a system or component at any given time. The RUL is defined as the time from the current time to the time of failure. The goal is to make accurate predictions close to the failure time to provide early warnings. J. S. Grewal and J. Grewal provide a comprehensive definition of RUL in their paper "The Kalman Filter approach to RUL estimation." There, a process is a quadruple (X, U, f, P), where X is the state space, U is the control space, P is the set of possible paths, and f represents the transition between states. The process involves applying control values to change the system's state over time.
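The RUL definition above can be illustrated with a deliberately naive sketch: fit a line to a degradation indicator and extrapolate to a failure threshold. This is a generic illustration, not the Kalman-filter formulation cited in the abstract; the health indicator, threshold, and cycle units are hypothetical.

```python
def estimate_rul(health_history, failure_threshold):
    """Naive RUL estimate: fit a straight line to the degradation history and
    extrapolate to the failure threshold. RUL = predicted failure time - now."""
    n = len(health_history)
    times = range(n)
    mean_t = sum(times) / n
    mean_h = sum(health_history) / n
    slope = (sum((t - mean_t) * (h - mean_h)
                 for t, h in zip(times, health_history))
             / sum((t - mean_t) ** 2 for t in times))
    if slope <= 0:
        return float("inf")  # no degradation trend observed yet
    current = health_history[-1]
    return max(0.0, (failure_threshold - current) / slope)

# Indicator rising ~2 units per cycle toward a failure threshold of 100
history = [10, 12, 14, 16, 18, 20]
print(estimate_rul(history, 100))  # 40.0 cycles remaining
```

Real prognostic systems replace the straight line with a filtered state estimate and report an uncertainty band around the RUL rather than a point value.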
Article
Open Access April 11, 2024

5V’s of Big Data Shifted to Suit the Context of Software Code: Big Code for Big Software Projects

Abstract
Data is the collection of facts and observations about events, and it is continuously growing, becoming denser and more varied by the minute across different disciplines and fields. Hence, Big Data emerged and is evolving rapidly. The various types of data being processed are huge, but little attention has been paid to where this data resides: much of it resides in software, and software codebases are growing accordingly in the size of their modules, functionalities, classes, and so on. Since data is growing so rapidly, the codebases of software are growing as well. This paper therefore discusses the 5V's of big data in the context of software code and how to optimize and manage big code. When we speak of "Big Code for Big Software Projects," we are referring to the specific challenges and considerations involved in developing, managing, and maintaining code in large-scale software systems.
Article
Open Access March 06, 2024

Embedded Architecture of SAP S/4 HANA ERP Application

Abstract
The SAP HANA application is designed to handle transactionally consistent operational workloads while also supporting intricate business analytics operations. Technically speaking, the SAP HANA database is made up of several data processing engines that work together with a distributed query processing environment to provide the entire range of data processing capabilities. This includes graph and text processing for managing semi-structured and unstructured data within the same system, as well as classical relational data that supports both row- and column-oriented physical representations in a hybrid engine. SAP S/4HANA is the next-generation SAP Business Suite, designed specifically for the SAP HANA platform. The key features of SAP S/4HANA are an intuitive, contemporary user interface (SAP Fiori); planning and simulation options in many conventional transactions; simplification of business processes; significantly improved transaction efficiency; and faster analytics.
Review Article
Open Access February 19, 2024

The use of contemporary Enterprise Resource Planning (ERP) technologies for digital transformation

Abstract
Our lives are becoming more and more digital, and this has an impact on how we work, study, communicate, and interact. Businesses are currently digitally altering their information systems, procedures, culture, and strategy. Existing businesses and economies are severely disrupted by the digital revolution. The Internet of Things, microservices, and mobile services are examples of IT systems with numerous, dispersed, and very small structures that are made possible by digitization. Utilizing the possibilities of cloud computing, mobile systems, big data and analytics, services computing, Internet of Things, collaborative networks, and decision support, numerous new business prospects have emerged throughout the years. The logical basis for robust and self-optimizing run-time environments for intelligent business services and adaptable distributed information systems with service-oriented enterprise architectures comes from biological metaphors of living, dynamic ecosystems. This has a significant effect on how digital services and products are designed from a value- and service-oriented perspective. The evolution of enterprise architectures and the shift from a closed-world modeling environment to a more flexible open-world composition establish the dynamic framework for highly distributed and adaptive systems, which are crucial for enabling the digital transformation. This study examines how enterprise architecture has changed over time, taking into account newly established, value-based relationships between digital business models, digital strategies, and enhanced enterprise architecture.
Review Article
Open Access February 18, 2024

An Appraisal of Challenges in Developing Information Literacy Skills in the Colleges of Education of Ghana

Abstract
The purpose of this study was to examine the challenges faced by students of Colleges of Education (CoEs) in developing their information literacy (IL) skills. The study adopted the post-positivism paradigm and a descriptive survey research design. The population comprised all Level 200 students at Wiawso CoE, Enchi CoE, and Bia Lamplighter CoE in the Western North Region. Purposive, stratified, and convenience sampling techniques were used to select the colleges of education and the Level 200 students: the three colleges were stratified and purposively selected, while 256 Level 200 students were stratified and conveniently sampled. The study employed questionnaires (open- and closed-ended questions) to collect data from the sampled students, focusing on the challenges they face in developing their IL skills. The quantitative data was captured, analysed, and presented in descriptive statistics, such as percentages and frequency tables, to address the objective of the study. It is recommended that, to improve digital literacy and academic pursuits, college management improve access to desktop computers and the Internet in the library and computer centre. It is also recommended that the management and librarians of the Colleges of Education ensure that students have access to these devices at the library and can use them to develop their IL skills and manage their references more effectively.
Article
Open Access February 17, 2024

Universal Evaluation of SAP S/4 Hana ERP Cloud System

Abstract
Regardless of their traditional ERP systems, it is essential for every business to acquire a universal advantage in the contemporary international market. All things considered, end users in such businesses often have to deal with poorly designed interfaces and unusable technologies. Despite claims of significant benefits from using S/4 Hana cloud ERP software, the potential for maximum productivity is not fully realized. One cause of this is the underfunding of ergonomic measures and the newest technologies. Through the design of S/4 Hana cloud ERP software applications, we demonstrate how important and highly recommended ergonomic research is in order to minimize the financial and human costs that enterprises currently face.
Review Article
Open Access February 15, 2024

Stock Closing Price and Trend Prediction with LSTM-RNN

Abstract
The stock market is very volatile and hard to predict accurately due to the uncertainties affecting stock prices. Nevertheless, investors and stock traders can benefit from predictive models by making informed decisions about buying, holding, or investing in stocks, and financial institutions can use such models to manage risk and optimize their customers' investment portfolios. In this paper, we use a Long Short-Term Memory (LSTM) Recurrent Neural Network (RNN) to predict the daily closing price of Amazon Inc. stock (ticker symbol: AMZN). We study the influence of various hyperparameters to see which factors affect the predictive power of the model. The root mean squared error (RMSE) on the training set was 2.51, with a mean absolute percentage error (MAPE) of 1.84%.
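A preparatory step implied by any LSTM closing-price model is turning the price series into supervised (window, next-value) pairs. The sketch below shows that windowing step only; the prices and lookback length are made up, and the paper's actual preprocessing may differ.

```python
def make_windows(prices, lookback):
    """Turn a closing-price series into (input window, next-day target) pairs,
    the supervised format an LSTM is typically trained on."""
    samples = []
    for i in range(len(prices) - lookback):
        window = prices[i:i + lookback]  # last `lookback` closing prices
        target = prices[i + lookback]    # next day's close to predict
        samples.append((window, target))
    return samples

closes = [120.0, 121.5, 119.8, 122.3, 123.1, 124.0]
pairs = make_windows(closes, lookback=3)
print(len(pairs))   # 3 training samples
print(pairs[0])     # ([120.0, 121.5, 119.8], 122.3)
```

The lookback length is itself one of the hyperparameters whose influence the abstract says was studied: a longer window gives the LSTM more history per sample but fewer samples overall.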
Article
Open Access December 06, 2023

Success Factors of Adopting Cloud Enterprise Resource Planning

Abstract
The technologies for cloud ERP (Enterprise Resource Planning) have revolutionized the field of information technologies. Any kind of business can benefit from their flexibility, affordability, scalability, adaptation, availability, and customizable data. An advancement of classic ERP, cloud enterprise resource planning (C-ERP) provides the benefits of cloud computing (CC), including resource elasticity and ease of use. The rise of cloud computing affects on-premise ERP systems in terms of architecture and cost. Cloud-based ERP systems make the claim to be appropriate for digital corporate settings. System quality, security, vendor lock-in, and data accessibility are recognized as the technological issues. Industry 4.0 refers to the re-engineering and revitalization of modern factories through the integration of cloud-based operations, industrial internet connectivity, additive manufacturing, and cybersecurity platforms. One of the four main pillars of Industry 4.0, cloud-based Enterprise Resource Planning (Cloud ERP), is a component of cloud operations that aids in achieving greater standards of sustainable performance.
Review Article
Open Access December 03, 2023

Evolution of Enterprise Applications through Emerging Technologies

Abstract
The extensive globalization of services and rapid technological advancements driven by IT have heightened the competitiveness of organizations in introducing innovative products and services. Among the noteworthy innovations is enterprise resource planning (ERP). An integral field in computer science, known as artificial intelligence (AI), is undergoing a transformative integration into various industries. Grasping the concept of artificial intelligence and its application in diverse business applications is crucial, given its broad and intricate nature. The primary focus of this paper is to delve into the realm of artificial intelligence and its utilization within enterprise resource planning. The study not only explores artificial intelligence but also delves into related concepts such as machine learning, deep learning, and neural networks in greater detail. Drawing upon existing literature, this research examines various books and online resources discussing the intersection of artificial intelligence and ERP. The findings reveal that the impact of AI is evident as businesses attain heightened levels of analytical efficiency across different ERP domains, thanks to remarkable advancements in AI, machine learning, and deep learning. Artificial intelligence is extensively employed in numerous ERP areas, with a particular emphasis on customer support, predictive analysis, operational planning, and sales projections.
Review Article
Open Access February 23, 2023

Substituting Intelligence

Abstract
The development of ChatGPT is a topical subject of reflection. This short paper focuses on the (possible) use of ChatGPT in academia and some of its (possible) ramifications for users’ cognitive abilities and, dramatically put, their existence.
Communication
Open Access December 14, 2022

Applying Artificial Intelligence (AI) for Mitigation Climate Change Consequences of the Natural Disasters

Abstract
Climate change and weather-related disasters have accelerated rapidly in recent decades, bringing insecurity, destruction of ecological systems, increasing poverty, human victims, and economic losses everywhere on the planet. The innovative methods applied to mitigate the magnitude of natural disasters and to combat their negative impact effectively include constant remote and ground-based monitoring, data collection, the creation of models for big data extrapolation, prediction, timely warnings for prevention, and others. Artificial intelligence (AI) is used to deal with big data; to perform calculations, forecasts, and predictions of natural disasters in the near future; to establish possibilities for escaping hazards or risky situations; to prepare people for adverse changes; and to lay out the different choices so that the right decision can be taken. Many projects, programs, and frameworks have been adopted and carried out by governments and businesses in pursuit of common goals: forming a friendly environment and taking measures to reduce undesired climate alterations and cataclysms. The aim of this article is to review the latest programs and innovations applied in the mitigation of climate change using AI.
Brief Review
Open Access November 04, 2022

An Artificial Intelligence Approach to Manage Crop Water Requirements in South Africa

Abstract
Estimation of crop water requirements is of paramount importance for the management of agricultural water resources, a major mitigating strategy against the effects of climate change on food security. South Africa's water shortage poses a threat to agricultural efficiency. Since irrigation uses about 60% of the available fresh water, it is important to optimise the use of irrigation water in order to maximize crop yield at the farm level and avoid wastage. In this study, the combined application of an artificial neural network (ANN) and a crop-growth simulation model was investigated for estimating crop irrigation water requirements and scheduling the irrigation of potatoes at the Winterton irrigation scheme, South Africa. The crop water demand from planting to harvest date, when to irrigate, the optimum stage in the drying cycle at which to apply water, and the amount of irrigation water to be applied each time were estimated. Five feed-forward back-propagation artificial neural network predictive models with varied numbers of neurons and hidden layers were developed and evaluated. The optimal ANN model, with 5 inputs, 5 neurons, 1 hidden layer, and 1 output, was used to predict monthly reference evapotranspiration (ETo) in the Winterton area. It produced a root mean square error (RMSE) of 0.67, a Pearson correlation coefficient (r) of 0.97, and a coefficient of determination (R2) of 0.94. Validation of the model between the measured and predicted ETo gave an r value of 0.9048. The predicted ETo was one of the input variables to a crop growth simulation model called CROPWAT. The results indicated a total crop water requirement of 1259.2 mm/decade and a net irrigation water requirement of 1276.9 mm/decade, spread over a 5-day irrigation interval during the entire 140-day cropping season for potatoes.
The combination of artificial neural networks and crop growth simulation models has proved to be a robust technique for estimating crop irrigation water requirements in the face of limited or no daily meteorological datasets.
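The 5-input, 5-neuron, 1-hidden-layer, 1-output topology described above can be sketched as a single forward pass. The weights here are random placeholders rather than the trained model, and the named input features are assumptions (the abstract does not list them).

```python
import math
import random

def forward(inputs, w_hidden, b_hidden, w_out, b_out):
    """One forward pass of a 5-input, one-hidden-layer (5 neurons), 1-output
    feed-forward network, mirroring the optimal topology in the abstract."""
    hidden = []
    for weights, bias in zip(w_hidden, b_hidden):
        s = sum(w * x for w, x in zip(weights, inputs)) + bias
        hidden.append(1.0 / (1.0 + math.exp(-s)))  # sigmoid activation
    # Linear output unit producing the ETo estimate
    return sum(w * h for w, h in zip(w_out, hidden)) + b_out

random.seed(1)
w_hidden = [[random.uniform(-1, 1) for _ in range(5)] for _ in range(5)]
b_hidden = [random.uniform(-1, 1) for _ in range(5)]
w_out = [random.uniform(-1, 1) for _ in range(5)]
b_out = random.uniform(-1, 1)

# Hypothetical scaled monthly inputs, e.g. temperature, humidity, wind,
# sunshine hours, and solar radiation
features = [0.6, 0.4, 0.2, 0.8, 0.5]
eto = forward(features, w_hidden, b_hidden, w_out, b_out)
print(isinstance(eto, float))  # True
```

Back-propagation would adjust `w_hidden`, `b_hidden`, `w_out`, and `b_out` against measured ETo values; only the trained pass is shown here.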
Article
Open Access July 10, 2022

Digital Therapeutics in Oncology: A Better Outlook for Cancer Patients in the Future

Abstract
Digital therapeutics (DTx) is an evidence-based treatment that makes use of high-quality software. As many healthcare systems confront increasing expectations for quality results, the need for digital medicines is steadily growing in the clinical arena. To ensure that patients are supported during chemotherapy and that needless hospital visits are avoided, digital therapeutics must be integrated into the cancer care pathway. Oncology patients are usually immunocompromised due to their disease and treatment, rendering them more susceptible to infection than the general population. As a result, visiting a hospital might endanger their health. In addition, when cancer patients and survivors return home after treatment, digital health interventions provide them with the tools they need to manage their illness and its side effects in the privacy of their own homes. Considering the increasing prevalence of cancer and the solutions that digital therapeutics has to offer in oncology, its future looks promising. This review article aims to summarize the existing companies in this domain while evaluating their prospects.
Review Article
Open Access April 28, 2022

Analysis of Network Modeling for Real-world Recommender Systems

Abstract
Nowadays, recommendation systems exist everywhere in the online world: people are presented not only with the physical products they need, but also with songs, places, books, friends, movies, and many other items. Most such systems are built on basic collaborative and hybrid filtering, in which users are recommended items based on the preferences of similar users, identified by applying machine intelligence strategies. In this research, the importance of network modeling is analyzed in solving real-world recommendation problems.
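The collaborative-filtering idea mentioned above can be sketched in a few lines: score a user's unseen items by the ratings of similar users, weighted by cosine similarity. The item names and rating matrix are invented for illustration; this is the generic technique, not the paper's network-modeling approach.

```python
import math

def cosine(u, v):
    """Cosine similarity between two rating vectors (0 = no rating)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

def recommend(target, others, items):
    """User-based collaborative filtering: score items the target user has not
    rated by other users' ratings, weighted by similarity to the target."""
    scores = {}
    for other in others:
        sim = cosine(target, other)
        for i, rating in enumerate(other):
            if target[i] == 0 and rating > 0:  # only unrated items
                scores[items[i]] = scores.get(items[i], 0.0) + sim * rating
    return max(scores, key=scores.get) if scores else None

items = ["song", "book", "movie", "place"]
alice = [5, 3, 0, 0]                    # has not rated "movie" or "place"
others = [[5, 3, 4, 1], [4, 2, 5, 0]]   # two users with similar tastes
print(recommend(alice, others, items))  # movie
```

Network modeling generalizes this: users and items become nodes in a graph, and recommendations follow from the structure of the connections rather than from pairwise vector similarity alone.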
Article
Open Access December 27, 2021

A Comparative Study for Recommended Triage Accuracy of AI Based Triage System MayaMD with Indian HCPs

Abstract
Artificial intelligence (AI) based triage and diagnostic systems are increasingly being used in healthcare. Although these online tools can improve patient care, their reliability and accuracy remain variable. We hypothesized that an AI-powered triage and diagnostic system (MayaMD) would compare favorably with human doctors with respect to triage and diagnostic accuracy. We performed a prospective validation study of the accuracy and safety of the system. Identical cases were evaluated by the AI system and by individual Indian healthcare practitioners (HCPs) to compare accuracy and safety, and the same cases were validated against the consensus of an expert panel of 3 doctors. The cases, in the form of clinical vignettes, were provided by an expert medical team. Overall, the study showed that the MayaMD AI-based platform for virtual triage was able to recommend the most appropriate triage while ensuring patient safety. In fact, the accuracy of triage recommendation by MayaMD was significantly better than that of individual HCPs (91.67% vs. 74%, p=0.04), with the expert consensus used as the standard.
Article
Open Access October 19, 2021

A Lightweight Wayfinding Assistance System for IoT Applications

Abstract
In this paper, we propose an indoor sign detection system for Industry 4.0. To implement the proposed system, we designed a lightweight deep learning architecture based on MobileNet that can run on embedded devices and is used to detect and recognize indoor landmark signs, assisting both blind and sighted people during indoor navigation. We apply various operations to minimize the network size as well as the computational complexity. The Internet of Things (IoT) connects the Internet with surrounding objects: it links physical objects to their digital identities and enables them to communicate with each other, creating a bridge between the physical world and the virtual world. The paper provides a comprehensive overview of a new method for recognizing a set of indoor landmark sign objects based on a deep convolutional neural network (DCNN) for Internet of Things applications.
Article
Open Access October 17, 2021

Understanding Traffic Signs by an Intelligent Advanced Driving Assistance System for Smart Vehicles

Abstract
Recent technologies have made life smarter. Vehicles are vital components of daily life that are getting smarter for a safer environment. Advanced Driving Assistance Systems (ADAS) are widely used in today's vehicles. They have been a revolutionary approach to making roads safer by assisting the driver in difficult situations, such as avoiding collisions or respecting road rules. ADAS comprises a large number of sensors and processing units that provide the driver with a complete overview of surrounding objects. In this paper, we introduce a road sign classifier for an ADAS to recognize and understand traffic signs. The classifier is based on deep learning, in particular Convolutional Neural Networks (CNNs). The proposed approach consists of two stages. The first stage is a data preprocessing step that filters and enhances the quality of the input images to reduce processing time and improve recognition accuracy. The second stage is a CNN model with a skip connection that passes semantic features to the top of the network, allowing for better recognition of traffic signs. Experiments demonstrate the performance of the CNN model for traffic sign classification, with a correct recognition rate of 99.75% on the German Traffic Sign Recognition Benchmark (GTSRB) dataset.
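The skip connection described in the second stage can be sketched in a few lines. The toy example below (the shapes, weights, and channel-mixing transforms are illustrative assumptions, not the paper's architecture) shows the key mechanism: the block's input is added back to its output, so earlier features remain available at the top of the block:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    return np.maximum(x, 0.0)

def block_with_skip(x, w1, w2):
    """Two channel-mixing transforms with an additive skip connection.

    The skip path forwards the input feature map unchanged, so lower-level
    features can reach the top of the block alongside the transformed ones.
    """
    h = relu(x @ w1)    # first transform
    h = h @ w2          # second transform
    return relu(h + x)  # skip connection: add the input back

# Toy feature map: 8 spatial positions x 16 channels (illustrative sizes).
x = rng.standard_normal((8, 16))
w1 = rng.standard_normal((16, 16)) * 0.1
w2 = rng.standard_normal((16, 16)) * 0.1
y = block_with_skip(x, w1, w2)
print(y.shape)  # output keeps the input's shape, as the addition requires
```

Note that the addition forces the block's output shape to match its input shape, which is why real architectures insert projection layers when channel counts differ.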
Article
Open Access September 04, 2021

Active Fault Tolerant Control of Faulty Uncertain Neutral Time-Delay Systems

Abstract This paper investigates the problem of fault tolerant control for a class of uncertain neutral time-delay systems. First, we consider an additive control that adds a corrective term to the nominal law when a fault occurs. This approach is designed in three steps. The first step is fault detection, and the second is fault estimation. For these two steps, [...] Read more.
This paper investigates the problem of fault tolerant control for a class of uncertain neutral time-delay systems. First, we consider an additive control that adds a corrective term to the nominal law when a fault occurs. This approach is designed in three steps. The first step is fault detection, and the second is fault estimation; for these two steps, we use an adaptive observer to guarantee the detection and estimation of the fault. The third step is fault compensation. The Lyapunov method and Linear Matrix Inequality (LMI) techniques are used to establish the main results. Second, we propose a Pseudo Inverse Method (PIM) and determine the error between the closed-loop and the nominal system. Finally, simulation results are presented to validate the theoretical development on an example of an uncertain neutral time-delay system.
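The additive law described above can be written generically. The notation below is an illustrative assumption (the abstract does not give the paper's exact formulation), with \(u_{\mathrm{nom}}\) the nominal control, \(\hat f\) the adaptive-observer fault estimate, \(t_f\) the detection instant, and \(B^{+}\) the pseudo-inverse of the input matrix:

\[
u(t) = u_{\mathrm{nom}}(t) + u_{\mathrm{add}}(t),
\qquad
u_{\mathrm{add}}(t) =
\begin{cases}
0, & t < t_f,\\
-B^{+}\,\hat f(t), & t \ge t_f,
\end{cases}
\]

so that after detection the added term compensates the estimated fault and the closed loop approaches the nominal dynamics.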
Article
Open Access July 23, 2021

Behavioral Economics and Energy Consumption: A Behavioral Data Analysis of the Role of Attitudes and Beliefs in Household Electricity Consumption in Iran

Abstract The average electricity consumption in Iranian households is higher than the world average. This can be due to price factors (such as cheap electricity in the country) and non-price factors (such as socio-demographic variables and psychological factors). In this study, non-price factors such as socio-demographic variables and psychological factors in the electricity consumption of urban households [...] Read more.
The average electricity consumption of Iranian households is higher than the world average. This can be due to price factors (such as cheap electricity in the country) and non-price factors (such as socio-demographic variables and psychological factors). In this study, non-price factors such as socio-demographic variables and psychological factors in the electricity consumption of urban households in Tehran were investigated, drawing on the theoretical foundations of behavioral economics and the theory of planned behavior. Information on household electricity consumption behavior was collected through a questionnaire and fieldwork covering 2560 Tehran households. Using econometric techniques, a linear regression was estimated in which the dependent variable was electricity consumption (45 days in winter 2019) and the independent variables included socio-demographic variables (age, sex, number of household members, income) and the variables of the theory of planned behavior (attitude, subjective norms, and perceived behavioral control). The results showed that income and the number of household members have a significant positive effect on electricity consumption, while gender has no significant effect. Of the psychological variables, only perceived behavioral control has a significant effect on electricity consumption. These results indicate that consumers do not have a positive attitude towards saving, and that subjective and social norms neither encourage them to reduce electricity consumption nor are effective in controlling it. Finally, the study results were analyzed in terms of behavioral biases that may cause attitudes and beliefs not to lead to action.
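The estimation described above can be sketched as follows. The data here are synthetic stand-ins with assumed coefficients, not the survey data; the sketch only shows how a consumption model with an intercept, an income effect, and a household-size effect would be fit by ordinary least squares:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

# Synthetic stand-ins for the survey variables (illustrative only).
income = rng.normal(50, 10, n)    # household income
members = rng.integers(1, 7, n)   # number of household members, 1..6
noise = rng.normal(0, 5, n)
consumption = 20 + 0.8 * income + 15 * members + noise  # kWh over the period

# Design matrix with an intercept column, solved by least squares.
X = np.column_stack([np.ones(n), income, members])
beta, *_ = np.linalg.lstsq(X, consumption, rcond=None)
print(beta.round(2))  # estimated intercept, income effect, household-size effect
```

With enough observations, the estimated slopes recover the assumed positive effects of income and household size, mirroring the sign pattern reported in the study.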
Article
Open Access December 27, 2021

Leveraging AI and ML for Enhanced Efficiency and Innovation in Manufacturing: A Comparative Analysis

Abstract The manufacturing industry has embraced modern technologies such as big data, machine learning, and artificial intelligence. This paper examines AI and machine learning developments in the manufacturing industry, comparing current practices and data-driven projects. It aims to better understand these technologies and their potential benefits and challenges. The research identifies opportunities [...] Read more.
The manufacturing industry has embraced modern technologies such as big data, machine learning, and artificial intelligence. This paper examines AI and machine learning developments in the manufacturing industry, comparing current practices and data-driven projects. It aims to better understand these technologies and their potential benefits and challenges. The research identifies opportunities for innovative business solutions and explores industry practices and research results. The paper focuses on implementation rather than technical aspects, aiming to enhance knowledge in this area.
Review Article
Open Access August 20, 2022

Advancing Predictive Failure Analytics in Automotive Safety: AI-Driven Approaches for School Buses and Commercial Trucks

Abstract The recent evidence on AI in automotive safety shows the potential to reduce crashes and improve efficiency. Studies used AI techniques like machine learning and predictive analytics models to develop predictive collision avoidance systems. The studies collected data from various sources, such as traffic collision data and shapefiles. They utilized deep learning neural networks and 3D [...] Read more.
Recent evidence on AI in automotive safety shows its potential to reduce crashes and improve efficiency. Studies have used AI techniques such as machine learning and predictive analytics to develop predictive collision avoidance systems, collecting data from various sources, such as traffic collision records and shapefiles, and analyzing it with deep learning neural networks and 3D visualization techniques. However, more research is needed on AI in school bus and commercial truck safety. This paper explores the importance of AI-driven predictive failure analytics in enhancing automotive safety for these vehicles. It discusses the challenges, required data, and technologies involved in predictive failure analytics, as well as the potential benefits and implications for the future. The conclusion summarizes the findings and emphasizes the significance of AI in improving driver safety. Overall, this paper contributes to the field of automotive safety and aims to attract more research in this area.
Review Article
Open Access August 29, 2022

From Deterministic to Data-Driven: AI and Machine Learning for Next-Generation Production Line Optimization

Abstract The advancement of modern manufacturing is synonymous with the growth of automation. Automation replaces human operators, improves productivity and quality, and reduces costs. However, the initial financial cost and knowledge requirements can be barriers to embracing automation. Manufacturers are now seeking smart manufacturing, known as the fourth industrial revolution. Smart manufacturing goes [...] Read more.
The advancement of modern manufacturing is synonymous with the growth of automation. Automation replaces human operators, improves productivity and quality, and reduces costs. However, the initial financial cost and knowledge requirements can be barriers to embracing automation. Manufacturers are now seeking smart manufacturing, known as the fourth industrial revolution. Smart manufacturing goes beyond automation and utilizes IoT, AI, and big data for optimized production. In a smart factory, production can be linked and controlled innovatively, leading to increased performance, agility, and reduced costs.
Review Article
Open Access December 27, 2020

Exploring AI Algorithms for Cancer Classification and Prediction Using Electronic Health Records

Abstract Uncontrolled cell division leads to cancer, an often incurable condition. Early diagnosis has the potential to lower death rates from breast cancer, the most common cancer among women worldwide. Imaging studies of the breast may help doctors detect and diagnose the disease. This study explores the effectiveness of DL and ML models in the classification of mammography images for breast cancer [...] Read more.
Uncontrolled cell division leads to cancer, an often incurable condition. Early diagnosis has the potential to lower death rates from breast cancer, the most common cancer among women worldwide. Imaging studies of the breast may help doctors detect and diagnose the disease. This study explores the effectiveness of deep learning (DL) and machine learning (ML) models in the classification of mammography images for breast cancer detection, utilizing the publicly available CBIS-DDSM dataset, which comprises 5,000 images evenly divided between benign and malignant cases. To improve diagnostic accuracy, models including Gaussian Naïve Bayes (GNB), convolutional neural networks (CNNs), k-nearest neighbors (KNN), and MobileNetV2 were assessed using performance measures including F1-score, recall, accuracy, and precision. The methodology involved data preprocessing techniques, including transfer learning and feature extraction, followed by data splitting for robust model training and evaluation. Findings indicate that MobileNetV2 achieved the highest accuracy (99.4%), significantly outperforming GNB (87.2%), CNN (96.7%), and KNN (91.2%). The investigation, which also used confusion matrices and ROC curves to evaluate model performance, demonstrated the outstanding capacity of MobileNetV2 to discriminate between benign and malignant instances.
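The evaluation measures named above are all derived from binary confusion-matrix counts. A minimal sketch (the counts below are illustrative, not taken from the study's confusion matrices):

```python
def metrics(tp, fp, fn, tn):
    """Accuracy, precision, recall, and F1 from binary confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)   # of predicted malignant, how many truly are
    recall = tp / (tp + fn)      # of truly malignant, how many were found
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Illustrative counts for a balanced benign/malignant test set.
acc, prec, rec, f1 = metrics(tp=480, fp=20, fn=10, tn=490)
print(f"accuracy={acc:.3f} precision={prec:.3f} recall={rec:.3f} F1={f1:.3f}")
```

In a screening setting, recall (missing as few malignant cases as possible) usually matters more than raw accuracy, which is why studies report the full set of measures rather than accuracy alone.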
Review Article
Open Access December 27, 2020

An Effective E-Commerce Sales Prediction and Management System Based on Machine Learning Methods

Abstract Due to the influence of the Internet, the e-commerce sector has developed rapidly. Most online retailing businesses are seeking ways to predict demand for their products. Sales forecasting may help retailers develop a sales strategy that enhances sales and attracts more money and investment. The current research puts forward a machine learning framework to forecast e-commerce [...] Read more.
Due to the influence of the Internet, the e-commerce sector has developed rapidly. Most online retailing businesses are seeking ways to predict demand for their products. Sales forecasting may help retailers develop a sales strategy that enhances sales and attracts more money and investment. The current research puts forward a machine learning framework to forecast e-commerce sales for strategic management using a dataset of e-commerce transactions. With 70 percent of the data used for training and 30 percent for testing, three models were produced: Random Forest, Decision Tree, and XGBoost. To evaluate the models, performance measures including R-squared (R²) and Root Mean Squared Error (RMSE) were employed. The XGBoost model was the most accurate in predicting e-commerce sales, with an R² score of 96.3%. This demonstrates the superior capability of the XGBoost algorithm to forecast e-commerce monthly sales compared with the other models, and it can assist decision makers in managing inventory and making smart, quick decisions in this rapidly growing e-commerce market. The findings reiterate the importance of using advanced analytics to drive effectiveness and customer experience within the e-commerce sector.
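The two evaluation measures can be computed directly from predictions. A small sketch with made-up monthly sales figures (not the paper's data) shows the definitions used:

```python
import math

def rmse(y_true, y_pred):
    """Root Mean Squared Error: typical size of a prediction error."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def r_squared(y_true, y_pred):
    """Fraction of variance in the target explained by the predictions."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

# Illustrative monthly sales figures.
actual = [120, 135, 150, 160, 155, 170]
predicted = [118, 138, 147, 162, 151, 173]
print(round(rmse(actual, predicted), 2), round(r_squared(actual, predicted), 3))
```

An R² near 1 (like the 96.3% reported) means the model explains almost all of the month-to-month variation in sales, while RMSE expresses the remaining error in the same units as the sales figures.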
Review Article
Open Access October 15, 2022

Big Data and AI/ML in Threat Detection: A New Era of Cybersecurity

Abstract The unrelenting proliferation of data, entwined with the prevalence of mobile devices, has given birth to an unprecedented growth of information obscured by noise. With the Internet of Things and myriad endpoint devices generating vast volumes of sensitive and critical data, organizations are tasked with extracting actionable intelligence from this deluge. Governments and enterprises alike, even [...] Read more.
The unrelenting proliferation of data, entwined with the prevalence of mobile devices, has given birth to an unprecedented growth of information obscured by noise. With the Internet of Things and myriad endpoint devices generating vast volumes of sensitive and critical data, organizations are tasked with extracting actionable intelligence from this deluge. Governments and enterprises alike, even under pressure from regulatory boards, have strived to harness the power of data and leverage it to enhance safety and security, maximize performance, and mitigate risks. However, the adversaries themselves have capitalized on the unequal battle of big data and artificial intelligence to inflict widespread chaos. Therefore, the demand for big data analytics and AI/ML for high-fidelity intelligence, surveillance, and reconnaissance is at its highest. Today, in the cybersecurity realm, the detection of adverse incidents poses substantial challenges due to the sheer variety, volume, and velocity of deep packet inspection data. State-of-the-art detection techniques have fallen short of detecting the latest attacks after a big data breach incident. On the other hand, computational intelligence techniques such as machine learning have reignited the search for solutions for diverse monitoring problems. Recent advancements in AI/ML frameworks have the potential to analyze IoT/edge-generated big data in near real-time and assist risk assessment and mitigation through automated threat detection and modeling in the big data and AI/ML domain. Industry best practices and case studies are examined that endeavor to showcase how big data coupled with AI/ML unlocks new dimensions and capabilities in improved vigilance and monitoring, prediction of adverse incidents, intelligent modeling, and future uncertainty quantification by data resampling correction. 
All of these avenues lead to enhanced robustness, security, safety, and performance of industrial processes, computing, and infrastructures. A view of the future and how the potential threats due to the misuse of new technologies from bandwidth to IoT/edge, blockchain, AI, quantum, and autonomous fields is discussed. Cybersecurity is again playing out at a pace set by adversaries with low entry barriers and debilitating tools. The need for innovative solutions for defense from the emerging threat landscape, harnessing the power of new technologies and collaboration, is emphasized.
Article
Open Access December 27, 2021

Leveraging AI in Urban Traffic Management: Addressing Congestion and Traffic Flow with Intelligent Systems

Abstract Traffic congestion across the globe is a multimodal problem, intertwining vehicular, pedestrian, and bicycle traffic. The relationship between the multimodal traffic flow is a key factor in understanding urban traffic dynamics. The impact of excessive congestion extends to the excessive cost spent on traffic maintenance, as well as the inherent transportation inefficiency and delayed travel times. [...] Read more.
Traffic congestion across the globe is a multimodal problem, intertwining vehicular, pedestrian, and bicycle traffic. The relationship between the multimodal traffic flow is a key factor in understanding urban traffic dynamics. The impact of excessive congestion extends to the excessive cost spent on traffic maintenance, as well as the inherent transportation inefficiency and delayed travel times. From an urban transportation standpoint, an immediate consideration on one hand is monitoring traffic conditions and demand cycles, while on the other hand inducing flow modifications that benefit the traffic network and mitigate congestion. Embedded and centralized control systems that characterize modern traffic management systems extract traffic conditions specific to their regions but lack communication between networks. Moreover, innovative methods are required to provide more accurate up-to-date traffic forecasts that characterize real-world traffic dynamics and facilitate optimal traffic management decisions. In this chapter, we briefly outline the main difficulties and complexities in modeling, managing, and forecasting traffic dynamics. We also compare various conventional and modern Intelligent Transportation Strategies in terms of accuracy and applicability, their performance, and potential opportunities for optimization of multimodal traffic flow and congestion reduction. This chapter introduces various proposed data-driven models and tools employed for traffic flow prediction and management, investigating specific strategies' strengths, weaknesses, and benefits in addressing various real-world traffic management problems. We describe that the design phase of dependable Intelligent Transportation Systems bears unique requirements in terms of the robustness, safety, and response times of their components and the encompassing system model. 
Furthermore, this architectural blueprint shares similarities with distributed coordinated search and collective adaptive systems. Models that are independent of town size yield systemic performance improvements through reconfigurable embedded functionality. These AI techniques feature elaborate anytime planners that ensure near-optimal performance with unbiased behavior as model complexity varies. Sustainable models minimize congestion during peaks, flooding, and emergencies while adhering to area-specific regulations. Security-aware and fail-safe traffic management systems must provide reasonable assurances of persistent operation under varied environmental settings, accommodating metropolises and complex traffic junctions. The chapter concludes by outlining challenges, research questions, and future research paths in the field of transportation management.
Review Article
Open Access December 27, 2021

Sustainability in Construction: Exploring the Development of Eco-Friendly Equipment

Abstract The equipment used in the construction industry is usually associated with a high impact on the environment. Although sustainable design has been shown to be a main player among the initiatives focused on reducing environmental impact, it has been driven by workers and processes, leaving equipment endeavors to more restrictive and later stages. The equipment industry has been a constant target [...] Read more.
The equipment used in the construction industry is usually associated with a high impact on the environment. Although sustainable design has been shown to be a main player among the initiatives focused on reducing environmental impact, it has been driven by workers and processes, leaving equipment endeavors to more restrictive and later stages. The equipment industry has been a constant target of environmental standards and economic pressure, but increasing technological development allows it to respond to sustainability and safety expectations while enhancing its performance. However, several limitations still cause this sector to be one of the last to reach advanced levels of development. A study identified gaps in equipment design that require greater effort to effectively support workers and companies towards sustainable construction. This chapter is based on a study aiming to understand the consolidated knowledge of technologically sustainable equipment design and to identify the challenges that remain for its full development. The findings support the development of innovative eco-friendly equipment, taking into consideration sustainable materials and product guidelines, as well as green economy initiatives. They also support complex system approaches and safety-by-design specificities to establish a corporate knowledge of sustainable equipment and align it with the new regulations of the construction industry. The chapter introduces the context of construction equipment in terms of the new challenges of providing construction work with a greater capacity for safety, from an environmental and energy efficiency perspective, and within the paradigm of sustainability. It then presents the concept of sustainable equipment and its principles, followed by a characterization of the agents involved in its life cycle.
Review Article
Open Access December 27, 2021

Financial Implications of Predictive Analytics in Vehicle Manufacturing: Insights for Budget Optimization and Resource Allocation

Abstract Factory owners and vehicle manufacturers increasingly opt for predictive analytics to inform their decisions. While predictive analytics have been proven to provide insights into the initiation of maintenance measures before a machine actually fails, the right models and features could have a significant impact on the budget spent and resources allocated. This means that financially oriented [...] Read more.
Factory owners and vehicle manufacturers increasingly opt for predictive analytics to inform their decisions. While predictive analytics have been proven to provide insights into the initiation of maintenance measures before a machine actually fails, the choice of models and features can have a significant impact on the budget spent and resources allocated. This means that financially oriented questions need to at least partially guide decisions in the planning phase of data science projects. Data-driven approaches will play an increasingly important role, yet only a few firms have reported applying logistic regression models for predictive maintenance. Moreover, to the best of available knowledge, data-driven classification models connecting vehicle component failures to delays at the assembly line have not been published. This paper applies a real-world, data-driven approach using classification models in predictive analytics for vehicle manufacturers and thereby links the financial implications of such data science projects to their results. We expand the existing literature on predictive maintenance using a unique dataset from a newly launched series of vehicles, presented as-is. Our research context is of interest to researchers and practitioners in the automotive industry who manage and plan final vehicle assembly under just-in-time principles, factoring in the consequences of component failures on the assembly process. Key findings highlight that while minor tweaking of the models is possible, their potential input into decision-making processes for budget optimization is limited.
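A minimal sketch of the kind of classification model discussed, logistic regression linking a component-failure indicator to assembly-line delays, is shown below. The data, delay rates, and single feature are synthetic assumptions for illustration, not the paper's dataset:

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, steps=5000):
    """Plain gradient-descent logistic regression (no regularization)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        grad = X.T @ (sigmoid(X @ w) - y) / len(y)
        w -= lr * grad
    return w

# Synthetic stand-in: one binary feature (component failure) driving delays.
n = 500
failure = rng.integers(0, 2, n).astype(float)
p_delay = np.where(failure == 1, 0.7, 0.1)      # assumed delay rates
delay = (rng.random(n) < p_delay).astype(float)

X = np.column_stack([np.ones(n), failure])       # intercept + failure indicator
w = fit_logistic(X, delay)
pred = sigmoid(X @ w) > 0.5
print(f"training accuracy: {(pred == delay).mean():.2f}")
```

Because delays remain partly random even when the failure indicator is known, the model's accuracy plateaus well below 100% here, which echoes the paper's finding that such models offer only limited input for budget decisions.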
Review Article
Open Access October 29, 2022

Neural Networks for Enhancing Rail Safety and Security: Real-Time Monitoring and Incident Prediction

Abstract The growth in demand for rail transportation systems within cities, together with high-speed and long-distance transportation running on a rail network, raises the issues of both rail safety and security. If an accident or an attack occurs, its consequences can be extremely severe. To mitigate the impact of these events, the real-time monitoring of a rail system is required. In that case, the [...] Read more.
The growth in demand for rail transportation systems within cities, together with high-speed and long-distance transportation running on a rail network, raises issues of both rail safety and security. If an accident or an attack occurs, its consequences can be extremely severe. To mitigate the impact of these events, real-time monitoring of the rail system is required, and improvements in monitoring can be achieved using artificial intelligence algorithms such as neural networks. Neural networks have been used to identify incidents in real time when monitoring track quality, classifying the graphical outputs of an ultrasonic system inspecting the rails and track bed; to predict incidents on the rail infrastructure caused by blocked transmission channels; and to schedule preemptive and preventative maintenance. For forecasting incidents and accidents on board trains, neural networks have been used to model passenger behavior and optimize responses during a train station evacuation. To tackle incidents and accidents on rail transport, we contribute two methodologies for detecting anomalies in real time and identifying the level of security risk: one at the maintenance level, with personnel operating along the railways, and one on board passenger trains. These methodologies were evaluated on real-world datasets and shown to achieve high accuracy. The results from these case studies also reveal the potential for network-wide applications, which could enhance security and safety on railway networks by enabling better management of network disruptions and faster identification of security issues. The speed and coverage of the information generated by these methodologies have implications for using prediction in decision support and for enhancing safety and security across the rail network.
Review Article
Open Access November 16, 2023

Innovations in Agricultural Machinery: Assessing the Impact of Advanced Technologies on Farm Efficiency

Abstract Progress in the development and adoption of technological innovations is instrumental in enhancing the efficiency of production systems across the globe. Through the introduction of cost-efficient and high-performing technologies, countries can both reduce the resource use intensity of their economies and boost the global supply of essential products. The focus of this study is to analyze the [...] Read more.
Progress in the development and adoption of technological innovations is instrumental in enhancing the efficiency of production systems across the globe. Through the introduction of cost-efficient and high-performing technologies, countries can both reduce the resource use intensity of their economies and boost the global supply of essential products. The focus of this study is to analyze the application of advanced machinery and mechanisms within the agricultural sector, a primary industry that acts as a major contributor to the gross domestic product (GDP) of many nations. Specifically, this paper provides an in-depth review of the latest impact assessments based on analytical and modeling tools conducted on agricultural machinery and production technologies. Our findings highlight the positive role played by scientific progress and innovation in driving the competitiveness, growth and improved sustainability of the agricultural sector. Over the years, advanced technologies have accelerated the development and modernization of machinery, equipment, and processes in farming. Typically, modern machinery and equipment have enabled large-scale production on farms, enhancing the cost-efficient use of both land and labor, as well as the capacity and timeliness in performing essential agricultural operations. The rapid diffusion of technical advancements has further contributed to resource savings, productivity growth, and the overall transformation of agricultural value chains. Accordingly, the implementation of appropriate enabling conditions is of vital importance in encouraging the widespread integration of technologies in agriculture, not only boosting productivity along the agri-food chain but also yielding widespread social, economic, and environmental benefits.
Review Article
Open Access October 30, 2022

Towards Autonomous Analytics: The Evolution of Self-Service BI Platforms with Machine Learning Integration

Abstract Self-service business intelligence (BI) platforms have become essential applications for exploring, analyzing, and visualizing business data in various domains. Here, we envisage that the business intelligence platform will perform automatic and autonomous data analytics with minimal to no user interaction. We aim to offer a data-driven, intelligent, and scalable infrastructure that amplifies the [...] Read more.
Self-service business intelligence (BI) platforms have become essential applications for exploring, analyzing, and visualizing business data in various domains. Here, we envisage that the business intelligence platform will perform automatic and autonomous data analytics with minimal to no user interaction. We aim to offer a data-driven, intelligent, and scalable infrastructure that amplifies the advantages of BI systems and discovers hidden and complex insights from very large business datasets, which a business analyst can miss during manual exploratory data analysis. Towards our future vision of autonomous analytics, we propose a collective machine learning model repository with an integration layer for user-defined analytical goals within the BI platform. The proposed architecture can effectively reduce the cognitive load on users for repetitive tasks, democratizing data science expertise across data workers and facilitating a less experienced user community to develop and use advanced machine learning and statistical algorithms.
Review Article
Open Access November 05, 2022

Application of Neural Networks in Optimizing Health Outcomes in Medicare Advantage and Supplement Plans

Abstract The growing complexity and variability in healthcare delivery and costs within Medicare Advantage (MA) and Medicare Supplement (Medigap) plans present significant challenges for improving health outcomes and managing expenditures. Neural networks, a subset of artificial intelligence (AI), have shown considerable promise in optimizing healthcare processes, particularly in predictive modeling, [...] Read more.
The growing complexity and variability in healthcare delivery and costs within Medicare Advantage (MA) and Medicare Supplement (Medigap) plans present significant challenges for improving health outcomes and managing expenditures. Neural networks, a subset of artificial intelligence (AI), have shown considerable promise in optimizing healthcare processes, particularly in predictive modeling, personalized treatment recommendations, and risk stratification. This paper explores the application of neural networks in enhancing health outcomes within the context of Medicare Advantage and Supplement plans. We review how deep learning models can be leveraged to predict patient risk, optimize resource allocation, and identify at-risk populations for preventive interventions. Additionally, we discuss the potential for neural networks to improve claims processing, reduce fraud, and streamline administrative burdens. By integrating various data sources, including medical records, claims data, and demographic information, neural networks enable more accurate and efficient decision-making processes. Ultimately, this approach can lead to better patient care, reduced healthcare costs, and improved satisfaction for beneficiaries of these programs. The paper concludes by highlighting the current limitations, ethical considerations, and future directions for AI adoption in the Medicare Advantage and Supplement sectors.
Review Article
Open Access November 16, 2023

Zero Carbon Manufacturing in the Automotive Industry: Integrating Predictive Analytics to Achieve Sustainable Production

Abstract
This charge-ahead paper suggests that transitioning the automotive industry towards a zero-carbon ecosystem from material to end-of-life can be accomplished through disruptive zero-carbon manufacturing in the broad area of all-electric vehicle production technology. To accomplish zero carbon emission automotive manufacturing in the vehicle assembly domain, future paradigms must converge on the decoupling of carbon dioxide emissions from automobile manufacturing and use via the design, processing, and manufacturing conditions. The envisioned zero carbon emission vehicle manufacturing domain consists of two complementary components: (a) making more efficient use of energy and (b) reducing carbon in energy use. This paper presents the status of key scientific and technological advancements to bring the manufacturing model of today to a zero-carbon ecosystem for the entire automotive industry of tomorrow. This paper suggests the groundbreaking application of dynamic and distributed predictive scheduling algorithms and open sensing and visualization technology to meet the zero carbon emission vehicle manufacturing goals. Power-aware high-performance computing clusters have recently become a viable solution for sustainable production. Advances in scalable and self-adaptive monitoring, predictive analytics, timeline-based machine learning, and digital replicas of cyber-physical systems are also seen co-evolving in the zero carbon manufacturing future. These methods are inspired by initiatives to decouple gross domestic product growth and energy-related carbon dioxide emissions. Stakeholders could co-design and implement shared roadmaps to transition the automotive manufacturing sector with relevant societal and environmental benefits. The automated mobility sector offers an industry-leading example of transforming an automotive production facility to carbon neutrality status.
The conclusions from this paper challenge automotive manufacturers to engage in industry offsetting and carbon tax programs to drive continuous improvement and circular vehicle flows via a multi-directional zero-carbon smart grid.
Review Article
Open Access December 27, 2020

Enhancing Pharmaceutical Supply Chain Efficiency with Deep Learning-Driven Insights

Abstract
The growing complexity of the operating environment urges pharmaceutical innovation. This essay addresses the need for the integration of advanced technologies in the pharmaceutical supply chain. It justifies the value proposition and presents a concrete use case for the integration of deep learning insights to make data-driven decisions. The supply chain has always been a priority for the pharmaceutical industry; research and development recognizes companies' increasing investment in big data strategies, with plans for a CAGR in big data tool adoption. The work presented herein has a preliminary explorative character to recuperate and integrate evidence from partly overlooked practical experience and know-how. The practical relevance of the essay is directed toward practitioners in pharmaceutical production, supply chain management, logistics, and regulatory agencies. The literature has shown a long-term concern for enhanced performance in the pharmaceutical supply chain network. This essay demonstrates the application of deep learning-driven insights to reveal non-evident flow dependencies. The main aim is to present a comprehensive insight into deep learning-driven decision support. The supply chain is portrayed in a holistic manner, seeking end-to-end visibility. Implications for public policy are discussed, such as data equity: many countries are protecting their populations and economic growth by building resilience and efficiency to ensure the capacity to move goods across supply chains. The implementation strategy is covered. The combined reduction of variability, efficiency as matured richness, reliability (on stochastic flows and their understanding through deep learning and data), and system noise (increased dampening through the inclusiveness of all stakeholders) results in increased responsiveness of supply chains for pharmaceutical products. 
Future work involves the integration of external data, closing the loop between planning and its application in reality.
Review Article
Open Access December 27, 2021

Predictive Analytics and Deep Learning for Logistics Optimization in Supply Chain Management

Abstract
Managing supply chains efficiently has become a major concern for organizations. One of the important factors to optimize in supply chain management is logistics. The advent of technology and the increase in data availability allow for the enhancement of the efficiency of logistics in a supply chain. This discussion focuses on the blending of analytics with innovation in logistics to improve the operations of a supply chain. An approach is presented on how predictive analytics can be used to improve logistics operations. In order to analyze big data in logistics effectively, an artificial intelligence computational technique, specifically deep learning, is employed. Two case studies are illustrated to demonstrate the practical employability of the proposed technique. This work reveals the power and potential of using predictive analytics in logistics to project various KPI values into the future based on contemporary data from logistics operations; sheds light on the innovative technique of employing deep learning-based predictive analytics in logistics; and suggests incorporating innovative techniques like deep learning with predictive analytics to develop an accurate forecasting technique in logistics, optimize operations, and prevent disruption in the supply chain. The network of supply chains has become more complex, necessitating the latest technological advancements. The sectors that have gained a fair amount of attention for the application of technology to optimize their operations are manufacturing, healthcare, aerospace, and the automotive industry. Comparatively little attention has been paid to the logistics sector, although many describe how analytics and artificial intelligence can be used in logistics to achieve higher optimization. Currently, significant research has been done in optimizing logistics operations.
Nevertheless, with the explosive volume of historical data being produced by the logistics operations of an organization, there is a great opportunity to learn valuable insights from the data accumulated over time for more long-term strategic planning. To develop the logistics operations in an organization, the use of historical data is essential to understand the trends in the operations. For example, regular maintenance planning and resource allocation based on trends are long-term activities that will not affect logistics operations immediately but can affect the business’s strategic planning in the long run. A predictive analysis technique employed on historical data of logistics can draw conclusions about future trends in logistics operations. Thus, the technique can be used to prevent the disruption of the supply chain.
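The trend-projection idea sketched in this abstract can be illustrated with a simple least-squares line fit over historical KPI values. The KPI series, units, and horizon below are hypothetical, and a production system would use the deep learning models the abstract describes rather than a straight-line fit; this is a minimal sketch of the forecasting step only.

```python
def fit_trend(values):
    """Ordinary least-squares fit of a line y = a + b*t to an
    evenly spaced KPI history (t = 0, 1, 2, ...)."""
    n = len(values)
    ts = range(n)
    t_mean = sum(ts) / n
    y_mean = sum(values) / n
    cov = sum((t - t_mean) * (y - y_mean) for t, y in zip(ts, values))
    var = sum((t - t_mean) ** 2 for t in ts)
    b = cov / var
    a = y_mean - b * t_mean
    return a, b

def project(values, horizon):
    """Project the fitted trend `horizon` steps past the last observation."""
    a, b = fit_trend(values)
    n = len(values)
    return [a + b * (n - 1 + h) for h in range(1, horizon + 1)]

# Hypothetical monthly on-time-delivery KPI (%): a slow upward trend.
history = [88.0, 88.5, 89.1, 89.4, 90.2, 90.6]
forecast = project(history, horizon=3)
```

The same interface (history in, multi-step forecast out) is what a deep learning forecaster would expose; only the fitting step would change.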
Review Article
Open Access December 27, 2022

Advancing Pain Medicine with AI and Neural Networks: Predictive Analytics and Personalized Treatment Plans for Chronic and Acute Pain Management

Abstract
There is a growing body of evidence that the number of individuals suffering from chronic and acute pain is under-reported and that the burden on the veteran, aging, athletic, and working populations is rising. Current pain management is limited by our capacity to collaborate with individuals continuing normal daily functions and self-administering pain treatments outside of traditional healthcare appointments and hospital settings. In this review, the current gap in clinical care for real-time feedback and guidance with pain management decision-making for chronic and post-operative pain treatment is defined. We examine the recent and future applications of predictive analytics for opioid use after surgery and the implementation of real-time neural networks for personalized pain management goal setting for particular individuals on the path to discharge to normal function. Integration of personalized neural networks with longitudinal data may enable the development of future treatment personalizations paired with electrical stimulation.
Review Article
Open Access December 27, 2023

Leveraging Artificial Intelligence to Enhance Supply Chain Resilience: A Study of Predictive Analytics and Risk Mitigation Strategies

Abstract
The management of supply chains is increasingly complex. This study provides a comparative cost-benefit analysis of managing various risks. It identifies the financial implications of leveraging artificial intelligence in supply chains to better address risk. Empirical results show a business case for managing some sources of risk more proactively, facilitated through predictive modeling techniques offered by AI. Across investigation streams, the use of AI results in an average total cost saving ranging from 41,254 to 4,099,617. Findings from our research can be used to inform managers and theorists about the implications of integrating AI technologies to manage risks in the supply chain. Our work also highlights areas for future research. Given the growing interest in studying sub-second forecasting, our research could be a point of departure for future investigations aimed at considering the impact of forecasting horizons such as intra-day intervals. We formulate a conceptual framework that considers how and to what extent performance evaluation metrics vary according to differences in the fidelity of predictive models and factor importance for identifying risks. We also utilize a mixed-method approach to demonstrate the applicability of our ideas in practice. Our results illustrate the financial implications of integrating AI predictive tools with business processes. Results suggest that real-world companies can circumvent inefficiencies associated with trying to manage many classes of risk via the use of AI-enhanced predictive analytics. As managers need to justify investment to top management, our work supports decision-making by providing a means of conducting a trade-off analysis at the tactical level.
Review Article
Open Access December 27, 2022

Building Scalable and Secure Cloud Architectures: Multi-Region Deployments, Auto Scaling, and Traffic Management in Azure and AWS for Microservices

Abstract
The last few years have seen an increased adoption of cloud infrastructure, which has in turn led to a growth in large-scale distributed architectures in data centers to accommodate cloud resource elasticity and resiliency better. Selecting the right approach to build secure, scalable, and reliable cloud infrastructure within a budget is always a challenge. This text focuses on offering practical solutions for designing and building a secure, scalable, and reliable cloud-based infrastructure where auto-scaling and multi-region deployments are the two key approaches to offer high availability. It covers designing secure and scalable microservices using cloud platforms. The content will provide an understanding of public cloud architecture, the design of microservices running on the cloud, and also the design patterns used in the cloud era. With real-world examples, you will learn how microservices can enable scalable distributed systems. Furthermore, you will be walked through multi-region deployments, auto-scaling, and traffic management in cloud environments, using a sample environment setup and useful tips and tricks for monitoring. Finally, you will see a mock implementation of cloud infrastructure on-premise for a private cloud or single-node cloud. By the end of this text, you will be able to build, manage, and deploy a highly scalable and reliable cloud-ready solution [1].
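The auto-scaling approach this abstract highlights can be sketched as a target-tracking rule: resize the fleet proportionally so the per-instance metric returns to its target. This is the general idea behind target-tracking policies in AWS and Azure, but the function name, parameters, and limits below are illustrative, not any provider's actual API.

```python
import math

def desired_instances(current, metric_value, target_value,
                      min_inst=1, max_inst=20):
    """Target-tracking scaling sketch: if the observed per-instance
    metric (e.g. average CPU %) is above target, scale out
    proportionally; if below, scale in. Clamp to fleet limits."""
    if metric_value <= 0:
        return min_inst
    desired = math.ceil(current * metric_value / target_value)
    return max(min_inst, min(max_inst, desired))

# 4 instances at 90% average CPU with a 60% target -> scale out to 6.
print(desired_instances(4, 90, 60))  # 6
```

Real autoscalers add cooldown periods and smoothed metrics on top of this rule to avoid oscillation.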
Review Article
Open Access December 27, 2023

Ensuring High Availability and Resiliency in Global Deployments: Leveraging Multi-Region Architectures, Auto Scaling, and Traffic Management in Azure and AWS

Abstract
Modern organizations leverage highly distributed, global deployments to provide high availability and resiliency for cloud-first applications. By hosting these applications across multiple geographic locations and relying on highly available services, organizations can prevent disruption to their business and reduce complexity by employing the scale of infrastructure offered by major cloud providers. Global deployments in the cloud are built on well-known models such as failover, load balancing, and scalability. However, traditional methods used to recover from regional failure—while effective—can be complex. Typical multi-region recovery and high availability system architectures have latency and cost risks that should be considered when facing other limitations such as deployment models in the cloud. This document describes the different traffic management techniques that can be applied to multi-region strategies, focusing on trade-offs and costs. The introduction of new traffic management techniques being applied to traditional global architectures now allows organizations to adopt cloud services more efficiently. Traffic management is much more straightforward in some environments, while others have started to leverage their traffic management platform via routing. In multi-region deployments, active-active and active-passive are the most common architectural models, allowing organizations to seamlessly handle failover, scalability, and global distribution based on business goals and requirements. However, traffic management for these infrastructures is critical to ensure balanced data distribution and efficiency, keeping costs under control and rerouting workloads when necessary. Using the new traffic management techniques will allow organizations to evolve system architectures easily based on business requirements, taking advantage of cost benefits from multiple infrastructures.
In these scenarios, traffic management becomes a crucial backbone of success to ensure that traffic is being efficiently and intelligently distributed [1].
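The active-passive model described in this abstract can be sketched as priority-based routing: traffic goes to the healthy region with the highest priority and fails over down the list. The region names, priorities, and health flags below are hypothetical stand-ins for the health probes and routing profiles a real traffic manager (for example, Azure Traffic Manager priority routing) maintains.

```python
def pick_region(regions):
    """Active-passive routing sketch: return the name of the healthy
    region with the best (lowest) priority number, failing over to the
    next region when the primary is unhealthy.

    `regions` is a list of (name, priority, healthy) tuples.
    """
    healthy = [r for r in regions if r[2]]
    if not healthy:
        raise RuntimeError("no healthy region available")
    return min(healthy, key=lambda r: r[1])[0]

regions = [("eastus", 1, False),        # primary region is down
           ("westeurope", 2, True),
           ("southeastasia", 3, True)]
print(pick_region(regions))             # fails over to "westeurope"
```

An active-active variant would instead spread traffic across all healthy regions by weight rather than picking a single winner.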
Review Article
Open Access December 17, 2024

Disaster Recovery and Application Security in Microservices: Exploring Kubernetes, Application Gateways, and Cloud Solutions for High Availability

Abstract
Unfortunately, it is not disaster recovery, high availability, or cloud technologies that are inherently difficult to understand, but rather the act of implementing them for software applications that is difficult. The unique method of implementation for a microservices architecture is explored. Regulatory compliance doesn’t stop just because an effective disaster recovery requirement is tough to satisfy for infrastructure unique to sleek microservices. The high-availability location transparency bliss offered by a cloud solution is appealing to a security engineering department. However, the headache starts when the technology presents a handful of undesirable surprises that leak RESTful microservices to the outside world. These are the challenges that post-SOA, cloud-resident, robustly scalable applications will need to address and overcome. The goal is to explore several popular methods of accomplishing these tough objectives so that engineers can further research the most practical solution. An innovative implementation that leverages Service Bus relays as an elegant disaster recovery solution while enforcing a strict subnet where RESTful microservices solely live will be discussed. The curiosity lies in the atypical experimentation beyond basic gateways and the facility of using such simplicity while still answering day-to-day software development infrastructure challenges for the applications we build. Resilient handling of full-service web proxy crashes and delivery-latency switchovers by harnessing microservices pod health will also be discussed [1].
Review Article
Open Access December 27, 2023

Leveraging Machine Learning Techniques for Predictive Analysis in Merger and Acquisition (M&A)

Abstract
M&A is a strategic concept of business growth through consolidation, gaining market access, strengthening strategic positions, and increasing operational efficiency. To understand the dynamics of M&A, this paper looks at aspects such as target firm identification, evaluation, bidding for the target firm, and post-acquisition integration. All forms of M&A, including horizontal, vertical, and conglomerate mergers and acquisitions, are discussed in terms of goals and values, including synergy, cost reduction, competitive advantages, and access to better technology. However, challenges such as cultural assimilation, adherence to regulations, and inaccurate valuation are also addressed. The paper then goes deeper to provide insight into how predictive analytics applies to M&A, using machine learning to improve decision-making with forecasting benefits. Across the healthcare, education, and construction industries, the presented predictive models, using regression analysis, neural networks, and ensemble techniques, help to make decisions. Through time series and real-time data, predictive analytics enables sound M&A strategies, effective risk management, and smooth integration.
Review Article
Open Access December 27, 2023

Understanding the Fundamentals of Digital Transformation in Financial Services: Drivers and Strategic Insights

Abstract
The current financial services sector is realising considerable changes in its operations due to developments in technology and the embrace of digital platforms. This evolution is changing the established concepts of business, consumers, and channels of service delivery. Financial services firms are changing the way they work through digital transformation due to developments in technology, changes in customer needs, and an increased emphasis on sustainability. Understanding the opportunities, risks, and new trends in digital transformation is the focus of this paper. Opportunities include efficient real-time decision-making processes, increased transparency, and better process controls, which are balanced by the threats of change management, dubious organization-technology fit, and high implementation costs. The study also examines recent advancements, including the application of machine learning and artificial intelligence, developments in mobile and online banking, integration of blockchain, and an increasing focus on security and personalised banking. A literature review yields findings from different studies on rural financial services, the evolution of the blockchain, drivers of digital transformation, cloud-based learning approaches, and emerging sustainability practices. All of these results suggest that more strategic planning, analytics, and greater focus on ensuring that organisational objectives are met through transformations should be pursued. Hence, these research findings add to the existing literature in determining how innovative and digital technologies are likely to transform the financial services sector and advance sustainability.
Review Article
Open Access November 19, 2022

Analyzing Behavioral Trends in Credit Card Fraud Patterns: Leveraging Federated Learning and Privacy-Preserving Artificial Intelligence Frameworks

Abstract
We investigate and analyze the trends and behaviors in credit card fraud attacks and transactions. First, we perform logical analysis to find hidden patterns and trends, then we leverage game-theoretical models to illustrate the potential strategies of both the attackers and defenders. Next, we demonstrate the strength of industry-scale, privacy-preserving artificial intelligence solutions by presenting the results from our recent exploratory study in this respect. Furthermore, we describe the intrinsic challenges in the context of developing reliable predictive models using more stringent protocols, and hence the need for sector-specific benchmark datasets, and provide potential solutions based on state-of-the-art privacy models. Finally, we conclude the paper by discussing future research lines on the topic, and also the possible real-life implications. The paper underscores the challenges in creating robust AI models for the banking sector. The results also showcase that privacy-preserving AI models can potentially augment sharing capabilities while mitigating liability issues of public-private sector partnerships [1].
Review Article
Open Access November 16, 2022

AI-Driven Automation in Monitoring Post-Operative Complications Across Health Systems

Abstract
Artificial intelligence systems have been previously used to predict post-operative complications in small studies and single institutions. Here we developed a robust artificial intelligence model that predicts the risk of having cardiac, pulmonary, thromboembolic, or septic complications after elective, non-cardiac, non-ambulatory surgery. We combined structured and unstructured electronic health record data from 3.5 million surgical encounters from 25 medical centers between 2009 and 2017. Our neural network model predicted postoperative comorbidities 15 to 80 times faster than classical models. As such, our model can be used to assess the risk of having a specific complication postoperatively in a fraction of a second. With our model, we believe clinicians will be able to identify high-risk surgical patients and use their good judgment to mitigate upcoming risks, ultimately improving patient outcomes [1].
Case Report
Open Access December 29, 2019

Explainable Analytics in Multi-Cloud Environments: A Framework for Transparent Decision-Making

Abstract
The multitude of services and resources available in multi-cloud environments has increased the importance of analytics applications in cloud brokering. These applications can orchestrate services and resources that reside in different domains and require inputs that a single cloud provider could not easily acquire. Yet, despite their distinct characteristics, multi-cloud analytics users have no voice in the ranking of the services in brokerage marketplaces. In this chapter, we introduce the concept and propose the implementation of explainable analytics to increase transparency and user satisfaction in multi-cloud environments. The criteria that we have identified and measured in order to summarize them in explainable results allow cloud users to acquire an understanding of the ranking rules, a crucial requirement in trustful decision-making. Our proposal accounts for a set of regulations for intelligent systems and targets their specific adaptation and use in multi-cloud environments.
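The explainable ranking described here can be sketched as a weighted-sum score whose per-criterion contributions are surfaced to the user, so they can see why a service ranked where it did. The criteria names, weights, and scores below are hypothetical, and the chapter's actual criteria and aggregation may differ; this is a minimal sketch of the idea.

```python
def rank_services(services, weights):
    """Rank cloud services by a weighted score and expose, per service,
    each criterion's contribution: the 'explanation' a brokerage
    marketplace could show its users.

    `services`: {name: {criterion: normalized score in [0, 1]}}
    `weights`:  {criterion: weight}, assumed to sum to 1.
    """
    explained = []
    for name, scores in services.items():
        contributions = {c: weights[c] * scores[c] for c in weights}
        explained.append((name, sum(contributions.values()), contributions))
    # Highest total score first.
    return sorted(explained, key=lambda e: e[1], reverse=True)

# Hypothetical criteria and measurements for two providers.
weights = {"cost": 0.5, "latency": 0.3, "compliance": 0.2}
services = {
    "cloudA": {"cost": 0.9, "latency": 0.4, "compliance": 1.0},
    "cloudB": {"cost": 0.6, "latency": 0.9, "compliance": 0.5},
}
ranking = rank_services(services, weights)
```

Because the contribution breakdown is returned alongside each total, a user can verify, for instance, that cloudA leads mainly on cost and compliance rather than latency.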
Review Article
Open Access December 29, 2020

Deep Learning Architectures for Enhancing Cyber Security Protocols in Big Data Integrated ERP Systems

Abstract
Deep learning approaches are very useful to enhance cybersecurity protocols for industry-integrated big data enterprise resource planning systems. This research study develops deep learning architectures of variational autoencoder, sparse autoencoder, and deep belief network for detecting anomalies, fraud, and preventing cybersecurity attacks. These cybersecurity issues occur in finance, human resources, supply chain, and marketing in the big data integrated ERP systems or cloud-based ERP systems. The main objectives of this creative research work are to identify the vulnerabilities in various ERP systems, databases, and the interconnected domains; to introduce a conceptual cybersecurity network model that incorporates variational autoencoders, sparse autoencoders, and deep belief networks; to evaluate the performance of the proposed cybersecurity model by employing the appropriate parameters with real-time and synthetic databases and simulated scenarios; and to validate the model performance by comparing it with traditional algorithms. A big data platform with an integrated business management system is known as an integrated ERP system, which plays an instrumental role in conducting business for various organizations in society. In recent times, as uncertainty and disparity increase, the cyber ecosystem becomes more complex, volatile, dynamic, and unpredictable. In particular, the number of cyber-attacks is increasing at an alarming rate; the resultant security breaches have a disruptive and disturbing effect on businesses around the world, with a loss of billions of dollars. To combat these threats, it is essential to develop a conceptual cybersecurity network model to secure systems by functioning as a mutually supporting and strengthening network model rather than working in isolation. 
In this dynamic and fluid environment, introducing a deep learning approach helps to support and prevent fraud and other illicit activities related to human resources and the supply chain, among others. Some cybersecurity vulnerabilities include, for example, database vulnerabilities, service level vulnerabilities, and system vulnerabilities, among others. The proposed methodology focuses only on database vulnerabilities, with the main aim of detecting and mitigating new potential vulnerabilities in other dependent domains as a future initiative.
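The anomaly-detection rule common to the autoencoder architectures named in this abstract is to flag records whose reconstruction error is far above what was seen on clean data. As a deliberately tiny stand-in, the sketch below "reconstructs" each record from the per-feature training means; a real system would use the trained variational or sparse autoencoder, but the mean-plus-k-sigma flagging rule is the same. All feature names and numbers are made up.

```python
import math

def fit_reconstructor(train):
    """Stand-in for a trained autoencoder: reconstruct every record as
    the per-feature mean of the (assumed clean) training data."""
    n = len(train)
    return [sum(row[j] for row in train) / n for j in range(len(train[0]))]

def reconstruction_error(row, center):
    return math.sqrt(sum((x - c) ** 2 for x, c in zip(row, center)))

def flag_anomalies(train, candidates, k=3.0):
    """Flag candidate records whose reconstruction error exceeds
    mean + k * stddev of the errors on the training data."""
    center = fit_reconstructor(train)
    errs = [reconstruction_error(r, center) for r in train]
    mean = sum(errs) / len(errs)
    std = math.sqrt(sum((e - mean) ** 2 for e in errs) / len(errs))
    threshold = mean + k * std
    return [i for i, r in enumerate(candidates)
            if reconstruction_error(r, center) > threshold]

# Hypothetical (amount, hour-of-day) transaction features.
normal = [(20.0, 10.0), (22.0, 11.0), (19.0, 9.0), (21.0, 10.0)]
suspect = [(20.5, 10.0), (950.0, 3.0)]   # the second is wildly off-pattern
print(flag_anomalies(normal, suspect))   # [1]
```

Swapping the mean-based reconstructor for a learned autoencoder changes only `fit_reconstructor` and `reconstruction_error`; the thresholding pipeline is unchanged.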
Review Article
Open Access December 27, 2019

The Role of Neural Networks in Advancing Wearable Healthcare Technology Analytics

Abstract
Neural networks are bringing a transformation to wearable healthcare technology analytics. These networks can analyze vast amounts of data to help in making decisions concerning patient care. Advancements in deep learning have brought neural networks to the forefront, making data analytics a straightforward process. This study helps unveil the use of ICT and AI in medical healthcare technology, crawling through some industry giants. Wearable healthcare technologies are becoming more popular every day. These technologies facilitate collecting, monitoring, and sharing every vital aspect of the human body necessary for diagnosing and treating an ailment. With the advent of global digitization, health data storage and systematic analysis are taking shape to ensure better diagnostic, preventive, and predictive healthcare. Healthcare analytics powered by neural networks can significantly improve health outcomes, maximizing individuals' potential and quality of life. The breadth and possibilities of connected devices are widening. From personal activity monitoring to quantifying every bit of health statistics, connected devices are making an impact in measurement, management, and manipulation. In healthcare, early diagnosis can be a lifesaver, and data analytics can help in a big way by making predictions that save lives. We are in another phase of the digitization era: neural network and wearable healthcare technology analytics. A neural network can be conceived of as an adaptive system made up of a large number of neurons connected in multiple layers; it processes data in a way similar to the human brain. Most neural networks consist of an 'input' layer and an 'output' layer, with hidden layers in between, and learn using a collection of algorithms.
Review Article
Open Access December 27, 2019

Predictive Analytics in Biologics: Improving Production Outcomes Using Big Data

Abstract Biopharmaceuticals, or biologics, are a burgeoning sector in the pharmaceutical industry, predicted to reach $239.4 billion by 2025. This unparalleled growth is often attributed to the enhanced specificity offered by large molecules over small molecules. The large size of the constituent proteins necessitates the continuous implementation of big data predictive analytics to elucidate the most [...] Read more.
Biopharmaceuticals, or biologics, are a burgeoning sector in the pharmaceutical industry, predicted to reach $239.4 billion by 2025. This unparalleled growth is often attributed to the enhanced specificity offered by large molecules over small molecules. The large size of the constituent proteins necessitates the continuous implementation of big data predictive analytics to elucidate the most effective candidates in the lead optimization process. The same methodologies can be applied to augmenting and optimizing the downstream production processes that comprise the majority of the development cost of any biologic, and with the advent of machine learning and automated predictive analytics, this is becoming an increasingly straightforward task. In this work, big data from cell line generation, product and process design, and large-scale lead validation studies have been used to compare the applicability of simple statistical models against black-box approaches for rapidly accelerating enzymes to the pilot plant stage. This research can be expanded to exploit the big datasets generated as biologics progress through the development pipeline, to further optimize production outcomes. Over the coming months, data from the project will be used to probe which approaches are amenable to which processes and, as a result, to various economic simulations. The computed optimization objective for the HIT must include the cost of acquiring, storing, and analyzing data to construct these predictive models, alongside the expected commercial reward of choosing an optimally ranked candidate. In this vein, perspective must be taken on the probable future price, capability outputs, and ownership issues of increasingly sophisticated data analysis software as such superstructures become more frequent.
It is frequently stated that decisions made to reduce production costs are data-driven, but that is often not the case; to truly evaluate production steps, dynamic energy and economic models need to become more commonplace. Converting process quality approaches from large questionnaires, risk analysis, and expert-opinion-driven methods to statistical, and thus more reliable, approaches is an area of future research in the analytics used herein.
Review Article
Open Access December 27, 2022

The Role of AI Driven Clinical Research in Medical Device Development: A Data Driven Approach to Regulatory Compliance and Quality Assurance

Abstract This essay explores how AI can enhance clinical research and, particularly, its pivotal role in the development of medical devices. A data-driven approach to medical device development that can streamline regulatory compliance and quality assurance is discussed. Methods that generate insights from pre-stage data and utilize it during development are detailed. The effectiveness of this approach in [...] Read more.
This essay explores how AI can enhance clinical research and, particularly, its pivotal role in the development of medical devices. A data-driven approach to medical device development that can streamline regulatory compliance and quality assurance is discussed. Methods that generate insights from pre-stage data and utilize them during development are detailed. The effectiveness of this approach in compliance audits, 510(k) submissions, and quality system audits, reducing time, effort, and risk, is analyzed. The findings are illustrated with practical examples and takeaway recommendations. When reading a scientific article, how many times have you judged the quality of the research by looking at the methodology section? Artificial intelligence algorithms can be developed with the most robust and innovative technology, but if they are not properly validated, they will be worthless in the eyes of regulatory authorities. Conversely, outdated and simplistic models can still gain regulatory clearance if robustness is effectively demonstrated. For better or worse, ethics, economics, and robustness are often sacrificed in government's constant struggle to keep up with the technological edge of AI development. The slow crawl of lawmakers is constant in every field. Automating small tasks can save time and reduce risk when playing catch-up with a changing regulatory framework, so that the rest of AI development can continue uninhibited. This work describes using FDA open data, in collaboration with a food and drug law firm, to develop several bottom-up initiatives that supply the knowledge needed for regulatory compliance and quality systems development. Methods that take pre-stage data as input and output actionable insights as models are provided. By sharing these resources and advice as academic researchers, efficiency in streamlining processes is maximized, letting more time and resources be allocated to the actual development [1].
Case Report
Open Access December 21, 2016

Advanced Natural Language Processing (NLP) Techniques for Text-Data Based Sentiment Analysis on Social Media

Abstract The field of sentiment analysis is a crucial aspect of natural language processing (NLP) and is essential in discovering the emotional undertones within text data and, hence, capturing public sentiment on a variety of issues. In this regard, this study suggests a deep learning technique for sentiment categorization on a Twitter dataset that is based on Long Short-Term Memory (LSTM) [...] Read more.
The field of sentiment analysis is a crucial aspect of natural language processing (NLP) and is essential in discovering the emotional undertones within text data and, hence, capturing public sentiment on a variety of issues. In this regard, this study suggests a deep learning technique for sentiment categorization on a Twitter dataset that is based on Long Short-Term Memory (LSTM) networks. Preprocessing is done comprehensively, feature extraction uses a bag-of-words method, and the data is split 80-20 into training and testing sets. The experimental findings demonstrate that the LSTM model outperforms conventional models, such as SVM and Naïve Bayes, with an F1-score of 99.46%, accuracy of 99.13%, precision of 99.45%, and recall of 99.25%. Additionally, AUC-ROC and PR curves validate the model's effectiveness. Although it performs well, the model consumes heavy computational resources and requires longer training time. In summary, the results show that deep learning performs well in sentiment analysis and can be applied to social media monitoring, customer feedback evaluation, market sentiment analysis, and more.
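For readers unfamiliar with the baseline pipeline the LSTM is compared against, the following sketch shows tweet preprocessing, a bag-of-words Naive Bayes classifier, and an 80-20 train/test split on a toy corpus. It is a stdlib-only illustration, not the paper's implementation; the example tweets and labels are invented.

```python
import re
from collections import Counter
from math import log

def tokenize(text):
    """Minimal tweet preprocessing: lowercase, strip URLs/mentions, keep words."""
    text = re.sub(r"https?://\S+|@\w+", "", text.lower())
    return re.findall(r"[a-z']+", text)

class NaiveBayes:
    def fit(self, texts, labels):
        self.counts = {c: Counter() for c in set(labels)}
        self.priors = Counter(labels)
        for t, c in zip(texts, labels):
            self.counts[c].update(tokenize(t))
        self.vocab = {w for c in self.counts for w in self.counts[c]}
        return self

    def predict(self, text):
        def score(c):  # log prior + Laplace-smoothed log-likelihoods
            total = sum(self.counts[c].values())
            s = log(self.priors[c])
            for w in tokenize(text):
                s += log((self.counts[c][w] + 1) / (total + len(self.vocab)))
            return s
        return max(self.counts, key=score)

data = [("I love this, great day!", 1), ("awesome product, very happy", 1),
        ("this is terrible and sad", 0), ("worst experience, I hate it", 0),
        ("so happy and great", 1), ("sad terrible news", 0)]
split = int(0.8 * len(data))  # 80-20 train/test split, as in the study
train, test = data[:split], data[split:]
clf = NaiveBayes().fit([t for t, _ in train], [c for _, c in train])
preds = [clf.predict(t) for t, _ in test]
print(preds)  # predicted labels for the held-out tweets
```

An LSTM replaces the bag-of-words step with learned sequence representations, which is what gives it the edge on noisy, context-dependent text.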
Review Article
Open Access December 27, 2021

Advancing Healthcare Innovation in 2021: Integrating AI, Digital Health Technologies, and Precision Medicine for Improved Patient Outcomes

Abstract Advances in wearables, sensors, smart devices, and electronic health records have generated patient-oriented longitudinal data sources that, analyzed with advanced analytical tools, create enormous opportunities to understand patient health conditions and needs, transforming healthcare from conventional paradigms to more patient-specific and preventive approaches. Artificial [...] Read more.
Advances in wearables, sensors, smart devices, and electronic health records have generated patient-oriented longitudinal data sources that, analyzed with advanced analytical tools, create enormous opportunities to understand patient health conditions and needs, transforming healthcare from conventional paradigms to more patient-specific and preventive approaches. Artificial intelligence (AI) with machine learning methodology is prominently considered, as it is uniquely suitable for deriving predictions and recommendations from complex patient datasets. Recent studies have shown that precise data aggregation methods play an important role in the precision and reliability of clinical outcome models. There is an essential need to develop an effective and powerful multifunctional machine learning platform that enables healthcare professionals to comprehend challenging, multifactorial biomedical datasets, understand patient-specific scenarios, and make better clinical decisions, potentially leading to optimal patient outcomes. There is a substantial drive to develop the networking and interoperability of clinical systems, laboratories, and public health. These steps are delivered in concert with efforts to provide useful analytic tools and technologies for making sense of the eruption of patient information from various sources. However, the full potential of this technology can only be realized when the ethical, legal, and social challenges related to healthcare information privacy are successfully addressed. The public and the media must be informed about the capabilities and limitations of these technologies, and the debate over public healthcare data privacy must be carefully balanced. While this is ongoing, measures must progress from reacting to patient data protection abuses toward realizing the full potential of AI technology for the health system, with benefits for all stakeholders.
Any protection program should be based on fairness, transparency, and a full commitment to data privacy. Ongoing innovative systems that use AI to manage clinical data and analyses are proposed. These tools can be used by healthcare providers, especially in defining specific scenarios related to biomedical data management and analysis. These platforms ensure that the significant and potentially predictive parameters associated with the diagnosis, treatment, and progression of disease are recognized. With the systematic use of these solutions, this work can contribute to noticeable improvements in the provision of real-time, personalized, and efficient medicine at reduced cost [1].
Case Report
Open Access December 27, 2021

Revolutionizing Risk Assessment and Financial Ecosystems with Smart Automation, Secure Digital Solutions, and Advanced Analytical Frameworks

Abstract For years, risk assessment and financial calculations have been based on mathematical, statistical, and actuarial studies of existing and historical data. The manual process of building datasets, processing data, deriving trends, identifying periodicities, and analyzing diagnostics is extremely expensive and time-consuming. With the automation and evolution of data science technologies, [...] Read more.
For years, risk assessment and financial calculations have been based on mathematical, statistical, and actuarial studies of existing and historical data. The manual process of building datasets, processing data, deriving trends, identifying periodicities, and analyzing diagnostics is extremely expensive and time-consuming. With the automation and evolution of data science technologies, organizations are now bringing in niche data, such as unstructured data, which contain more disruptive and precise signals for decision-making, thereby making predictions and derivative valuations more robust. This discussion highlights how investment decision-making and financial ecosystem activities are set to be transformed with the power of technical automation, data, and artificial intelligence. A noted trend in the financial investment sector is that financial valuations are highly predictive yet highly non-linear over long-term horizons. To understand these robust evolving signals and execute profitable strategies on them, the investment management process needs to be very dynamic, open, smart, and technically deep. However, with current manual processes, reaching high-end asset prediction still seems like a shot in the dark. In parallel, open and democratically developed financial ecosystems seek relatively low-risk premium opportunities in high-finance valuation and perception. Evolving financial ecosystems, or the use of automated tools and data to move to unique frontiers, could make high-yield profit opportunities substantially safer. Financial economic theories and realistic approximation models support this.
Review Article
Open Access December 17, 2024

An Analysis of Performance and Comparison of Models for Cardiovascular Disease Prediction via Machine Learning Models in Healthcare

Abstract Over the past few decades, cardiovascular disease and related complications have surpassed all others as the leading causes of death globally. At the moment, they are the leading cause of mortality worldwide, including in India. It is important to detect cardiovascular problems early so that patients get better care and costs go down. This project utilizes the UCI Heart [...] Read more.
Over the past few decades, cardiovascular disease and related complications have surpassed all others as the leading causes of death globally. At the moment, they are the leading cause of mortality worldwide, including in India. It is important to detect cardiovascular problems early so that patients get better care and costs go down. This project utilizes the UCI Heart Disease Dataset to develop ML and DL models capable of detecting cardiac diseases. Heart illness was classified using Convolutional Neural Network (CNN) models, which can detect intricate patterns in the supplied data. A confusion matrix, F1-score, ROC curve, accuracy, precision, and recall were among the measures used to evaluate the models. The CNN did much better than the Neural Network, Deep Neural Network (DNN), and Gradient Boosted Trees (GBT) models, with 91.71% accuracy, 88.88% precision, 82.75% recall, and 85.70% F1-score. Comparative study showed that CNN was the most accurate model, while the other models struck different balances between precision and recall. The experimental results show that the proposed CNN model is an effective way to identify cardiovascular disease, meaning it could be used in healthcare systems to find diseases earlier and treat patients better.
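The reported measures all derive from confusion-matrix counts. The following minimal sketch shows the computation; the counts are hypothetical, chosen only to illustrate the formulas, and are not taken from the paper.

```python
def metrics(tp, fp, fn, tn):
    """Accuracy, precision, recall, and F1 from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Hypothetical counts chosen only to illustrate the computation
acc, prec, rec, f1 = metrics(tp=48, fp=6, fn=10, tn=111)
print(f"accuracy={acc:.4f} precision={prec:.4f} recall={rec:.4f} f1={f1:.4f}")
```

Note that F1 is the harmonic mean of precision and recall, which is why a model can trade one against the other while the F1-score stays between them.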
Article
Open Access December 19, 2024

Intelligent Detection of Injection Attacks via SQL Based on Supervised Machine Learning Models for Enhancing Web Security

Abstract The most prevalent technique behind security data breaches is the SQL Injection Attack. Organizations and individuals suffer sensitive information exposure and unauthorized entry when attackers exploit the severe risks of SQL injection (SQLi) vulnerabilities. Static and heuristic defense methods remain the conventional detection tools for SQL injection attacks. This study's [...] Read more.
The most prevalent technique behind security data breaches is the SQL Injection Attack. Organizations and individuals suffer sensitive information exposure and unauthorized entry when attackers exploit the severe risks of SQL injection (SQLi) vulnerabilities. Static and heuristic defense methods remain the conventional detection tools for SQL injection attacks. This study's foundation is a detection system developed using a Gated Recurrent Unit (GRU) network, which aims to efficiently identify SQL Injection attacks (SQLIAs). The proposed GRU model was trained using an 80:20 train-test split, and the results showed that SQL injection attacks could be accurately identified with a precision of 97%, an accuracy of 96.65%, a recall of 92.5%, and an F1-score of 94%. The experimental results, together with the corresponding confusion matrix analysis and learning curves, demonstrate resilience and outstanding generalization ability. The GRU model outperforms conventional machine learning (ML) models, including K-Nearest Neighbors (KNN) and Support Vector Machine (SVM), in identifying sequential patterns in SQL query data. This recurrent neural architecture proves effective for detecting SQLi attacks, providing secure protection for contemporary web applications.
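Before a GRU can consume SQL queries, each query must be encoded as a fixed-length integer sequence. The sketch below shows one common character-level encoding scheme; it is an illustrative assumption, not the paper's exact preprocessing, and the example queries are invented.

```python
def build_vocab(queries):
    """Map every character seen in the corpus to an integer id (0 = padding)."""
    chars = sorted(set("".join(queries)))
    return {c: i + 1 for i, c in enumerate(chars)}

def encode(query, vocab, max_len=40):
    """Truncate to max_len characters and right-pad with zeros."""
    ids = [vocab.get(c, 0) for c in query[:max_len]]
    return ids + [0] * (max_len - len(ids))

queries = [
    "SELECT name FROM users WHERE id = 7",
    "SELECT * FROM users WHERE id = '1' OR '1'='1'",  # classic SQLi payload
]
vocab = build_vocab(queries)
batch = [encode(q, vocab) for q in queries]
print(len(batch), len(batch[0]))  # fixed-shape batch ready for a recurrent model
```

The resulting integer matrix is what an embedding layer followed by a GRU would take as input; the recurrent layer can then learn the sequential signatures (quote/operator patterns such as `' OR '1'='1'`) that distinguish injections from benign queries.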
Article
Open Access December 26, 2021

Deep Learning Applications for Computer Vision-Based Defect Detection in Car Body Paint Shops

Abstract The major automated plants have produced large volumes of high-quality products at low cost by introducing various technologies, including robotics and artificial intelligence. Defects on the surface of products entail economic loss and sometimes loss of functionality, yet defective products are rarely found. Therefore, most items' production is done based on [...] Read more.
The major automated plants have produced large volumes of high-quality products at low cost by introducing various technologies, including robotics and artificial intelligence. Defects on the surface of products entail economic loss and sometimes loss of functionality, yet defective products are rarely found. Therefore, most items' production is done based on prediction and has an invisible fluctuation in output. Detecting hidden defect images is costly and needs support for better progress and quality enhancement. Paint shop defects should be analyzed from color changes to detect defects effectively, independent of the variability of product demand over time. It is not easy to take visible-light images without noise because paint surfaces are glossy; some illumination artifacts and shadows remain in images, even in large, high-resolution ones. Computer vision models also need to reflect both the color and texture information of the various painted surfaces to classify defects precisely. Several automated detection systems have been applied to paint shop inspections using lasers, infrared, X-ray, electrical, magnetic, and acoustic sensors. The chance of a paint shop defect may be low, but when defects do occur they impact functionality; they are thus treated as "lessons learned." Lately, artificial intelligence has been introduced to the field of factory automation, and many defect detection efforts have found a foothold in machine learning and deep learning. Recent attempts at deep learning-based defect detection propose simple techniques using specific neural network architectures with big data. However, such big data is still in its early stages, and significant challenges exist in normalizing and annotating it.
To get cost-efficient and timely solutions tailored to automotive paint shops, it may be better to combine deep learning solutions with traditional computer vision and more elaborate machine learning methods.
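As a hint of what a traditional computer-vision component might contribute alongside deep learning, the following sketch flags pixels that deviate strongly from their local mean brightness, a simple blemish-detection baseline. The synthetic surface image, block size, and threshold are all assumptions made for illustration.

```python
import numpy as np

def defect_mask(gray, block=16, k=3.0):
    """Flag pixels deviating strongly from their local tile statistics,
    a classical computer-vision baseline for spotting paint blemishes."""
    h, w = gray.shape
    mask = np.zeros_like(gray, dtype=bool)
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = gray[y:y + block, x:x + block].astype(np.float64)
            dev = np.abs(tile - tile.mean())
            mask[y:y + block, x:x + block] = dev > k * (tile.std() + 1e-9)
    return mask

surface = np.full((64, 64), 200.0)                                 # uniform paint
surface += np.random.default_rng(2).normal(0, 1.0, surface.shape)  # gloss noise
surface[30:33, 40:43] = 120.0                                      # dark blemish
hits = defect_mask(surface)
print(int(hits.sum()), "pixels flagged")
```

Such a cheap first-pass filter can route only suspicious regions to a heavier neural network, which is one practical way to realize the hybrid approach suggested above.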
Review Article
Open Access December 27, 2021

An Analysis of Crime Prediction and Classification Using Data Mining Techniques

Abstract Crime is a serious and widespread problem in society, so preventing it is an essential task. A significant number of crimes are committed every day. One tool for modeling and dealing with crime is data mining. Crimes are costly to society in many ways, and they are also a major source of frustration for its members. A major area of machine learning research is crime detection. This paper [...] Read more.
Crime is a serious and widespread problem in society, so preventing it is an essential task. A significant number of crimes are committed every day. One tool for modeling and dealing with crime is data mining. Crimes are costly to society in many ways, and they are also a major source of frustration for its members. A major area of machine learning research is crime detection. This paper analyzes crime prediction and classification using data mining techniques on a crime dataset spanning 2006 to 2016. The approach begins with cleaning raw data and extracting features for data preparation. Then, machine learning and deep learning models, including RNN-LSTM, ARIMA, and Linear Regression, are applied. The performance of these models is evaluated using metrics such as Root Mean Squared Error (RMSE) and Mean Absolute Percentage Error (MAPE). The RNN-LSTM model achieved the lowest RMSE of 18.42, demonstrating superior predictive accuracy among the evaluated models. Data visualization techniques further unveiled crime patterns, offering actionable insights for crime prevention.
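The two evaluation metrics are straightforward to compute. This sketch shows RMSE and MAPE on invented monthly crime counts; the numbers are hypothetical and not from the paper's dataset.

```python
from math import sqrt

def rmse(actual, predicted):
    """Root Mean Squared Error: penalizes large misses quadratically."""
    return sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def mape(actual, predicted):
    """Mean Absolute Percentage Error, in percent (actuals must be nonzero)."""
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical monthly crime counts vs. model forecasts
actual = [120, 135, 150, 140]
predicted = [110, 140, 145, 150]
print(round(rmse(actual, predicted), 2), round(mape(actual, predicted), 2))
```

RMSE is in the units of the target (crime counts here), while MAPE is scale-free, which is why papers often report both.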
Article
Open Access December 27, 2020

Enhancing Regulatory Compliance in Finance through Big Data Analytics and AI Automation

Abstract This paper shows how Big Data Analytics (BDA) and Artificial Intelligence (AI) automation facilitate regulatory compliance in Finance. Regulatory compliance is essential in helping institutions to mitigate reputational, litigation, and financial risk. Existing literature reveals several preconditions for compliance. However, much of the literature has adopted an internal view of compliance without [...] Read more.
This paper shows how Big Data Analytics (BDA) and Artificial Intelligence (AI) automation facilitate regulatory compliance in finance. Regulatory compliance is essential in helping institutions mitigate reputational, litigation, and financial risk. Existing literature reveals several preconditions for compliance. However, much of the literature has adopted an internal view of compliance without considering external regulatory frameworks. This research draws on the cognitive model of regulation, which views regulatory compliance as a social construct. It uses a triangulation research method comprising a literature review, interviews with trade compliance experts, and a questionnaire survey of compliance practitioners to understand how regulation affects compliance and what role ICTs play in implementing it. The findings present a regulatory compliance framework comprising four cognitive stages, together with a conceptual regulatory compliance system showing how BDA and AI automation are applied to mitigate regulatory complexity and enhance regulatory compliance. The conceptual system shows how BDA and AI enable institutions to dynamically assess regulatory risk, automatically monitor compliance, and intelligently predict risk violations, mitigating regulatory complexity and preventing the production of unnecessary documents. It provides theoretical contributions to understanding regulatory evolution and compliance, and practical implications for understanding how regulation grows more complicated and how elements of a regulatory compliance system mitigate proliferating regulations. Additionally, it provides avenues for future research into competing regulatory mandates and how institutions cope with them. Regulations are important for ensuring compliance and governance in finance and for curbing systemic risk, but complying with them is difficult due to their growing volume, complexity, and fragmentation.
Institutions use large-scale Information and Communication Technologies (ICTs), such as Big Data Analytics (BDA) and Artificial Intelligence (AI) automation, to monitor compliance and mitigate regulatory complexity. However, less is known about how firms comply with regulation. Most literature does not thoroughly investigate regulatory elements nor explicitly relate them to compliance.
Review Article
Open Access December 27, 2020

Optimizing Unclaimed Property Management through Cloud-Enabled AI and Integrated IT Infrastructures

Abstract With unclaimed property assets reaching record levels, businesses have become, in some cases, overwhelmed and hamstrung by stagnant, unoptimized processes. That sentiment is compounded by ever-evolving regulatory changes, resulting in organizations struggling to hit compliance deadlines while delivering an optimal claimant experience. Often, early systems had periods of short-term success but are [...] Read more.
With unclaimed property assets reaching record levels, businesses have become, in some cases, overwhelmed and hamstrung by stagnant, unoptimized processes. That sentiment is compounded by ever-evolving regulatory changes, resulting in organizations struggling to hit compliance deadlines while delivering an optimal claimant experience. Often, early systems had periods of short-term success but are on the verge of obsolescence, resulting in stressed workflows and cumbersome integrations. Deploying an integrated IT infrastructure, supported by cloud-enabled AI, represents the quickest path to modernizing unclaimed property management. A fully integrated IT infrastructure is crucial to optimize the management of unclaimed property [1]. When lone solutions exist across an organization, companies miss out on automation opportunities generated through the interconnectedness of systems and data. AI presents organizations with the opportunity to traverse these gaps, enabling a vast library of applications to improve the perturbed workflows of unclaimed property teams. Automated data extraction, document comparison, fraudulent claim detection, and workflow completion analysis are just a few popular applications well suited for the unclaimed property space. In addition to the lagging technology currently deployed by many organizations, the unclaimed property landscape itself is evolving. Compliance issuance, asset availability, rates, the ability to collect fraudulently posted claims, and the claimant experience have all become hot-button items that are now front of mind for regulation agencies and businesses alike. Issuing duplication letters in a compliant manner, accommodating claimant inquiries regarding held assets, and managing, processing, and understanding the operational impact of rate changes are vexing problems many organizations now find themselves playing catch-up to address. 
The opportunity posed by cloud-enabled AI is furthered by economic, regulatory, and reporting-cycle pressures on unclaimed property teams to do more with the same or fewer resources. It is no longer simply a case of hitting the audit deadline and checking off a box, but an emerging priority for businesses on all sides of the market, from Fortune 500 to mid-market firms. In-house shared-service teams are comfortable monitoring and curating business data; however, unclaimed property is unknown territory, with a learning curve, compliance gaps, and operational holes that, if ignored, stand to scale up exponentially. The combined fallout from regulatory changes and the recent pandemic has only made the situation riskier, with increased volatility in balancing time-sensitive tasks against stringent regulatory deadlines and growing claimant outreach.
Review Article
Open Access December 29, 2020

Enhancing Government Fiscal Impact Analysis with Integrated Big Data and Cloud-Based Analytics Platforms

Abstract While several application domains are exploiting the added-value of analytics over various datasets to obtain actionable insights and drive decision making, the public policy management domain has not yet taken advantage of the full potential of the aforementioned analytics and data models. To this end, in this paper the authors present an overall architecture of a cloud-based environment that [...] Read more.
While several application domains are exploiting the added-value of analytics over various datasets to obtain actionable insights and drive decision making, the public policy management domain has not yet taken advantage of the full potential of the aforementioned analytics and data models. To this end, in this paper the authors present an overall architecture of a cloud-based environment that facilitates data retrieval and analytics, as well as policy modelling, creation and optimization. The environment enables data collection from heterogeneous sources, linking and aggregation, complemented with data cleaning and interoperability techniques. An innovative approach for analytics as a service is introduced and linked with a policy development toolkit, which is an integrated web-based environment to fulfil the requirements of the public policy ecosystem stakeholders [1]. Large information databases on various public issues exist, but their usage for public policy formulation and impact analysis has been limited so far, as no cloud-based service ecosystem exists to facilitate their efficient exploitation. With the increasing availability and importance of both public big and traditional data, the need to extract, link and utilize such information efficiently has arisen. Current data-driven web technologies and models are not aligned with the needs of this domain, and therefore, potential candidates for big data, cloud-based and service-oriented public policy analysis solutions should be investigated, piloted and demonstrated [2]. This paper presents the conceptual architecture of such an ecosystem based on the capabilities of state-of-the-art cloud and web technologies, as well as the requirements of its users.
Review Article
Open Access December 27, 2022

Advance of AI-Based Predictive Models for Diagnosis of Alzheimer's Disease (AD) in Healthcare

Abstract Alzheimer's disease (AD), one of the most prevalent and chronic types of dementia, disproportionately affects the elderly. AD, a fatal illness that can harm brain structures and cells long before symptoms appear, is currently incurable. Using brain MRI images from a publicly accessible Kaggle dataset, this study proposes a prediction model based [...] Read more.
Alzheimer's disease (AD), one of the most prevalent chronic types of dementia, disproportionately affects the elderly. A fatal illness that can harm brain structures and cells long before symptoms appear, AD is currently incurable. Using brain MRI images from a publicly accessible Kaggle dataset, this study proposes a prediction model based on Convolutional Neural Networks (CNNs) to support the early detection of Alzheimer's disease. The 6,400 images in the collection are labeled with four levels of dementia: non-demented, very mildly demented, mildly demented, and moderately demented. A thorough data-preparation workflow included pixel normalization, class balancing using data augmentation techniques, and image resizing to 128×128 pixels. A 3D convolutional neural network (CNN) architecture was used to improve the capture of spatial dependencies in volumetric MRI data. Key performance measures, including accuracy, precision, recall, F1-score, and log loss, were used to gauge the model's effectiveness. The model achieved an overall F1-score, accuracy, and recall of approximately 99.0%, with a precision of 99.38%. The findings demonstrate the model's potential for practical use in early AD diagnosis and establish its robustness with the help of confusion matrix analysis and performance curves.
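The preprocessing steps described above (pixel normalization and resizing to 128×128) can be sketched in a few lines. This is an illustrative stand-in rather than the study's actual code; it uses nearest-neighbour resampling, where the authors may well have used a library resizer:

```python
import numpy as np

def preprocess(img, size=128):
    """Resize a grayscale image to size x size (nearest-neighbour) and
    normalize pixel intensities to [0, 1]."""
    h, w = img.shape[:2]
    rows = np.arange(size) * h // size   # source row for each output row
    cols = np.arange(size) * w // size   # source column for each output column
    resized = img[rows][:, cols]
    return resized.astype(np.float32) / 255.0

# Example on a synthetic 176x208 slice (a plausible MRI slice resolution)
slice_ = np.random.randint(0, 256, (176, 208), dtype=np.uint8)
x = preprocess(slice_)
print(x.shape)  # (128, 128)
```

A production pipeline would typically use a proper resampler (e.g. Pillow or OpenCV) and stack slices along a depth axis to form the volumetric input expected by a 3D CNN.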
Article
Open Access December 27, 2022

Big Data-Driven Time Series Forecasting for Financial Market Prediction: Deep Learning Models

Abstract As financial markets have become more complex, so has the number of data sources, making stock price prediction a difficult but important task. The time dependencies in stock price movements tend to escape traditional models. In this work, a hybrid ARIMA-LSTM model is proposed to enhance the accuracy of stock price forecasts. Based on time series indicators like adjusted closing [...] Read more.
As financial markets have become more complex, so has the number of data sources, making stock price prediction a difficult but important task. The time dependencies in stock price movements tend to escape traditional models. In this work, a hybrid ARIMA-LSTM model is proposed to enhance the accuracy of stock price forecasts. Based on time series indicators such as the adjusted closing prices of S&P 500 stocks over a decade (2010–2019), the ARIMA-LSTM model combines the autoregressive strengths of classical time series forecasting with the sequence-learning capability of LSTM. Data preprocessing, including missing-value interpolation, outlier detection, and Min-Max scaling, guarantees data quality. The model is trained on a 90/10 training/testing split and evaluated with the main performance metrics MAE, MSE, and RMSE. The proposed ARIMA-LSTM model achieves an MAE of 0.248, an MSE of 0.101, and an RMSE of 0.319, indicating high accuracy in stock price prediction. A comparative analysis against machine learning reference models, including Artificial Neural Networks (ANN) and BP Neural Networks (BPNN), further illustrates the suitability and superiority of the ARIMA-LSTM approach, which attains the lowest MAE and strong predictive capability. This work demonstrates the efficiency of integrating classical time series models with deep learning methods for financial forecasting.
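The preprocessing and evaluation metrics named above are straightforward to reproduce; the sketch below is illustrative (the paper's own code is not given). As a sanity check on the reported figures, sqrt(0.101) ≈ 0.318, which is consistent with the stated RMSE of 0.319:

```python
import math

def min_max_scale(xs):
    """Min-Max scaling of a sequence to [0, 1]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def mae(y, yhat):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def mse(y, yhat):
    """Mean squared error."""
    return sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y)

def rmse(y, yhat):
    """Root mean squared error = sqrt(MSE)."""
    return math.sqrt(mse(y, yhat))
```

In a hybrid ARIMA-LSTM setup, these metrics would be computed on the held-out 10% split after inverse-transforming the Min-Max-scaled predictions back to price units.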
Article
Open Access December 27, 2022

Towards the Efficient Management of Cloud Resource Allocation: A Framework Based on Machine Learning

Abstract In the constantly evolving world of cloud computing, appropriate resource allocation is essential both for keeping costs down and for ensuring an uninterrupted flow of apps and services. Because of its adaptability to specific tasks and human behavior, machine learning (ML) is a desirable choice for fulfilling those needs. Efficient cloud resource allocation is critical for optimizing performance [...] Read more.
In the constantly evolving world of cloud computing, appropriate resource allocation is essential both for keeping costs down and for ensuring an uninterrupted flow of apps and services. Because of its adaptability to specific tasks and human behavior, machine learning (ML) is a desirable choice for fulfilling those needs: efficient cloud resource allocation is critical for optimizing performance and cost in cloud computing environments. To improve the precision of resource allocation, this study investigates the use of Long Short-Term Memory (LSTM) networks. The LSTM model achieved 97% accuracy, 97.5% precision, 98% recall, and a 97.8% F1-score, according to the experimental data. The confusion matrix demonstrates strong classification performance across several resource classes, while the accuracy and loss curves verify steady learning with minimal overfitting. A comparative study shows that the proposed LSTM model outperforms more conventional ML models such as Gradient Boosting (GB) and Logistic Regression (LR). These findings underscore the LSTM model's robustness and suitability for dynamic cloud environments, enabling more accurate forecasting and efficient resource management.
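The precision, recall, and F1-score reported above can be derived directly from a confusion matrix. The sketch below is a generic multi-class computation, not the study's code; note that an F1 of about 97.8% is exactly what the stated 97.5% precision and 98% recall imply, since F1 is their harmonic mean:

```python
def per_class_metrics(cm):
    """Precision, recall and F1 for each class of a square confusion matrix.

    cm[i][j] = number of samples of true class i predicted as class j.
    Returns a list of (precision, recall, f1) tuples, one per class.
    """
    n = len(cm)
    out = []
    for k in range(n):
        tp = cm[k][k]
        fp = sum(cm[i][k] for i in range(n)) - tp   # predicted k, but wrong
        fn = sum(cm[k]) - tp                        # true k, missed
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        out.append((prec, rec, f1))
    return out
```

For a multi-class resource-allocation task, per-class figures would then be averaged (macro or weighted) to obtain the headline metrics.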
Article
Open Access November 24, 2022

Bridging Traditional ETL Pipelines with AI Enhanced Data Workflows: Foundations of Intelligent Automation in Data Engineering

Abstract Machine Learning (ML) and Artificial Intelligence (AI) are having an increasingly transformative impact on all industries and are already used in many mission-critical use cases in production, bringing considerable value. Data engineering, which combines ETL pipelines with other workflows managing data and machine learning operations, is also significantly impacted. The Intelligent Data [...] Read more.
Machine Learning (ML) and Artificial Intelligence (AI) are having an increasingly transformative impact on all industries and are already used in many mission-critical production use cases, bringing considerable value. Data engineering, which combines ETL pipelines with other workflows managing data and machine learning operations, is also significantly impacted. The Intelligent Data Engineering and Automation framework offers the groundwork for intelligent automation processes. However, ML/AI are not the only disruptive forces; new Big Data technologies inspired by Web 2.0 companies are also reshaping the Internet. Companies with the largest Big Data footprints not only provide applications with a Big Data operational model but also source their competitive advantage from data in the form of AI services and, consequently, affect the cost/performance equilibrium of ETL pipelines. All of these technologies and trends help explain why traditional ETL pipeline design should adapt to current and emerging technologies and may be enhanced through artificial intelligence.
Article
Open Access December 21, 2021

Optimizing Data Warehousing for Large Scale Policy Management Using Advanced ETL Frameworks

Abstract Data warehousing is a technique for collecting, managing, and presenting data to help people analyze and use that data effectively. It involves a large database designed to support management-level staff by providing all the relevant historical data for analysis. This chapter begins with a definition of data warehousing, followed by an overview of large-scale policy management to highlight the [...] Read more.
Data warehousing is a technique for collecting, managing, and presenting data to help people analyze and use that data effectively. It involves a large database designed to support management-level staff by providing all the relevant historical data for analysis. This chapter begins with a definition of data warehousing, followed by an overview of large-scale policy management to highlight the need for data warehousing. Next, an overview of an ETL framework is presented, along with a discussion of advanced ETL techniques. The chapter concludes with an outline of performance optimization techniques for data warehousing. Data warehousing is considered a key enabler for efficient reporting and analysis, with implementation choices ranging from cost-effective desktop systems to large-scale, mission-critical data marts and warehouses containing petabytes of data. Extract, transform, and load (ETL) systems remain one of the largest cost and effort areas within data warehouse development projects, requiring significant planning and resources to build, manage, and monitor the flow of data from source systems into the data warehouse. The technology and techniques used for ETL can greatly influence the success or failure of a data warehouse. Complex business requirements for data cleansing, loading, transformation, and integration have intensified, while operational plans for real-time and near-real-time reporting add additional challenges. Parallel loading mechanisms, incremental data loading, and runtime update and insert strategies not only improve ETL performance but also optimize data warehousing performance, particularly for large-scale policy management.
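Incremental data loading, one of the optimization techniques outlined above, can be illustrated with a simple watermark scheme: only rows modified since the last run are upserted. The row format and in-memory "warehouse" below are hypothetical simplifications of what a real ETL tool would persist durably:

```python
def incremental_load(source_rows, warehouse, watermark):
    """Upsert only rows changed after the last watermark (incremental ETL).

    source_rows: iterable of dicts with 'id' and 'updated_at' keys
                 (a hypothetical row shape, for illustration).
    warehouse:   dict keyed by row id, standing in for the target table.
    Returns the new watermark for the next run.
    """
    new_watermark = watermark
    for row in source_rows:
        if row["updated_at"] > watermark:       # changed since last run
            warehouse[row["id"]] = row          # upsert: insert or update
            new_watermark = max(new_watermark, row["updated_at"])
    return new_watermark
```

Running the loader twice with the returned watermark loads nothing new the second time, which is exactly the property that keeps incremental loads cheap relative to full reloads.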
Article
Open Access December 22, 2020

Cloud Migration Strategies for High-Volume Financial Messaging Systems

Abstract Key business objectives for digital infrastructure cloud adoption are often framed in terms of reducing cost, improving fault tolerance and resilience, simplifying scale, and enabling innovation. Given the critical nature of the financial sector, however, where timeliness and price can significantly determine an outcome, cloud migration in delivery environments demands greater throughput on the [...] Read more.
Key business objectives for digital infrastructure cloud adoption are often framed in terms of reducing cost, improving fault tolerance and resilience, simplifying scale, and enabling innovation. Given the critical nature of the financial sector, however, where timeliness and price can significantly determine an outcome, cloud migration in delivery environments demands greater throughput on the critical path and, in many enterprise-scale settings, forgoes hybrid complexity and multi-cloud risks. Nevertheless, slack does exist in system designs; financial institutions enable market functionality (trading, clearing, best execution) despite potentially being able to meet such requirements with lower service levels than other verticals. A cloud multi-account structure for sensitive data, for example, naturally limits exposure when combined with observed risk. Fulfilling elasticity requirements during periods of high demand usually calls for a dedicated environment (or environments) located nearer to the operations. Components can consequently be allocated on a per-account basis or maintained as shared sink systems to which the dedicated streams write. The automation code can similarly be targeted to dedicated accounts, avoiding the resource constraints that beset such operations during industry events like emergency triage/contact desking.
Review Article
Open Access December 27, 2022

Survey of Automated Testing Frameworks and Tools for Software Quality Assurance: Challenges and Best Practices

Abstract Automated testing and software quality assurance (SQA) practices are essential for ensuring the reliability, scalability, and maintainability of modern software systems. This paper presents a review of widely used automated testing frameworks, including Keyword-Driven, Data-Driven, Behavior-Driven Development (BDD), and Record/Playback approaches, outlining their methodologies, benefits, and limitations [...] Read more.
Automated testing and software quality assurance (SQA) practices are essential for ensuring the reliability, scalability, and maintainability of modern software systems. This paper presents a review of widely used automated testing frameworks, including Keyword-Driven, Data-Driven, Behavior-Driven Development (BDD), and Record/Playback approaches, outlining their methodologies, benefits, and limitations in different development contexts. In parallel, it examines established SQA techniques such as Test-Driven Development, static analysis, and white-box testing, which provide systematic methods for defect detection and quality improvement. The study further examines the role of practical tools, such as Selenium, TestNG, and JUnit, in supporting test automation and validation activities. In addition to highlighting technical capabilities, the paper identifies common challenges faced in automation, including incomplete requirements, integration complexities, and the maintenance of evolving test suites. Recommended best practices are provided to address these issues, offering guidance for organizations seeking to strengthen their software testing processes through structured frameworks, adaptive techniques, and reliable automation tools.
Article
Open Access July 20, 2021

Quality of Experience (QoE) and Network Performance Modelling for Multimedia Traffic

Abstract This research explores the complex relationship between user-perceived Quality of Experience (QoE) and underlying network performance for multimedia traffic. As video streaming, online gaming, and interactive media dominate modern networks, ensuring consistent QoE has become a key challenge. The study develops a network performance model that integrates objective Quality of Service (QoS) [...] Read more.
This research explores the complex relationship between user-perceived Quality of Experience (QoE) and underlying network performance for multimedia traffic. As video streaming, online gaming, and interactive media dominate modern networks, ensuring consistent QoE has become a key challenge. The study develops a network performance model that integrates objective Quality of Service (QoS) parameters—such as delay, jitter, packet loss, and throughput—with subjective QoE metrics like Mean Opinion Score (MOS) and perceptual quality indices. Using simulation-based and analytical approaches, the paper evaluates how network conditions affect multimedia traffic behavior and user satisfaction. The results highlight critical thresholds for QoE degradation, enabling predictive modeling for adaptive multimedia delivery and real-time optimization. This work contributes to designing intelligent, user-centered network management systems capable of balancing resource efficiency and end-user satisfaction.
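One simple way to relate a QoS impairment to a Mean Opinion Score, in the spirit of the QoS-to-QoE mapping the study develops, is an exponential decay model (the IQX hypothesis). The constants below are hypothetical fitting parameters for illustration, not values estimated in this paper:

```python
import math

def mos_estimate(loss_pct, alpha=4.5, beta=0.4):
    """Illustrative exponential QoS-to-MOS mapping (IQX-style hypothesis).

    MOS falls exponentially from about 4.5 (excellent) toward 1 (bad) as
    packet loss grows. alpha and beta are hypothetical fitting constants.
    """
    return max(1.0, 1.0 + (alpha - 1.0) * math.exp(-beta * loss_pct))
```

Fitting such a curve per traffic class (delay, jitter, loss) is one way to locate the degradation thresholds the paper highlights, since the MOS gradient is steepest near the knee of the exponential.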
Review Article
Open Access December 20, 2024

AI for Time Series and Anomaly Detection

Abstract Time series data are increasingly prevalent across domains such as finance, healthcare, manufacturing, and IoT, making accurate forecasting and anomaly detection critical for decision-making and system reliability. Traditional statistical methods (e.g., ARIMA, Holt-Winters) often fail to capture complex temporal dependencies and high-dimensional interactions inherent in modern time series. Recent [...] Read more.
Time series data are increasingly prevalent across domains such as finance, healthcare, manufacturing, and IoT, making accurate forecasting and anomaly detection critical for decision-making and system reliability. Traditional statistical methods (e.g., ARIMA, Holt-Winters) often fail to capture the complex temporal dependencies and high-dimensional interactions inherent in modern time series. Recent advances in artificial intelligence, particularly deep learning architectures such as recurrent neural networks (RNNs), convolutional neural networks (CNNs), temporal convolutional networks (TCNs), graph neural networks (GNNs), and Transformers, have demonstrated marked improvements in modeling both univariate and multivariate series, as well as in detecting anomalies that deviate from learned norms (Darban, Webb, Pan, Aggarwal, & Salehi, 2022; Chiranjeevi, Ramya, Balaji, Shashank, & Reddy, 2024) [1,2]. Moreover, ensemble techniques and hybrid signal-processing and deep-learning pipelines show enhanced sensitivity and adaptability in real-world anomaly detection scenarios (Iqbal, Amin, Alsubaei, & Alzahrani, 2024) [3]. In this work, we provide a unified survey and comparative analysis of AI-driven time series forecasting and anomaly detection methods, highlight key industrial application domains, evaluate performance trade-offs (e.g., accuracy vs. latency, supervised vs. unsupervised learning), and discuss emerging challenges including interpretability, data drift, real-time deployment on edge devices, and the integration of causal reasoning. Our findings suggest that while AI approaches significantly outperform classical techniques in many settings, careful consideration of data characteristics, evaluation metrics, and deployment environment remains essential for effective adoption.
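As a concrete example of the classical baselines such deep models are compared against, a trailing-window z-score detector flags points that deviate sharply from recent history. The window size and threshold below are illustrative defaults, not values from the survey:

```python
import statistics

def zscore_anomalies(series, window=20, threshold=3.0):
    """Flag points deviating more than `threshold` standard deviations
    from the mean of a trailing window. Returns a bool per point."""
    flags = []
    for i, x in enumerate(series):
        hist = series[max(0, i - window):i]     # trailing history only
        if len(hist) < 2:
            flags.append(False)                 # not enough context yet
            continue
        mu = statistics.fmean(hist)
        sd = statistics.stdev(hist)
        flags.append(sd > 0 and abs(x - mu) > threshold * sd)
    return flags
```

This kind of unsupervised baseline is cheap and interpretable, which is precisely the trade-off (accuracy vs. latency, supervised vs. unsupervised) the survey weighs against deep architectures.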
Article
Open Access December 27, 2021

Best Practices of CI/CD Adoption in Java Cloud Environments: A Review

Abstract The continuous integration (CI) and continuous delivery/deployment (CD) methods are key tools in the field of modern software development, and they assist in the rapid, reliable and quality delivery of software. These DevOps methods are automated, and the code development, testing, and deployment processes are streamlined, which reduces the risk of integration, enhances productivity, and minimizes [...] Read more.
The continuous integration (CI) and continuous delivery/deployment (CD) methods are key tools in modern software development, assisting in the rapid, reliable, and high-quality delivery of software. These DevOps methods automate and streamline the code development, testing, and deployment processes, which reduces integration risk, enhances productivity, and minimizes manual labor. To implement CI/CD, Java cloud applications can utilize cloud-native services such as AWS CodePipeline, Azure DevOps, and Google Cloud Build, as well as tools like Jenkins, GitLab CI/CD, GitHub Actions, CircleCI, Travis CI, and Bamboo. Basic concepts of CI/CD include automation, frequent integration, rigorous testing, constant feedback, and continuous process improvement. Major pipeline phases include source code management, build automation, testing, artefact management, deployment, and monitoring. Despite clear benefits, challenges remain, including infrastructure complexity, dependency management, test reliability, and cultural barriers, particularly in large-scale or enterprise Java projects. This work provides a thorough analysis of CI/CD procedures and resources, including frameworks, best practices, and challenges for Java cloud applications. It highlights strategies to optimize adoption, improve software quality, and accelerate delivery cycles.
Review Article
Open Access December 27, 2023

MLOps Frameworks for Reliable Model Deployment in Cloud Data Platforms

Abstract Machine learning operations (MLOps) comprises the practices, methods, and tooling that facilitate the deployment of reliable ML models in production environments. While many aspects of cloud data platforms are designed to enable reliability, only some managed ML services support the MLOps goals of continuous integration, continuous delivery, data lineage tracking, associated reproducibility, [...] Read more.
Machine learning operations (MLOps) comprises the practices, methods, and tooling that facilitate the deployment of reliable ML models in production environments. While many aspects of cloud data platforms are designed to enable reliability, only some managed ML services support the MLOps goals of continuous integration, continuous delivery, data lineage tracking, associated reproducibility, governance, and security. Furthermore, reliability encompasses not only the fulfillment of service-level objectives, but also systematic monitoring, alerting, and incident response automation. Architectural patterns are proposed to enable reliable deployment in cloud data platforms, focusing on the implementation of continuous integration and testing pipelines for ML models and the formulation of continuous delivery and rollout strategies. Continuous integration pipelines reduce the risk of regressions and ensure sufficient model performance at the time of deployment, while continuous delivery pipelines enable rapid updates to production models within acceptable risk profiles. The landscape of publicly available MLOps frameworks, tools, and services is also examined, emphasizing the pros and cons of established and emerging solutions in containerization, orchestration, model serving, and inference. Containerization and orchestration contribute to the building of reliable deployment pipelines in cloud data platforms, whether through general-purpose tools (e.g., Docker and Kubernetes) or solutions tailored for ML workloads. Containerized serving frameworks designed for high-throughput, low-latency inference can benefit a wide range of business applications, while auto-scaling and model versioning capabilities enhance the ease of use of cloud-native ML services.
Review Article
Open Access June 28, 2016

Scalable Task Scheduling in Cloud Computing Environments Using Swarm Intelligence-Based Optimization Algorithms

Abstract Effective task scheduling in cloud computing is crucial for optimizing system performance and resource utilization. Traditional scheduling methods often struggle to adapt to the dynamic and complex nature of cloud environments, where workloads, resource availability, and task requirements constantly change. Swarm intelligence-based optimization algorithms, such as Particle Swarm Optimization [...] Read more.
Effective task scheduling in cloud computing is crucial for optimizing system performance and resource utilization. Traditional scheduling methods often struggle to adapt to the dynamic and complex nature of cloud environments, where workloads, resource availability, and task requirements constantly change. Swarm intelligence-based optimization algorithms, such as Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), and Artificial Bee Colony (ABC), offer a promising solution by mimicking natural processes to explore large search spaces efficiently. These algorithms are effective in balancing multiple objectives, including minimizing execution time, reducing energy consumption, and ensuring fairness in resource allocation. They also enhance system scalability, which is vital for modern cloud infrastructures. However, challenges remain, including slow convergence speeds, complex parameter tuning, and integration with existing cloud frameworks. Addressing these issues will be essential for the practical implementation of swarm intelligence in cloud task scheduling, helping to improve resource management and overall system performance.
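The PSO approach described above can be sketched for task-to-machine scheduling. This toy implementation minimizes makespan; the inertia and acceleration coefficients are illustrative textbook-style choices, not tuned values from any of the surveyed works, and continuous particle positions are decoded into machine indices by truncation:

```python
import random

def makespan(assign, times, machines):
    """Completion time of the most loaded machine."""
    loads = [0.0] * machines
    for task, m in enumerate(assign):
        loads[m] += times[task]
    return max(loads)

def pso_schedule(times, machines, particles=20, iters=60, seed=0):
    """Assign tasks to machines via Particle Swarm Optimization."""
    rng = random.Random(seed)
    n = len(times)

    def decode(pos):  # continuous position -> one machine index per task
        return [min(machines - 1, max(0, int(p))) for p in pos]

    X = [[rng.uniform(0, machines) for _ in range(n)] for _ in range(particles)]
    V = [[0.0] * n for _ in range(particles)]
    P = [x[:] for x in X]                                   # personal bests
    pbest = [makespan(decode(x), times, machines) for x in X]
    g = min(range(particles), key=pbest.__getitem__)
    G, gbest = P[g][:], pbest[g]                            # global best

    for _ in range(iters):
        for i in range(particles):
            for d in range(n):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (0.7 * V[i][d]                    # inertia
                           + 1.4 * r1 * (P[i][d] - X[i][d]) # cognitive pull
                           + 1.4 * r2 * (G[d] - X[i][d]))   # social pull
                X[i][d] += V[i][d]
            f = makespan(decode(X[i]), times, machines)
            if f < pbest[i]:
                P[i], pbest[i] = X[i][:], f
                if f < gbest:
                    G, gbest = X[i][:], f
    return decode(G), gbest
```

The fitness here is a single objective (makespan); the multi-objective balancing the review discusses (energy, fairness) would replace `makespan` with a weighted or Pareto-based fitness function.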