Open Access February 06, 2026

Predictive Modeling of Public Sentiment Using Social Media Data and Natural Language Processing Techniques

Abstract
Social media platforms like X (formerly Twitter) generate vast volumes of user-generated content that provide real-time insights into public sentiment. Despite the widespread use of traditional machine learning methods, their limitations in capturing contextual nuances in noisy social media text remain a challenge. This study leverages the Sentiment140 dataset, comprising 1.6 million labeled tweets, and develops predictive models for binary sentiment classification using Naive Bayes, Logistic Regression, and the transformer-based BERT model. Experiments were conducted on a balanced subset of 12,000 tweets after comprehensive NLP preprocessing. Evaluation using accuracy, F1-score, and confusion matrices revealed that BERT significantly outperforms traditional models, achieving an accuracy of 89.5% and an F1-score of 0.89 by effectively modeling contextual and semantic nuances. In contrast, Naive Bayes and Logistic Regression demonstrated reasonable but consistently lower performance. To support practical deployment, we introduce SentiFeel, an interactive tool enabling real-time sentiment analysis. While resource constraints limited the dataset size and training epochs, future work will explore full corpus utilization and the inclusion of neutral sentiment classes. These findings underscore the potential of transformer models for enhanced public opinion monitoring, marketing analytics, and policy forecasting.
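The Logistic Regression baseline described in this abstract can be sketched with scikit-learn; the four toy tweets and labels below are illustrative stand-ins for the Sentiment140 data, not the actual corpus or the paper's preprocessing.

```python
# Sketch of a TF-IDF + Logistic Regression baseline for binary sentiment
# classification; toy data only, not the Sentiment140 corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

tweets = [
    "I love this phone, works great",
    "what a wonderful day, feeling happy",
    "absolutely terrible service, very angry",
    "worst movie ever, waste of time",
]
labels = [1, 1, 0, 0]  # 1 = positive, 0 = negative

# TF-IDF features feeding a linear classifier: the classic
# pre-transformer setup the study compares BERT against.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(tweets, labels)

prediction = model.predict(["happy with this great phone"])[0]
```

A transformer such as BERT replaces the TF-IDF step with contextual embeddings, which is what lets it capture the nuances the abstract says the linear models miss.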
Article
Open Access January 11, 2025

Exploring LiDAR Applications for Urban Feature Detection: Leveraging AI for Enhanced Feature Extraction from LiDAR Data

Abstract
The integration of LiDAR and Artificial Intelligence (AI) has revolutionized feature detection in urban environments. LiDAR systems, which utilize pulsed laser emissions and reflection measurements, produce detailed 3D maps of urban landscapes. When combined with AI, this data enables accurate identification of urban features such as buildings, green spaces, and infrastructure. This synergy is crucial for enhancing urban development, environmental monitoring, and advancing smart city governance. LiDAR, known for its high-resolution 3D data capture capabilities, paired with AI, particularly deep learning algorithms, facilitates advanced analysis and interpretation of urban areas. This combination supports precise mapping, real-time monitoring, and predictive modeling of urban growth and infrastructure. For instance, AI can process LiDAR data to identify patterns and anomalies, aiding in traffic management, environmental oversight, and infrastructure maintenance. These advancements not only improve urban living conditions but also contribute to sustainable development by optimizing resource use and reducing environmental impacts. Furthermore, AI-enhanced LiDAR is pivotal in advancing autonomous navigation and sophisticated spatial analysis, marking a significant step forward in urban management and evaluation. The reviewed paper highlights the geometric properties of LiDAR data, derived from spatial point positioning, and underscores the effectiveness of machine learning algorithms in object extraction from point clouds. The study also covers concepts related to LiDAR imaging, feature selection methods, and the identification of outliers in LiDAR point clouds. Findings demonstrate that AI algorithms, especially deep learning models, excel in analyzing high-resolution 3D LiDAR data for accurate urban feature identification and classification. 
These models leverage extensive datasets to detect patterns and anomalies, improving the detection of buildings, roads, vegetation, and other elements. Automating feature extraction with AI minimizes the need for manual analysis, thereby enhancing urban planning and management efficiency. Additionally, AI methods continually improve with more data, leading to increasingly precise feature detection. The results indicate that the pulse emitted by continuous wave LiDAR sensors changes when encountering obstacles, causing discrepancies in measured physical parameters.
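As a concrete instance of the outlier identification in LiDAR point clouds that the review discusses, a statistical outlier-removal pass can be sketched as follows; the synthetic points and the choices of k and threshold are illustrative assumptions, not parameters from the reviewed work.

```python
import numpy as np

# Statistical outlier removal for a point cloud: a point is dropped when its
# mean distance to its k nearest neighbours is far above the global average.
rng = np.random.default_rng(0)
cloud = rng.normal(size=(200, 3))                  # dense synthetic returns
cloud = np.vstack([cloud, [[50.0, 50.0, 50.0]]])   # one spurious far-away return

def remove_outliers(points, k=8, n_std=2.0):
    # pairwise distances; column 0 of the sorted matrix is the self-distance
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    knn_mean = np.sort(d, axis=1)[:, 1:k + 1].mean(axis=1)
    keep = knn_mean < knn_mean.mean() + n_std * knn_mean.std()
    return points[keep]

filtered = remove_outliers(cloud)
```

Real pipelines use spatial indexes (k-d trees, octrees) instead of the brute-force distance matrix, but the filtering criterion is the same.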
Article
Open Access January 10, 2025

Artificial Immune Systems: A Bio-Inspired Paradigm for Computational Intelligence

Abstract
Artificial Immune Systems (AIS) are bio-inspired computational frameworks that emulate the adaptive mechanisms of the human immune system, such as self/non-self discrimination, clonal selection, and immune memory. These systems have demonstrated significant potential in addressing complex challenges across optimization, anomaly detection, and adaptive system control. This paper provides a comprehensive exploration of AIS applications in domains such as cybersecurity, resource allocation, and autonomous systems, highlighting the growing importance of hybrid AIS models. Recent advancements, including integrations with machine learning, quantum computing, and bioinformatics, are discussed as solutions to scalability, high-dimensional data processing, and efficiency challenges. Core algorithms, such as the Negative Selection Algorithm (NSA) and Clonal Selection Algorithm (CSA), are examined, along with limitations in interpretability and compatibility with emerging AI paradigms. The paper concludes by proposing future research directions, emphasizing scalable hybrid frameworks, quantum-inspired approaches, and real-time adaptive systems, underscoring AIS's transformative potential across diverse computational fields.
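The Negative Selection Algorithm (NSA) named above can be sketched in a few lines: candidate detectors are generated at random and kept only if they do not match any "self" sample, so surviving detectors cover non-self space. The 2-D self region, detector count, and matching radius here are illustrative assumptions, not values from any particular AIS implementation.

```python
import numpy as np

# Minimal Negative Selection Algorithm sketch: censor random detectors
# against "self" data, then flag anything a surviving detector matches.
rng = np.random.default_rng(1)
self_set = rng.uniform(0.0, 0.4, size=(50, 2))   # samples of normal behaviour
radius = 0.1                                      # detector matching radius

detectors = []
while len(detectors) < 30:
    candidate = rng.uniform(0.0, 1.0, size=2)
    # censoring step: discard candidates that match any self sample
    if np.all(np.linalg.norm(self_set - candidate, axis=1) > radius):
        detectors.append(candidate)
detectors = np.array(detectors)

def is_anomalous(x):
    # non-self if any surviving detector matches the sample
    return bool(np.any(np.linalg.norm(detectors - x, axis=1) <= radius))
```

By construction no detector matches a training self sample, which is the self/non-self discrimination property the abstract describes; the Clonal Selection Algorithm adds mutation and selection on top of such detectors.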
Article
Open Access February 17, 2024

An Overview of Short- and Long-Term Adverse Outcomes and Complications of Perinatal Depression on Mother and Offspring

Abstract
According to the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5), an antenatal or postpartum major depressive episode (MDE) is defined as daily, sustained sad mood and/or loss of enjoyment or interest for a minimum of two weeks, plus four associated manifestations (only three if both major symptoms are present), beginning during pregnancy or during the first 4 weeks postpartum, respectively: 1) unintentional, notable weight loss or gain; 2) hypersomnia or insomnia; 3) a sensation of tiredness; 4) feelings of guilt or worthlessness; 5) declined capacity for concentration; 6) frequent suicidal thoughts; 7) psychomotor agitation or retardation. Perinatal depression carries serious adverse consequences for the mother's psychosocial life, for pregnancy and delivery outcomes, and for her relationships, particularly with the newborn, whose overall health is poorer; it negatively influences the offspring from intrauterine life, through a complicated delivery and an unstable childhood, into unhealthy adolescence and adulthood. These negative consequences demand close attention to prevention, screening, and prompt treatment of antenatal and postnatal depression to avert such disastrous effects.
Brief Review
Open Access November 30, 2022

A Review of Application of LiDAR and Geospatial Modeling for Detection of Buildings Using Artificial Intelligence Approaches

Abstract
Today, three-dimensional models of real-world features are very important and widely used, and they have attracted the attention of researchers in various fields, including surveying and spatial information systems, as well as those interested in the three-dimensional reconstruction of buildings. The building is the key piece of information in a three-dimensional city model, so extracting and modeling buildings from remote sensing data is an important step in constructing a digital model of a city. LiDAR technology, owing to its ability to map in one-dimensional, two-dimensional, and three-dimensional modes, is a suitable solution for providing comprehensive, high-resolution images of buildings in an urban environment. In this review article, a comprehensive review of the methods used to identify buildings, from the past to the present, together with appropriate solutions for the future, is presented.
Review Article
Open Access November 29, 2022

The Application of Machine Learning in the Corona Era, With an Emphasis on Economic Concepts and Sustainable Development Goals

Abstract
The aim of this article is to examine the impacts of Coronavirus Disease 2019 (COVID-19) vaccines on economic conditions and the Sustainable Development Goals; in other words, we study economic conditions during COVID-19. We examine the economic costs of the pandemic; the benefits in terms of gross domestic product (GDP), public finances, and employment; investment in vaccines around the world; progress and, overall, the economic impacts of vaccines; and the impact of emerging markets (EM) on achieving the Sustainable Development Goals (SDGs), including no poverty, good health and well-being, zero hunger, reduced inequality, etc. The importance of emerging economies in reducing the harmful effects of the coronavirus is also noted. As a case study, we forecast daily new deaths in Iran from February 2020 to August 2021 using an Artificial Neural Network (ANN) with the Beetle Antennae Search (BAS) algorithm, alongside econometric models and regression analysis. The findings show that COVID-19 has had devastating economic and health effects on the world, and that vaccines can be very helpful in eliminating these effects, especially in the long term. We observed inequality in the distribution of COVID-19 vaccines between rich and poor countries, a gap that emerging markets can help narrow. The results show that the two approaches (artificial intelligence (AI) and econometric models) yield almost the same results, but AI optimization can make the model and its predictions more robust. The main contribution of this article is that we survey the impacts of vaccination from a socio-economic viewpoint rather than merely reporting facts: we survey the impacts of vaccines on the Sustainable Development Goals and the role of EM in achieving them. In addition to the theoretical framework, we also present quantitative and empirical results that have rarely been seen in other articles.
Article
Open Access June 28, 2025

Development of a Hemodialysis Data Collection and Clinical Information System and Establishment of an Intradialytic Blood Pressure/Pulse Rate Predictive Model

Abstract
This research is a cross-disciplinary collaboration involving a university team, a partnering corporation, and a hemodialysis clinic, situated in the field of the Artificial Intelligence of Things (AIoT) within the medical informatics domain. The research has two objectives: (1) the development of an Internet of Things (IoT)-based information system customized for the hemodialysis machines at the clinic, including transmission bridges, a dedicated web/app for clinical personnel, and a backend server; the system has been deployed at the clinic and is now officially operational; (2) the use of de-identified, anonymous data (collected by the operational system) to train, evaluate, and compare deep-learning-based intradialytic blood pressure (BP)/pulse rate (PR) predictive models, with subsequent suggestions provided. Both objectives were executed under the supervision of the Institutional Review Board (IRB) at Mackay Memorial Hospital in Taiwan. The system completed for objective one has introduced three significant services to the clinic: automated hemodialysis data collection, digitized data storage, and an information-rich human-machine interface with graphical data displays, replacing traditional paper-based clinical administrative operations and thereby enhancing healthcare efficiency. The graphical data presented through the web and app interfaces aids real-time, intuitive comprehension of patients' conditions during hemodialysis. Moreover, the data stored in the backend database is available for physicians to conduct relevant analyses, unearth insights into medical practices, and provide precise medical care for individual patients.
The training and evaluation of the predictive models for objective two, along with the related comparisons, analyses, and recommendations, suggest that in situations with limited computational resources and data, an Artificial Neural Network (ANN) with six hidden layers, the SELU activation function, and a focus on artery-related features can be employed for hourly intradialytic BP/PR prediction. We believe this work contributes to the collaborating clinic and to the relevant research communities.
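The kind of network this recommendation describes, six SELU hidden layers feeding a regression head, can be sketched in numpy. The layer widths, feature count, and random weights below are illustrative assumptions, not the paper's trained model or its actual inputs.

```python
import numpy as np

# Forward pass of a six-hidden-layer SELU network for BP/PR-style regression.
# Architecture and features are illustrative, not the paper's values.
ALPHA, SCALE = 1.6732632423543772, 1.0507009873554805  # standard SELU constants

def selu(x):
    return SCALE * np.where(x > 0, x, ALPHA * (np.exp(x) - 1.0))

rng = np.random.default_rng(0)
sizes = [12, 64, 64, 64, 64, 64, 64, 3]  # 12 inputs -> 6 hidden -> SBP/DBP/PR
weights = [rng.normal(0, np.sqrt(1.0 / m), size=(m, n))
           for m, n in zip(sizes[:-1], sizes[1:])]

def forward(x):
    for w in weights[:-1]:
        x = selu(x @ w)          # SELU keeps activations self-normalizing
    return x @ weights[-1]       # linear output head for regression targets

features = rng.normal(size=(1, 12))   # hypothetical hourly session features
pred = forward(features)
```

SELU's self-normalizing property (with the matching variance-scaled initialization used above) is one common motivation for choosing it over ReLU in small tabular-data networks.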
Article
Open Access April 10, 2025

Advancements in Pharmaceutical IT: Transforming the Industry with ERP Systems

Abstract
The pharmaceutical industry is undergoing a profound transformation driven by advancements in Information Technology (IT), with Enterprise Resource Planning (ERP) systems playing a pivotal role in reshaping operations. These systems offer integrated solutions that streamline key business processes, such as production, inventory management, supply chain optimization, regulatory compliance, and data integration, contributing significantly to operational efficiency and organizational agility. This paper explores the evolution and impact of ERP systems within the pharmaceutical sector, highlighting their contributions to overcoming the industry’s inherent challenges, including complex regulatory requirements, the need for accurate and real-time data, and the demand for supply chain resilience. The integration of cloud-based ERP solutions, the incorporation of emerging technologies like Artificial Intelligence (AI), Machine Learning (ML), and the Internet of Things (IoT), and enhanced data analytics capabilities have revolutionized pharmaceutical IT. These advancements not only reduce operational costs, improve forecasting accuracy, and enhance collaboration but also ensure compliance with stringent global regulations, such as Good Manufacturing Practices (GMP) and FDA guidelines. Moreover, ERP systems have been instrumental in managing the pharmaceutical supply chain, ensuring product traceability, and improving inventory control and order fulfillment processes. This manuscript examines how ERP systems enable pharmaceutical companies to maintain high standards of product quality, improve decision-making, and ensure the safety and efficacy of drugs through robust tracking and auditing mechanisms. A case study of a pharmaceutical company that implemented an ERP system demonstrates the tangible benefits, including increased operational efficiency, improved compliance rates, and enhanced customer satisfaction. 
However, despite the clear advantages, challenges such as customization complexities, data integration issues, and resistance to change remain. As the pharmaceutical industry continues to evolve, ERP systems will remain a cornerstone of digital transformation, facilitating smarter decision-making, better resource management, and enhanced collaboration across global operations. This paper also identifies future trends, including the potential of AI and blockchain technologies in further strengthening ERP systems and transforming the pharmaceutical landscape.
Review Article
Open Access March 22, 2025

Enhancing Scalability and Performance in Analytics Data Acquisition through Spark Parallelism

Abstract
Data acquisition serves as a critical component of modern data architecture, with REST API integration emerging as one of the most common approaches for sourcing external data. This study evaluates the efficiency of various methodologies for collecting data via REST APIs and benchmarks their performance. It explores how leveraging the Spark distributed computing platform can optimize large-scale REST API calls, enabling enhanced scalability and improved processing speeds to meet the demands of high-volume data workflows.
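The fan-out pattern being benchmarked, many independent REST calls issued in parallel, can be sketched locally with a thread pool. Here `fetch_page` is a hypothetical stub standing in for a real HTTP call; on Spark, the same per-page loop would typically run inside `mapPartitions` on the executors rather than in local threads.

```python
from concurrent.futures import ThreadPoolExecutor

# Local sketch of parallel REST API acquisition. fetch_page is a stub for
# something like requests.get(f"{base_url}?page={page}").json().
def fetch_page(page: int) -> list[dict]:
    return [{"page": page, "row": i} for i in range(3)]

def acquire(pages, max_workers=8):
    # threads suit I/O-bound API calls; each worker pulls one page at a time
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        batches = pool.map(fetch_page, pages)
        return [row for batch in batches for row in batch]

rows = acquire(range(10))
```

The scalability gain the study measures comes from distributing the same fan-out across a cluster: each Spark partition holds a slice of the page list and issues its calls concurrently on its executor.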
Review Article
Open Access February 26, 2025

Innovations and Challenges in Pharmaceutical Supply Chain, Serialization and Regulatory Landscape

Abstract
The pharmaceutical supply chain has become increasingly complex and vulnerable to various risks, including counterfeit drugs, diversion, and fraud. As these challenges threaten patient safety and the integrity of global healthcare systems, serialization has emerged as a pivotal innovation in pharmaceutical logistics and regulatory compliance. Serialization involves assigning unique identifiers to individual drug packages, enabling precise tracking and authentication at every stage of the supply chain. This process provides unprecedented transparency, enhances product security, and facilitates real-time monitoring of pharmaceutical products as they move from manufacturers to end consumers. Despite its potential to revolutionize pharmaceutical traceability, the integration of serialization technologies faces numerous obstacles. These include high implementation costs, regulatory inconsistencies across regions, and the technological challenges of managing vast amounts of data. Moreover, the complex, multi-tiered nature of the global supply chain introduces additional risks related to data integrity, cybersecurity, and interoperability between systems. As pharmaceutical companies seek to navigate these challenges, innovations in serialization technology—such as blockchain, artificial intelligence (AI), the Internet of Things (IoT), and radio frequency identification (RFID)—are providing promising solutions to enhance efficiency, reduce fraud, and increase visibility. This manuscript explores both the innovative advancements and the key challenges associated with the integration of serialization in the pharmaceutical supply chain. It delves into the evolving regulatory landscape, highlighting the need for global harmonization of serialization standards, and examines the impact of serialization on securing pharmaceutical distribution networks. 
Additionally, the paper emphasizes the importance of collaboration among manufacturers, technology providers, and regulatory bodies in overcoming implementation barriers and realizing the full potential of serialization. As the pharmaceutical industry moves towards a more interconnected and data-driven future, serialization promises to play a central role in shaping the next generation of drug safety and supply chain management. By addressing the hurdles to adoption and leveraging emerging technologies, the pharmaceutical sector can create a more secure, transparent, and efficient supply chain that better serves public health and fosters greater trust among consumers and healthcare professionals alike.
Review Article
Open Access February 09, 2025

The Future of Longevity Medicine from the Lens of Digital Therapeutics

Abstract
Digital therapeutics (DTx) are emerging as a pivotal tool in promoting longevity by addressing non-communicable diseases (NCDs) such as diabetes, cardiovascular diseases, and mental health disorders. These software-driven interventions offer personalized, evidence-based treatments that can be accessed via digital devices, making healthcare more accessible and scalable. One of the key advancements in DTx is the integration of artificial intelligence (AI) and machine learning (ML) to tailor interventions based on individual health data. This personalization enhances the effectiveness of treatments and supports preventive care by identifying risk factors early. The need for digital therapeutics is underscored by the rising prevalence of NCDs, which are responsible for a significant portion of global mortality and healthcare costs. Traditional healthcare systems often struggle to provide timely and personalized care, especially in low-resource settings. DTx can bridge this gap by offering cost-effective solutions that are easily scalable. Moreover, digital therapeutics can address health inequities by providing low-cost interventions to underserved populations, thereby reducing the burden of NCDs and improving overall health outcomes. As technology continues to evolve, the potential for DTx to enhance longevity and quality of life becomes increasingly promising. Recent advancements in longevity medicine and technology have focused on extending both lifespan and healthspan, ensuring that people not only live longer but also maintain good health throughout their extended years. This review article highlights these advancements that are contributing to this compelling subject of Longevity.
Review Article
Open Access January 22, 2025

Tech Transformations: Modern Solutions for Obstructive Sleep Apnea

Abstract
Recent advancements in the screening, diagnosis, and management of obstructive sleep apnea (OSA) have significantly improved patient outcomes. For screening, the use of home sleep apnea testing (HSAT) has become more prevalent, offering a convenient and cost-effective alternative to traditional in-lab polysomnography. HSAT devices have shown good specificity and sensitivity, particularly in patients with a high pre-test probability of OSA. In terms of diagnosis, advancements in wearable technology and mobile health applications have enabled continuous monitoring of sleep patterns and respiratory parameters. These tools provide valuable data that can be used to identify OSA more accurately and promptly. Additionally, machine learning algorithms are being integrated into diagnostic processes to enhance the accuracy of OSA detection by analyzing large datasets and identifying patterns indicative of the condition. Management of OSA has also seen significant progress. Continuous positive airway pressure (CPAP) therapy remains the gold standard, but new developments include auto-adjusting CPAP devices that optimize pressure settings based on real-time feedback. Mandibular advancement devices and hypoglossal nerve stimulation are emerging as effective alternatives for patients who are CPAP-intolerant. Furthermore, lifestyle interventions such as weight management, positional therapy, and exercise have been shown to complement medical treatments, leading to better overall outcomes. This review article highlights these advancements that collectively contribute to improved patient adherence, reduced symptoms, and enhanced quality of life for individuals with OSA.
Review Article
Open Access November 16, 2024

Digital Therapeutics: A New Dimension to Diabetes Mellitus Management

Abstract
Digital therapeutics (DTx) play a transformative role in diabetes management by leveraging technology to provide personalized, data-driven medical interventions. These tools enhance self-management by offering continuous monitoring and real-time feedback on glucose levels, diet, and physical activity. This personalized approach helps patients adhere to treatment plans and make informed lifestyle changes, leading to improved clinical outcomes such as reduced HbA1c levels and better overall diabetes control. The importance of DTx lies in their ability to make diabetes care more accessible and convenient. Mobile apps and telemedicine platforms enable patients to receive support and guidance from anywhere, reducing the need for frequent in-person visits. Additionally, DTx often include behavioral support features like reminders, educational content, and motivational tools, which are crucial for maintaining healthy habits and managing stress. Currently, the dynamics of DTx in diabetes are rapidly evolving, with increasing integration of artificial intelligence and machine learning to further personalize and optimize care. As the adoption of these technologies grows, they hold the potential to significantly improve patient outcomes and revolutionize diabetes management on a global scale. This article will focus on the benefits of novel digital therapeutics for prevention and management of type II diabetes that are currently available in the market.
Article
Open Access July 21, 2024

Securing Pharmaceutical Supply Chain to Combat Active Pharmaceutical Ingredient Counterfeiting

Abstract
Pharmaceutical product serialization aims to assign distinct serial numbers to items within a pharmaceutical supply chain. However, this process faces several security challenges: theft of valid serial numbers may occur, enabling the labelling of counterfeit products, so it is essential that the uniqueness of a serial number can be verified at any point in the product's lifecycle within the supply chain; malicious nodes along the distribution network could corrupt planned changes of custody for products, so verifiability of compliance with these changes is crucial; and manufacturers and consumers need assurance that perishable goods past their shelf lives are appropriately discarded. In this paper, we review a product serialization method leveraging blockchain technology to address these security concerns within a multi-party perishable-goods supply chain. Blockchains offer potential solutions by providing a secure platform for data sharing in multi-party environments, enhancing security and transparency. With blockchain technology, each distribution partner is registered, upholding transparency regarding drug information. The system facilitates real-time transfer of ownership changes, recording them as blocks with date and time stamps. This ensures visibility to all partners in real time, maintaining the authenticity of drugs. This article aims to outline how blockchain technology benefits the pharmaceutical industry by enhancing the traceability and trackability of drugs throughout the entire pharmaceutical supply chain.
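The hash-linked record of custody changes described above can be illustrated with a toy chain: each ownership transfer becomes a block whose hash covers the previous block, so tampering with any earlier transfer invalidates everything after it. The field names and use of SHA-256 are illustrative assumptions, not the reviewed system's actual design.

```python
import hashlib
import json

# Toy hash chain of custody records for one serialized drug package.
def make_block(prev_hash, serial, holder, ts):
    record = {"prev": prev_hash, "serial": serial, "holder": holder, "ts": ts}
    digest = hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()
    return {**record, "hash": digest}

def verify(chain):
    # every block must hash correctly and point at its predecessor's hash
    for prev, block in zip(chain, chain[1:]):
        body = {k: v for k, v in block.items() if k != "hash"}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev["hash"] or block["hash"] != recomputed:
            return False
    return True

genesis = make_block("0" * 64, "SN-0001", "manufacturer", 0)
chain = [genesis, make_block(genesis["hash"], "SN-0001", "distributor", 1)]
chain.append(make_block(chain[-1]["hash"], "SN-0001", "pharmacy", 2))
```

A real blockchain adds distributed consensus on top of this linkage, which is what prevents a single malicious partner from silently rewriting the custody history.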
Review Article
Open Access June 28, 2024

Nigeria Exchange Rate Volatility: A Comparative Study of Recurrent Neural Network LSTM and Exponential Generalized Autoregressive Conditional Heteroskedasticity Models

Abstract Business merchants and investors in Nigeria are interested in the accuracy of foreign exchange volatility forecasts because they need information on how volatile the exchange rate will be in the future. In this paper, we compared Exponential Generalized Autoregressive Conditional Heteroskedasticity of order p = 1 and q = 1 (EGARCH(1,1)) and a Recurrent Neural Network (RNN) based on long [...] Read more.
Business merchants and investors in Nigeria are interested in the accuracy of foreign exchange volatility forecasts because they need information on how volatile the exchange rate will be in the future. In this paper, we compared Exponential Generalized Autoregressive Conditional Heteroskedasticity of order p = 1 and q = 1 (EGARCH(1,1)) and a Recurrent Neural Network (RNN) based on a long short-term memory (LSTM) model with a combination of p = 10 and q = 1 layers to model the volatility of Nigerian exchange rates. Our goal is to determine the preferred model for predicting the volatility of Nigeria's Naira exchange rate against the Euro, Pound Sterling and US Dollar. The dataset of monthly exchange rates of the Nigerian Naira to the US Dollar, Euro and Pound Sterling for the period December 2001 – August 2023 was extracted from the Central Bank of Nigeria Statistical Bulletin. Model efficiency and performance were measured with the Mean Squared Error (MSE) criterion. The results indicated that Nigeria's exchange rate volatility is asymmetric, and leverage effects are evident in the results of the EGARCH(1,1) model. It was also observed that there was a steady increase in the Naira exchange rate against the Euro, Pound Sterling and US Dollar from 2016 to its highest peak in 2023. The comparative analysis indicated that EGARCH(1,1) performed better than the LSTM model because it provided smaller MSE values of 224.7, 231.3 and 138.5 for the Euro, Pound Sterling and US Dollar respectively.
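The asymmetry and leverage effect the abstract reports can be illustrated with the EGARCH(1,1) log-variance recursion. The sketch below uses illustrative parameter values (omega, alpha, gamma, beta are assumptions, not the paper's fitted estimates) together with the MSE criterion used for model comparison:

```python
import math

def egarch_variance(returns, omega=-0.1, alpha=0.1, gamma=-0.08,
                    beta=0.95, init_var=1.0):
    """Filter an EGARCH(1,1) log-variance recursion over a return series.

    ln(sigma_t^2) = omega + alpha*(|z| - E|z|) + gamma*z
                    + beta*ln(sigma_{t-1}^2),  with z = r / sigma.
    Parameters here are illustrative, not fitted values from the paper.
    """
    e_abs_z = math.sqrt(2.0 / math.pi)        # E|z| for standard normal z
    log_var = math.log(init_var)
    variances = [init_var]
    for r in returns:
        z = r / math.sqrt(math.exp(log_var))  # standardised shock
        log_var = omega + alpha * (abs(z) - e_abs_z) + gamma * z + beta * log_var
        variances.append(math.exp(log_var))
    return variances

def mse(forecast, realized):
    """Mean squared error, the comparison criterion used in the paper."""
    return sum((f - a) ** 2 for f, a in zip(forecast, realized)) / len(forecast)
```

With a negative gamma, a negative return shock raises next-period variance more than a positive shock of the same size, which is exactly the asymmetric leverage effect the EGARCH(1,1) results reveal.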
Article
Open Access May 14, 2024

A Review of Reliability Techniques for the Evaluation of Programmable Logic Controllers

Abstract PLCs, or programmable logic controllers, are essential parts of contemporary industrial automation systems and are responsible for managing and monitoring a variety of operations. PLC reliability is critical to maintaining industrial systems' continuous and secure operation. A wide range of reliability strategies has been used to improve the reliability of Programmable Logic Controllers, and [...] Read more.
PLCs, or programmable logic controllers, are essential parts of contemporary industrial automation systems and are responsible for managing and monitoring a variety of operations. PLC reliability is critical to maintaining industrial systems' continuous and secure operation. A wide range of reliability strategies has been used to improve the reliability of Programmable Logic Controllers, and this article methodically examines them. After a thorough review of the literature, the evaluation classified PLC reliability techniques into Root Cause Analysis (RCA), Reliability Centered Maintenance (RCM), Hazard Analysis (HA), Reliability Block Diagram (RBD), Fault Tree Analysis (FTA), Physics of Failure (PoF) and FMEA/FMECA. The proportion of reviewed papers using each technique showed that RCA, which makes up 20% of the publications reviewed, has been used the most to increase the reliability of PLCs, followed by HA, RCM, RBD, FTA, PoF and FMEA/FMECA, which account for 17%, 16%, 16%, 13%, 10% and 8% of the articles reviewed, respectively. The paper discusses new developments and trends in PLC reliability, such as the application of machine learning (ML) and artificial intelligence (AI) to fault detection and predictive maintenance.
Review Article
Open Access April 16, 2024

Revolutionizing Automotive Supply Chain: Enhancing Inventory Management with AI and Machine Learning

Abstract Consumer behavior is evolving, demanding a wide range of products with fast shipping and reliable service. The automotive aftermarket industry, worth billions, requires efficient distribution systems to stay competitive. Manufacturers strive to balance growth with product and service excellence. Distributors and retailers face the challenge of maintaining competitive pricing while keeping [...] Read more.
Consumer behavior is evolving, demanding a wide range of products with fast shipping and reliable service. The automotive aftermarket industry, worth billions, requires efficient distribution systems to stay competitive. Manufacturers strive to balance growth with product and service excellence. Distributors and retailers face the challenge of maintaining competitive pricing while keeping inventory levels low. An adequate supply chain and accurate product data are crucial for product availability and reducing stock issues. This ultimately increases profits and customer satisfaction.
Article
Open Access November 15, 2023

Predictive Failure Analytics in Critical Automotive Applications: Enhancing Reliability and Safety through Advanced AI Techniques

Abstract Failure prediction can be achieved through prognostics, which provides timely warnings before failure. Failure prediction is crucial in an effective prognostic system, allowing preventive maintenance actions to avoid downtime. The prognostics problem involves estimating the remaining useful life (RUL) of a system or component at any given time. The RUL is defined as the time from the current time [...] Read more.
Failure prediction can be achieved through prognostics, which provides timely warnings before failure. Failure prediction is crucial in an effective prognostic system, allowing preventive maintenance actions to avoid downtime. The prognostics problem involves estimating the remaining useful life (RUL) of a system or component at any given time. The RUL is defined as the time from the current time to the time of failure. The goal is to make accurate predictions close to the failure time to provide early warnings. J S Grewal and J. Grewal provide a comprehensive definition of RUL in their paper "The Kalman Filter approach to RUL estimation." A process is a quadruple (X, U, f, P), where X is the state space, U is the control space, P is the set of possible paths, and f represents the transition between states. The process involves applying control values to change the system's state over time.
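Under the simplest possible degradation assumption, the RUL definition above can be sketched directly. The linear degradation model and the numbers below are illustrative, not drawn from the cited paper:

```python
def remaining_useful_life(health, threshold, rate):
    """Estimate RUL as the time from now until the health indicator
    crosses the failure threshold, assuming linear degradation.

    health: current health-indicator value
    threshold: value at which the component is considered failed
    rate: degradation per unit time (assumed positive and constant)
    """
    if health <= threshold:
        return 0.0                       # already at or past failure
    return (health - threshold) / rate

# A component at 80% health failing at 20%, degrading 5% per cycle:
# RUL = (80 - 20) / 5 = 12 cycles remaining.
```

Real prognostic systems replace the constant rate with a filtered state estimate (e.g. a Kalman filter over the process (X, U, f, P)), but the quantity being estimated is the same time-to-threshold.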
Article
Open Access April 11, 2024

5V’s of Big Data Shifted to Suite the Context of Software Code: Big Code for Big Software Projects

Abstract Data is the collection of facts and observations about events; it is continuously growing, getting denser and more varied by the minute across different disciplines and fields. Hence, Big Data emerged and is evolving rapidly. The various types of data being processed are huge, but little thought has been given to where this data resides. We observe that much of it resides in software, and the [...] Read more.
Data is the collection of facts and observations about events; it is continuously growing, getting denser and more varied by the minute across different disciplines and fields. Hence, Big Data emerged and is evolving rapidly. The various types of data being processed are huge, but little thought has been given to where this data resides. We observe that much of it resides in software, and software codebases are growing as well: the size of modules, functionalities, classes, and so on. Since data is growing so rapidly, the codebases of software are also growing. Therefore, this paper discusses the 5V's of big data in the context of software code and how to optimize or manage big code. When we talk of "Big Code for Big Software," we are referring to the specific challenges and considerations involved in developing, managing, and maintaining code in large-scale software systems.
Article
Open Access March 06, 2024

Embedded Architecture of SAP S/4 HANA ERP Application

Abstract The SAP HANA application is designed to handle operational workloads that are consistent with transactions while also supporting intricate business analytics operations. Technically speaking, the SAP HANA database is made up of several data processing engines that work together with a distributed query processing environment to provide the entire range of data processing capabilities. This includes graph and [...] Read more.
The SAP HANA application is designed to handle operational workloads that are consistent with transactions while also supporting intricate business analytics operations. Technically speaking, the SAP HANA database is made up of several data processing engines that work together with a distributed query processing environment to provide the entire range of data processing capabilities. This includes graph and text processing for managing semi-structured and unstructured data within the same system, as well as classical relational data that supports both row- and column-oriented physical representations in a hybrid engine. The next-generation SAP Business Suite program designed specifically for the SAP HANA Platform is called SAP S/4HANA. The key features of SAP S/4HANA are an intuitive, contemporary user interface (SAP Fiori); planning and simulation options in many conventional transactions; simplification of business processes; significantly improved transaction efficiency; and faster analytics.
Review Article
Open Access February 19, 2024

The use of contemporary Enterprise Resource Planning (ERP) technologies for digital transformation

Abstract Our lives are becoming more and more digital, and this has an impact on how we work, study, communicate, and interact. Businesses are currently digitally altering their information systems, procedures, culture, and strategy. Existing businesses and economies are severely disrupted by the digital revolution. The Internet of Things, microservices, and mobile services are examples of IT systems with [...] Read more.
Our lives are becoming more and more digital, and this has an impact on how we work, study, communicate, and interact. Businesses are currently digitally altering their information systems, procedures, culture, and strategy. Existing businesses and economies are severely disrupted by the digital revolution. The Internet of Things, microservices, and mobile services are examples of IT systems with numerous, dispersed, and very small structures that are made possible by digitization. Utilizing the possibilities of cloud computing, mobile systems, big data and analytics, services computing, Internet of Things, collaborative networks, and decision support, numerous new business prospects have emerged throughout the years. The logical basis for robust and self-optimizing run-time environments for intelligent business services and adaptable distributed information systems with service-oriented enterprise architectures comes from biological metaphors of living, dynamic ecosystems. This has a significant effect on how digital services and products are designed from a value- and service-oriented perspective. The evolution of enterprise architectures and the shift from a closed-world modeling environment to a more flexible open-world composition establish the dynamic framework for highly distributed and adaptive systems, which are crucial for enabling the digital transformation. This study examines how enterprise architecture has changed over time, taking into account newly established, value-based relationships between digital business models, digital strategies, and enhanced enterprise architecture.
Review Article
Open Access February 18, 2024

An Appraisal of Challenges in Developing Information Literacy Skills in the Colleges of Education of Ghana

Abstract The purpose of this study was to examine the challenges faced by students of Colleges of Education (CoEs) in developing their Information Literacy skills. The study adopted the post-positivism paradigm. A descriptive survey research design was used in this study. The population for this study comprised all Level 200 students at Wiawso CoE, Enchi CoE, and Bia Lamplighter CoE in the Western North [...] Read more.
The purpose of this study was to examine the challenges faced by students of Colleges of Education (CoEs) in developing their Information Literacy skills. The study adopted the post-positivism paradigm. A descriptive survey research design was used in this study. The population for this study comprised all Level 200 students at Wiawso CoE, Enchi CoE, and Bia Lamplighter CoE in the Western North Region. Purposive, stratified, and convenience sampling techniques were used to select colleges of education and Level 200 students. The three (3) colleges of education were stratified and purposively selected, while 256 Level 200 students were stratified and conveniently sampled. The study employed questionnaires to collect data from the sampled students. Questionnaires (open and closed-ended questions) focused on the challenges faced by the students in developing their Information Literacy (IL) skills. The quantitative data was captured, analysed, and presented in descriptive statistics such as percentages and frequency tables to address the objective of the study. It is recommended that, to improve digital literacy and academic pursuits, the college management should improve access to desktop computers and the Internet in the library and computer centre. It is also recommended that management and librarians of the Colleges of Education ensure that students have access to these devices at the library and can use them to develop their IL skills and to help them manage their references more effectively.
Article
Open Access February 17, 2024

Universal Evaluation of SAP S/4 Hana ERP Cloud System

Abstract Regardless of their traditional ERP Systems, it is essential for every business to acquire a universal advantage in the contemporary international market. When everything is considered, end users in these kinds of businesses have to deal with poorly designed interfaces and unusable technologies. Despite the claims of significant benefits from using S4 Hana cloud ERP software, the possibility of [...] Read more.
Regardless of their traditional ERP systems, it is essential for every business to acquire a universal advantage in the contemporary international market. When everything is considered, end users in these kinds of businesses have to deal with poorly designed interfaces and unusable technologies. Despite the claims of significant benefits from using S4 Hana cloud ERP software, the possibility of achieving maximum productivity is not fully utilized. One of the causes of this is the underfunding of ergonomic measures and the newest technologies. Through the design of S4 Hana cloud ERP applications, we demonstrate why ergonomic research is important and highly recommended in order to minimize the financial and human costs that enterprises are currently facing.
Review Article
Open Access February 15, 2024

Stock Closing Price and Trend Prediction with LSTM-RNN

Abstract The stock market is very volatile and hard to predict accurately due to the uncertainties affecting stock prices. Accurate predictive models would help investors and stock traders make informed decisions about buying, holding, or investing in stocks. Financial institutions can also use such models to manage risk and optimize their customers' investment portfolios. In this paper, we use [...] Read more.
The stock market is very volatile and hard to predict accurately due to the uncertainties affecting stock prices. Accurate predictive models would help investors and stock traders make informed decisions about buying, holding, or investing in stocks. Financial institutions can also use such models to manage risk and optimize their customers' investment portfolios. In this paper, we use Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNNs) to predict the daily closing price of Amazon Inc. stock (ticker symbol: AMZN). We study the influence of various hyperparameters in the model to see which factors affect its predictive power. The root mean squared error (RMSE) on the training set was 2.51, with a mean absolute percentage error (MAPE) of 1.84%.
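The supervised framing behind such an LSTM forecaster — turning a price series into fixed-length lookback windows and next-day targets — can be sketched as follows, along with the MAPE metric the abstract reports. The windowing details are our assumption, not the paper's exact pipeline:

```python
import numpy as np

def make_windows(prices, lookback):
    """Turn a 1-D price series into (samples, lookback, 1) inputs and
    next-day targets, the standard supervised framing for an LSTM."""
    X, y = [], []
    for i in range(len(prices) - lookback):
        X.append(prices[i:i + lookback])   # past `lookback` closes
        y.append(prices[i + lookback])     # next day's close
    X = np.asarray(X, dtype=np.float32)[..., np.newaxis]
    return X, np.asarray(y, dtype=np.float32)

def mape(actual, predicted):
    """Mean absolute percentage error, one of the paper's metrics."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100)
```

Each `X[i]` feeds the LSTM as a sequence of shape `(lookback, 1)`, and hyperparameters such as the lookback length, layer width, and epoch count are the kinds of factors whose influence the study examines.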
Article
Open Access December 06, 2023

Success Factors of Adopting Cloud Enterprise Resource Planning

Abstract The technologies for cloud ERP (Enterprise Resource Planning) have revolutionized the field of information technologies. Any kind of business can benefit from their flexibility, affordability, scalability, adaptation, availability, and customizable data. An advancement of classic ERP, cloud enterprise resource planning (C-ERP) provides the benefits of cloud computing (CC), including resource [...] Read more.
The technologies for cloud ERP (Enterprise Resource Planning) have revolutionized the field of information technologies. Any kind of business can benefit from their flexibility, affordability, scalability, adaptation, availability, and customizable data. An advancement of classic ERP, cloud enterprise resource planning (C-ERP) provides the benefits of cloud computing (CC), including resource elasticity and ease of use. The rise of cloud computing affects on-premise ERP systems in terms of architecture and cost. Cloud-based ERP systems make the claim to be appropriate for digital corporate settings. System quality, security, vendor lock-in, and data accessibility are recognized as the technological issues. Industry 4.0 refers to the re-engineering and revitalization of modern factories through the integration of cloud-based operations, industrial internet connectivity, additive manufacturing, and cybersecurity platforms. One of the four main pillars of Industry 4.0, cloud-based Enterprise Resource Planning (Cloud ERP), is a component of cloud operations that aids in achieving greater standards of sustainable performance.
Review Article
Open Access December 03, 2023

Evolution of Enterprise Applications through Emerging Technologies

Abstract The extensive globalization of services and rapid technological advancements driven by IT have heightened the competitiveness of organizations in introducing innovative products and services. Among the noteworthy innovations is enterprise resource planning (ERP). An integral field in computer science, known as artificial intelligence (AI), is undergoing a transformative integration into various [...] Read more.
The extensive globalization of services and rapid technological advancements driven by IT have heightened the competitiveness of organizations in introducing innovative products and services. Among the noteworthy innovations is enterprise resource planning (ERP). An integral field in computer science, known as artificial intelligence (AI), is undergoing a transformative integration into various industries. Grasping the concept of artificial intelligence and its application in diverse business applications is crucial, given its broad and intricate nature. The primary focus of this paper is to delve into the realm of artificial intelligence and its utilization within enterprise resource planning. The study not only explores artificial intelligence but also delves into related concepts such as machine learning, deep learning, and neural networks in greater detail. Drawing upon existing literature, this research examines various books and online resources discussing the intersection of artificial intelligence and ERP. The findings reveal that the impact of AI is evident as businesses attain heightened levels of analytical efficiency across different ERP domains, thanks to remarkable advancements in AI, machine learning, and deep learning. Artificial intelligence is extensively employed in numerous ERP areas, with a particular emphasis on customer support, predictive analysis, operational planning, and sales projections.
Review Article
Open Access February 23, 2023

Substituting Intelligence

Abstract The development of ChatGPT is a topical subject of reflection. This short paper focuses on the (possible) use of ChatGPT in academia and some of its (possible) ramifications for users’ cognitive abilities and, dramatically put, their existence.
The development of ChatGPT is a topical subject of reflection. This short paper focuses on the (possible) use of ChatGPT in academia and some of its (possible) ramifications for users’ cognitive abilities and, dramatically put, their existence.
Communication
Open Access December 14, 2022

Applying Artificial Intelligence (AI) for Mitigation Climate Change Consequences of the Natural Disasters

Abstract Climate change and weather-related disasters have accelerated rapidly in recent decades, bringing insecurity, destruction of ecological systems, increasing poverty, human casualties, and economic losses everywhere on the planet. The innovative methods applied to mitigate the magnitude of natural disasters and to combat their negative impact effectively consist of [...] Read more.
Climate change and weather-related disasters have accelerated rapidly in recent decades, bringing insecurity, destruction of ecological systems, increasing poverty, human casualties, and economic losses everywhere on the planet. The innovative methods applied to mitigate the magnitude of natural disasters and to combat their negative impact effectively consist of constant remote and Earth monitoring, data collection, creation of models for big data extrapolation, prediction, timely warnings for prevention, and others. Artificial intelligence (AI) is used to deal with big data; to perform calculations, forecasts, and predictions of natural disasters in the near future; to establish possibilities of escaping hazards or risky situations; to prepare people for adverse changes; and to outline the different choices that assist in reaching the right decision. Many projects, programs, and frameworks are adopted and carried out by governments and business makers towards common goals and actions for the creation of a friendly environment and measures for reducing undesired climate alterations and cataclysms. The aim of this article is to review the latest programs and innovations applied in the mitigation of climate change using AI.
Brief Review
Open Access November 04, 2022

An Artificial Intelligence Approach to Manage Crop Water Requirements in South Africa

Abstract Estimation of crop water requirements is of paramount importance for the management of agricultural water resources, which is a major mitigating strategy against the effects of climate change on food security. South Africa's water shortage poses a threat to agricultural efficiency. Since irrigation uses about 60% of the available fresh water, it becomes important to optimise the use of [...] Read more.
Estimation of crop water requirements is of paramount importance for the management of agricultural water resources, which is a major mitigating strategy against the effects of climate change on food security. South Africa's water shortage poses a threat to agricultural efficiency. Since irrigation uses about 60% of the available fresh water, it becomes important to optimise the use of irrigation water in order to maximize crop yield at the farm level and avoid wastage. In this study, the combined application of an artificial neural network (ANN) and a crop-growth simulation model for the estimation of crop irrigation water requirements and the irrigation scheduling of potatoes at the Winterton irrigation scheme, South Africa, was investigated. The crop water demand from planting to harvest date, when to irrigate, the optimum stage in the drying cycle at which to apply water, and the amount of irrigation water to be applied per time were estimated in this study. Five feed-forward back-propagation artificial neural network predictive models were developed with varied numbers of neurons and hidden layers and evaluated. The optimal ANN model, which has 5 inputs, 5 neurons, 1 hidden layer and 1 output, was used to predict monthly reference evapotranspiration (ETo) in the Winterton area. The optimal ANN model produced a root-mean-square error (RMSE) of 0.67, a Pearson correlation coefficient (r) of 0.97 and a coefficient of determination (R2) of 0.94. Validation of the model between the measured and predicted ETo shows an r value of 0.9048. The predicted ETo was one of the input variables into a crop growth simulation model called CROPWAT. The results indicated that the total crop water requirement was 1259.2 mm/decade and the net irrigation water requirement was 1276.9 mm/decade, spread over a 5-day irrigation interval during the entire 140 days of the cropping season for potatoes.
The combination of artificial neural networks and crop growth simulation models has proved to be a robust technique for estimating crop irrigation water requirements in the face of limited or no daily meteorological datasets.
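As a rough illustration of the optimal topology described above (5 inputs, one hidden layer of 5 neurons, 1 output), the sketch below fits a small feed-forward network on synthetic data. The five input features and the target relationship are invented stand-ins, not the Winterton meteorological dataset:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in for five meteorological drivers of ETo
# (e.g. temperature, humidity, wind speed, sunshine, radiation).
X = rng.uniform(0.0, 1.0, size=(200, 5))
# Toy target: ETo rises with the first two drivers (illustrative only).
y = 3.0 * X[:, 0] + 2.0 * X[:, 1] + rng.normal(0.0, 0.05, size=200)

# Mirror the paper's optimal topology: 5 inputs, 1 hidden layer, 5 neurons.
model = MLPRegressor(hidden_layer_sizes=(5,), max_iter=2000, random_state=0)
model.fit(X, y)

# RMSE on the training data, the paper's headline error metric.
rmse = float(np.sqrt(np.mean((model.predict(X) - y) ** 2)))
```

In the actual study, the network's output (predicted ETo) then feeds CROPWAT to derive crop water demand and the irrigation schedule.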
Article
Open Access July 10, 2022

Digital Therapeutics in Oncology: A Better Outlook for Cancer Patients in the Future

Abstract Digital therapeutics (DTx) is an evidence-based treatment that makes use of high-quality software. As many healthcare systems confront increasing expectations for quality results, the need for digital medications is steadily growing in the clinical arena. To ensure that patients are supported during chemotherapy and that needless hospital visits are avoided, digital therapeutics must be integrated [...] Read more.
Digital therapeutics (DTx) is an evidence-based treatment that makes use of high-quality software. As many healthcare systems confront increasing expectations for quality results, the need for digital medications is steadily growing in the clinical arena. To ensure that patients are supported during chemotherapy and that needless hospital visits are avoided, digital therapeutics must be integrated into the cancer care pathway. Oncology patients are usually immunocompromised due to their disease and treatment, rendering them more susceptible to infection than the general population. As a result, visiting a hospital might endanger their health. In addition, when cancer patients and survivors return home after treatment, digital health interventions provide them with the tools they need to manage their illness and its side effects in the privacy of their own homes. Considering the increasing prevalence of cancer patients and the solutions that digital therapeutics has to offer in oncology, its future looks promising. This review article aims to summarize the existing companies in this domain while evaluating the prospects as well.
Review Article
Open Access April 28, 2022

Analysis of Network Modeling for Real-world Recommender Systems

Abstract Nowadays, recommendation systems exist everywhere on the internet; online, people are presented not just with actual physical products but also with many other things such as songs, places, books, friends, movies, and more. Most of these systems are built on basic collaborative and hybrid filtering, where users are [...] Read more.
Nowadays, recommendation systems exist everywhere on the internet; online, people are presented not just with actual physical products but also with many other things such as songs, places, books, friends, movies, and more. Most of these systems are built on basic collaborative and hybrid filtering, where users are recommended items based on the preferences of similar users, applying machine intelligence strategies. In this research, the importance of network modeling is analyzed in solving real-world problems.
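A minimal user-based collaborative-filtering sketch conveys the core idea: score a user's unseen items by the similarity-weighted ratings of other users. The tiny ratings matrix is invented for illustration and is not data from the study:

```python
import numpy as np

def recommend(ratings, user):
    """User-based collaborative filtering on a small user x item matrix.
    Scores unseen items by cosine-similarity-weighted ratings of others."""
    sims = []
    for other in range(ratings.shape[0]):
        if other == user:
            sims.append(0.0)
            continue
        a, b = ratings[user], ratings[other]
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        sims.append(float(a @ b / denom) if denom else 0.0)
    sims = np.asarray(sims)
    scores = sims @ ratings                  # similarity-weighted sum
    scores[ratings[user] > 0] = -np.inf      # mask items already rated
    return int(np.argmax(scores))            # best unseen item index

# Rows are users, columns are items (songs, books, movies, ...).
ratings = np.array([
    [5, 4, 0, 0],
    [5, 4, 3, 0],
    [1, 0, 0, 5],
], dtype=float)
```

Here user 0 most resembles user 1, so user 1's liked-but-unseen item is recommended; network modeling generalizes this pairwise view into a full user–item graph.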
Article
Open Access December 27, 2021

A Comparative Study for Recommended Triage Accuracy of AI Based Triage System MayaMD with Indian HCPs

Abstract Artificial intelligence (AI) based triage and diagnostic systems are increasingly being used in healthcare. Although these online tools can improve patient care, their reliability and accuracy remain variable. We hypothesized that an artificial intelligence (AI) powered triage and diagnostic system (MayaMD) would compare favorably with human doctors with respect to triage and diagnostic accuracy. [...] Read more.
Artificial intelligence (AI) based triage and diagnostic systems are increasingly being used in healthcare. Although these online tools can improve patient care, their reliability and accuracy remain variable. We hypothesized that an artificial intelligence (AI) powered triage and diagnostic system (MayaMD) would compare favorably with human doctors with respect to triage and diagnostic accuracy. We performed a prospective validation study of the accuracy and safety of an AI powered triage and diagnostic system. Identical cases were evaluated by an AI system and individual Indian healthcare practitioners (HCPs) to draw comparison for accuracy and safety. The same cases were validated with the help of consensus received from an expert panel of 3 doctors. These cases, in the form of clinical vignettes, were provided by an expert medical team. Overall, the study showed that the MayaMD AI based platform for virtual triage was able to recommend the most appropriate triage ensuring patient safety. In fact, the accuracy of triage recommendation by MayaMD was significantly better than that provided by individual HCPs (91.67% vs. 74%, p=0.04), with consensus being used as the standard.
Article
Open Access October 19, 2021

A Lightweight Wayfinding Assistance System for IoT Applications

Abstract In this paper, we propose the design of an indoor sign detection system for Industry 4.0. To implement the proposed system, we propose a lightweight deep learning architecture based on MobileNet which can run on embedded devices and is used to detect and recognize indoor landmark signs in order to assist blind and sighted people during indoor navigation. We apply various operations in order to [...] Read more.
In this paper, we propose the design of an indoor sign detection system for Industry 4.0. To implement the proposed system, we propose a lightweight deep learning architecture based on MobileNet which can run on embedded devices and is used to detect and recognize indoor landmark signs in order to assist blind and sighted people during indoor navigation. We apply various operations in order to minimize the network size as well as its computational complexity. The Internet of Things (IoT) provides a connection between the internet and surrounding objects. IoT connects physical objects with their digital identities and enables them to communicate with each other, creating a kind of bridge between the physical world and the virtual world. The paper provides a comprehensive overview of a new method for detecting a set of indoor landmark sign objects based on deep convolutional neural networks (DCNNs) for Internet of Things applications.
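The parameter savings behind a MobileNet-style lightweight architecture come from replacing standard convolutions with depthwise-separable ones; a quick count makes the point. The layer sizes below are illustrative, not the paper's exact network:

```python
def standard_conv_params(k, c_in, c_out):
    """Weights in a standard k x k convolution layer."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """Weights in a MobileNet-style depthwise (k x k per input channel)
    plus pointwise (1 x 1) convolution pair."""
    return k * k * c_in + c_in * c_out

# For a 3x3 layer mapping 256 -> 256 channels:
# standard: 589,824 weights; separable: 67,840 -- roughly 8.7x smaller,
# which is the kind of reduction that makes embedded deployment feasible.
```

Stacking such separable blocks is one of the "various operations" used to shrink both the network size and its computational cost for embedded devices.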
Article
Open Access October 17, 2021

Understanding Traffic Signs by an Intelligent Advanced Driving Assistance System for Smart Vehicles

Abstract Recent technologies have made life smarter. Vehicles are vital components of daily life that are getting smarter for a safer environment. Advanced Driving Assistance Systems (ADAS) are widely used in today's vehicles. They have been a revolutionary approach to making roads safer by assisting the driver in difficult situations such as collisions or respecting road rules. ADAS is composed of a [...] Read more.
Recent technologies have made life smarter. Vehicles are vital components of daily life that are getting smarter for a safer environment. Advanced Driving Assistance Systems (ADAS) are widely used in today's vehicles. They have been a revolutionary approach to making roads safer by assisting the driver in difficult situations such as imminent collisions, or by helping drivers respect road rules. ADAS comprises a large number of sensors and processing units that give the driver a complete overview of surrounding objects. In this paper, we introduce a road sign classifier for an ADAS to recognize and understand traffic signs. The classifier is based on deep learning, in particular Convolutional Neural Networks (CNN). The proposed approach consists of two stages. The first stage is a data preprocessing step that filters and enhances the quality of the input images to reduce processing time and improve recognition accuracy. The second stage is a CNN model with a skip connection that passes semantic features to the top of the network, allowing for better recognition of traffic signs. Experiments demonstrated the performance of the CNN model for traffic sign classification, with a correct recognition rate of 99.75% on the German Traffic Sign Recognition Benchmark (GTSRB) dataset.
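The skip connection mentioned above can be shown in a minimal sketch. The "conv" block here is a stand-in (a linear map plus ReLU, not the paper's architecture); the point is only the wiring that adds the block's input back to its output, so early features reach deeper layers unchanged:

```python
import numpy as np

# Minimal sketch of a skip (residual) connection: the mechanism that lets a
# classifier pass low-level semantic features to the top of the network.

def conv_block(x, weight):
    # Stand-in for a convolution + nonlinearity (here: linear map + ReLU).
    return np.maximum(0.0, x @ weight)

def residual_block(x, w1, w2):
    # Two transform stages; the input is added back ("skipped") to the
    # output, so early features survive to deeper layers.
    h = conv_block(x, w1)
    h = h @ w2
    return np.maximum(0.0, h + x)   # skip connection: add input to output

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 16))              # batch of 4 feature vectors
w1 = rng.normal(size=(16, 16)) * 0.1
w2 = rng.normal(size=(16, 16)) * 0.1
y = residual_block(x, w1, w2)
print(y.shape)  # (4, 16): shape is preserved so the addition is valid
```

Note that the transform must preserve the feature shape (or the input must be projected) for the addition to be defined; that constraint shapes where skip connections can be placed in a real CNN.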
Article
Open Access September 04, 2021

Active Fault Tolerant Control of Faulty Uncertain Neutral Time-Delay Systems

Abstract The present paper investigates the problem of Fault Tolerant Control for a class of uncertain neutral time-delay systems. First, we consider an additive control that adds a term to the nominal law when a fault occurs. This approach is designed in three steps. The first step is fault detection, while the second is fault estimation. For these two steps, [...] Read more.
The present paper investigates the problem of Fault Tolerant Control for a class of uncertain neutral time-delay systems. First, we consider an additive control that adds a term to the nominal law when a fault occurs. This approach is designed in three steps. The first step is fault detection, while the second is fault estimation. For these two steps, we use an adaptive observer to guarantee detection and estimation of the fault. The third step is fault compensation. Lyapunov methods and Linear Matrix Inequality (LMI) techniques are used to establish the main results. Second, we propose a Pseudo Inverse Method (PIM) and determine the error between the closed-loop and the nominal system. Finally, simulation results are presented to illustrate the theoretical development on an example of an uncertain neutral time-delay system.
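As a hedged sketch of the setting (a common formulation in the neutral time-delay literature, not necessarily the paper's exact notation), the uncertain plant and the additive fault tolerant control law can be written as:

```latex
% Uncertain neutral time-delay system with an additive actuator fault f(t):
\dot{x}(t) - C\,\dot{x}(t-\tau)
  = (A + \Delta A)\,x(t) + (A_d + \Delta A_d)\,x(t-\tau)
  + B\,\bigl(u(t) + f(t)\bigr)

% Additive control: nominal law plus a compensation term built from the
% adaptive observer's fault estimate \hat{f}(t):
u(t) = u_{\mathrm{nom}}(t) + u_{\mathrm{add}}(t),
\qquad u_{\mathrm{add}}(t) = -\hat{f}(t)
```

When the estimate converges, the compensation term cancels the fault's effect in the input channel, which is the intuition behind the three-step detect/estimate/compensate design.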
Article
Open Access July 23, 2021

Behavioral Economics and Energy Consumption: A Behavioral Data Analysis of the Role of Attitudes and Beliefs in Household Electricity Consumption in Iran

Abstract The average electricity consumption in Iranian households is higher than the world average. This can be due to price factors (such as cheap electricity in the country) and non-price factors (such as socio-demographic variables and psychological factors). In this study, non-price factors such as socio-demographic variables and psychological factors in the electricity consumption of urban households [...] Read more.
The average electricity consumption in Iranian households is higher than the world average. This can be due to price factors (such as cheap electricity in the country) and non-price factors (such as socio-demographic variables and psychological factors). In this study, non-price factors such as socio-demographic variables and psychological factors in the electricity consumption of urban households in Tehran were investigated, drawing on the theoretical foundations of behavioral economics and the theory of planned behavior. Information on household electricity consumption behavior was collected through a questionnaire and fieldwork from 2,560 Tehran households. Using econometric techniques, a linear regression was estimated in which the dependent variable was electricity consumption (over 45 days in winter 2019) and the independent variables included socio-demographic variables (age, sex, number of household members, income) and the variables of the theory of planned behavior (attitude, subjective norms, and perceived behavioral control). The results showed that income and the number of household members have a significant positive effect on electricity consumption, whereas gender has no significant effect. Of the psychological variables, only perceived behavioral control has a significant effect on electricity consumption. These results suggest that consumers do not hold a positive attitude towards saving, and that subjective and social norms neither encourage them to reduce electricity consumption nor help control it. Finally, the study results were analyzed in light of behavioral biases that may cause attitudes and beliefs not to lead to action.
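The regression recipe described above can be sketched with synthetic data. The coefficients below are invented for illustration and chosen so that, mirroring the study's findings, income and household size matter while gender does not:

```python
import numpy as np

# Sketch of an OLS regression of electricity consumption on
# socio-demographic and planned-behavior variables. Synthetic data only;
# the sample size matches the survey, the coefficients are made up.

rng = np.random.default_rng(42)
n = 2560
income  = rng.normal(5.0, 1.5, n)    # arbitrary income units
members = rng.integers(1, 7, n)      # household size
gender  = rng.integers(0, 2, n)      # 0/1 indicator
pbc     = rng.normal(0.0, 1.0, n)    # perceived behavioral control score

# Synthetic ground truth: income and members matter, gender does not.
consumption = (50 + 12.0 * income + 20.0 * members + 0.0 * gender
               - 8.0 * pbc + rng.normal(0, 5, n))

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), income, members, gender, pbc])
beta, *_ = np.linalg.lstsq(X, consumption, rcond=None)

for name, b in zip(["intercept", "income", "members", "gender", "pbc"], beta):
    print(f"{name:>9}: {b:+.2f}")
```

With enough observations, the estimated coefficients recover the generating values closely, and the near-zero gender coefficient illustrates what "no significant effect" looks like in the fitted model.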
Article
Open Access December 27, 2021

Leveraging AI and ML for Enhanced Efficiency and Innovation in Manufacturing: A Comparative Analysis

Abstract The manufacturing industry has embraced modern technologies such as big data, machine learning, and artificial intelligence. This paper examines AI and machine learning developments in the manufacturing industry, comparing current practices and data-driven projects. It aims to better understand these technologies and their potential benefits and challenges. The research identifies opportunities [...] Read more.
The manufacturing industry has embraced modern technologies such as big data, machine learning, and artificial intelligence. This paper examines AI and machine learning developments in the manufacturing industry, comparing current practices and data-driven projects. It aims to better understand these technologies and their potential benefits and challenges. The research identifies opportunities for innovative business solutions and explores industry practices and research results. The paper focuses on implementation rather than technical aspects, aiming to enhance knowledge in this area.
Review Article
Open Access August 20, 2022

Advancing Predictive Failure Analytics in Automotive Safety: AI-Driven Approaches for School Buses and Commercial Trucks

Abstract The recent evidence on AI in automotive safety shows the potential to reduce crashes and improve efficiency. Studies used AI techniques like machine learning and predictive analytics models to develop predictive collision avoidance systems. The studies collected data from various sources, such as traffic collision data and shapefiles. They utilized deep learning neural networks and 3D [...] Read more.
The recent evidence on AI in automotive safety shows the potential to reduce crashes and improve efficiency. Studies used AI techniques like machine learning and predictive analytics models to develop predictive collision avoidance systems. The studies collected data from various sources, such as traffic collision data and shapefiles. They utilized deep learning neural networks and 3D visualization techniques to analyze the data. However, more research is needed on AI in school bus and commercial truck safety. This paper explores the importance of AI-driven predictive failure analytics in enhancing automotive safety for these vehicles. It discusses challenges, required data, technologies involved in predictive failure analytics, and the potential benefits and implications for the future. The conclusion summarizes the findings and emphasizes the significance of AI in improving driver safety. Overall, this paper contributes to the field of automotive safety and aims to attract more research in this area.
Review Article
Open Access August 29, 2022

From Deterministic to Data-Driven: AI and Machine Learning for Next-Generation Production Line Optimization

Abstract The advancement of modern manufacturing is synonymous with the growth of automation. Automation replaces human operators, improves productivity and quality, and reduces costs. However, the initial financial cost and knowledge requirements can be barriers to embracing automation. Manufacturers are now seeking smart manufacturing, known as the fourth industrial revolution. Smart manufacturing goes [...] Read more.
The advancement of modern manufacturing is synonymous with the growth of automation. Automation replaces human operators, improves productivity and quality, and reduces costs. However, the initial financial cost and knowledge requirements can be barriers to embracing automation. Manufacturers are now seeking smart manufacturing, known as the fourth industrial revolution. Smart manufacturing goes beyond automation and utilizes IoT, AI, and big data for optimized production. In a smart factory, production can be linked and controlled innovatively, leading to increased performance, agility, and reduced costs.
Review Article
Open Access December 27, 2020

Exploring AI Algorithms for Cancer Classification and Prediction Using Electronic Health Records

Abstract Uncontrolled cell division leads to cancer, an incurable condition. Early diagnosis has the potential to lower death rates from breast cancer, the most frequent cancer in women worldwide. Breast imaging studies may help doctors find and diagnose the disease. This study explores the effectiveness of DL and ML models in the classification of mammography images for breast cancer [...] Read more.
Uncontrolled cell division leads to cancer, an incurable condition. Early diagnosis has the potential to lower death rates from breast cancer, the most frequent cancer in women worldwide. Breast imaging studies may help doctors find and diagnose the disease. This study explores the effectiveness of DL and ML models in the classification of mammography images for breast cancer detection, utilizing the publicly available CBIS-DDSM dataset, which comprises 5,000 images evenly divided between benign and malignant cases. To improve diagnostic accuracy, models such as Gaussian Naïve Bayes (GNB), CNNs, KNN, and MobileNetV2 were assessed using performance measures including F1-score, recall, accuracy, and precision. The methodology involved data preprocessing techniques, including transfer learning and feature extraction, followed by data splitting for robust model training and evaluation. Findings indicate that MobileNetV2 achieved the highest accuracy (99.4%), significantly outperforming GNB (87.2%), CNN (96.7%), and KNN (91.2%). The investigation, which also used confusion matrices and ROC curves to evaluate model performance, demonstrated the outstanding capacity of MobileNetV2 to distinguish between benign and malignant cases.
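The transfer-learning setup described (a frozen pretrained base plus a small trainable classification head) can be sketched without the actual MobileNetV2 weights or the CBIS-DDSM images. Here a fixed random projection stands in for the pretrained base, and two Gaussian clouds stand in for the two classes:

```python
import numpy as np

# Transfer-learning sketch: a frozen feature extractor standing in for a
# pretrained MobileNetV2 base, plus a trainable classification head. Both
# the extractor and the "benign vs. malignant" data are synthetic
# stand-ins; only the frozen-base + trainable-head recipe is illustrated.

rng = np.random.default_rng(0)

def frozen_extractor(inputs, W):
    # Placeholder for the pretrained base: a fixed projection + ReLU.
    # In a real pipeline this would be MobileNetV2 with trainable=False.
    return np.maximum(0.0, inputs @ W)

n, d, feat = 400, 64, 32
W = rng.normal(size=(d, feat)) / np.sqrt(d)          # frozen weights
X = np.concatenate([rng.normal(-0.5, 1.0, (n, d)),   # class 0 cloud
                    rng.normal(+0.5, 1.0, (n, d))])  # class 1 cloud
y = np.concatenate([np.zeros(n), np.ones(n)])

F = frozen_extractor(X, W)      # features: never updated during training

# Trainable head: logistic regression fit by plain gradient descent.
w, b = np.zeros(feat), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(F @ w + b)))   # sigmoid predictions
    g = p - y                                # gradient of the log-loss
    w -= 0.3 * F.T @ g / len(y)
    b -= 0.3 * g.mean()

acc = ((p > 0.5) == y).mean()
print(f"toy training accuracy: {acc:.2f}")
```

Freezing the base keeps the number of trainable parameters tiny, which is why transfer learning works with datasets far smaller than those used for pretraining.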
Review Article
Open Access December 27, 2020

An Effective E-Commerce Sales Prediction and Management System Based on Machine Learning Methods

Abstract Due to the influence of the Internet, the e-commerce sector has developed rapidly. Most online retailing businesses are seeking ways to predict demand for their products. Sales forecasting may help retailers develop a sales strategy that will enhance sales and attract more money and investment. The current research work puts forward a machine learning framework to forecast e-commerce [...] Read more.
Due to the influence of the Internet, the e-commerce sector has developed rapidly. Most online retailing businesses are seeking ways to predict demand for their products. Sales forecasting may help retailers develop a sales strategy that will enhance sales and attract more money and investment. The current research work puts forward a machine learning framework to forecast e-commerce sales for strategic management using a dataset of e-commerce transactions. With 70 percent of the data used for training and 30 percent for testing, three models were produced: Random Forest, Decision Tree, and XGBoost. To evaluate the models, performance measures including R-squared (R²) and Root Mean Squared Error (RMSE) were employed. The XGBoost model was the most accurate in predicting e-commerce sales, with an R² score of 96.3%. This demonstrates the increased capability of the XGBoost algorithm to forecast monthly e-commerce sales more accurately than the other models, and it can assist decision makers in managing inventory and making quick, informed decisions in this rapidly growing e-commerce market. The findings reiterate the importance of using advanced analytics to drive effectiveness and customer experience within the e-commerce sector.
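The evaluation recipe (70/30 split, R² and RMSE) can be sketched in a few lines. XGBoost itself requires the external xgboost package, so a simple linear trend model stands in for it here, and the monthly sales series is synthetic:

```python
import numpy as np

# Sketch of the 70/30 train/test evaluation with R-squared and RMSE.
# Synthetic monthly sales; a linear fit stands in for the boosted model.

rng = np.random.default_rng(7)
months = np.arange(120, dtype=float)
sales = 200 + 3.5 * months + rng.normal(0, 10, 120)   # trend + noise

# 70/30 split over shuffled indices, as in a typical ML workflow.
idx = rng.permutation(120)
train, test = idx[:84], idx[84:]

# Fit on the training portion only, then predict the held-out months.
coef = np.polyfit(months[train], sales[train], 1)
pred = np.polyval(coef, months[test])

resid = sales[test] - pred
rmse = np.sqrt(np.mean(resid ** 2))                   # error in sales units
ss_res = np.sum(resid ** 2)
ss_tot = np.sum((sales[test] - sales[test].mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot                            # variance explained
print(f"RMSE: {rmse:.2f}   R²: {r2:.3f}")
```

Keeping the test set untouched until evaluation is what makes the reported R² a fair estimate of out-of-sample forecasting skill; swapping the linear fit for a Random Forest, Decision Tree, or XGBoost regressor would leave the split and metrics unchanged.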
Review Article
Open Access October 15, 2022

Big Data and AI/ML in Threat Detection: A New Era of Cybersecurity

Abstract The unrelenting proliferation of data, entwined with the prevalence of mobile devices, has given birth to an unprecedented growth of information obscured by noise. With the Internet of Things and myriad endpoint devices generating vast volumes of sensitive and critical data, organizations are tasked with extracting actionable intelligence from this deluge. Governments and enterprises alike, even [...] Read more.
The unrelenting proliferation of data, entwined with the prevalence of mobile devices, has given birth to an unprecedented growth of information obscured by noise. With the Internet of Things and myriad endpoint devices generating vast volumes of sensitive and critical data, organizations are tasked with extracting actionable intelligence from this deluge. Governments and enterprises alike, even under pressure from regulatory boards, have strived to harness the power of data and leverage it to enhance safety and security, maximize performance, and mitigate risks. However, the adversaries themselves have capitalized on the unequal battle of big data and artificial intelligence to inflict widespread chaos. Therefore, the demand for big data analytics and AI/ML for high-fidelity intelligence, surveillance, and reconnaissance is at its highest. Today, in the cybersecurity realm, the detection of adverse incidents poses substantial challenges due to the sheer variety, volume, and velocity of deep packet inspection data. State-of-the-art detection techniques have fallen short of detecting the latest attacks after a big data breach incident. On the other hand, computational intelligence techniques such as machine learning have reignited the search for solutions for diverse monitoring problems. Recent advancements in AI/ML frameworks have the potential to analyze IoT/edge-generated big data in near real-time and assist risk assessment and mitigation through automated threat detection and modeling in the big data and AI/ML domain. Industry best practices and case studies are examined that endeavor to showcase how big data coupled with AI/ML unlocks new dimensions and capabilities in improved vigilance and monitoring, prediction of adverse incidents, intelligent modeling, and future uncertainty quantification by data resampling correction. 
All of these avenues lead to enhanced robustness, security, safety, and performance of industrial processes, computing, and infrastructures. A view of the future is also presented, covering potential threats arising from the misuse of new technologies across bandwidth, IoT/edge, blockchain, AI, quantum, and autonomous fields. Cybersecurity is again playing out at a pace set by adversaries, who face low entry barriers and wield debilitating tools. The need for innovative solutions to defend against the emerging threat landscape, harnessing the power of new technologies and collaboration, is emphasized.
Article
Open Access December 27, 2021

Leveraging AI in Urban Traffic Management: Addressing Congestion and Traffic Flow with Intelligent Systems

Abstract Traffic congestion across the globe is a multimodal problem, intertwining vehicular, pedestrian, and bicycle traffic. The relationship between the multimodal traffic flow is a key factor in understanding urban traffic dynamics. The impact of excessive congestion extends to the excessive cost spent on traffic maintenance, as well as the inherent transportation inefficiency and delayed travel times. [...] Read more.
Traffic congestion across the globe is a multimodal problem, intertwining vehicular, pedestrian, and bicycle traffic. The relationship between the multimodal traffic flows is a key factor in understanding urban traffic dynamics. The impact of excessive congestion extends to the excessive cost of traffic maintenance, as well as the inherent transportation inefficiency and delayed travel times. From an urban transportation standpoint, the immediate considerations are, on one hand, monitoring traffic conditions and demand cycles and, on the other, inducing flow modifications that benefit the traffic network and mitigate congestion. The embedded and centralized control systems that characterize modern traffic management extract traffic conditions specific to their regions but lack communication between networks. Moreover, innovative methods are required to provide more accurate, up-to-date traffic forecasts that characterize real-world traffic dynamics and facilitate optimal traffic management decisions. In this chapter, we briefly outline the main difficulties and complexities in modeling, managing, and forecasting traffic dynamics. We also compare various conventional and modern Intelligent Transportation Strategies in terms of accuracy and applicability, their performance, and potential opportunities for optimization of multimodal traffic flow and congestion reduction. This chapter introduces various proposed data-driven models and tools employed for traffic flow prediction and management, investigating specific strategies' strengths, weaknesses, and benefits in addressing various real-world traffic management problems. We describe how the design phase of dependable Intelligent Transportation Systems bears unique requirements in terms of the robustness, safety, and response times of their components and the encompassing system model.
Furthermore, this architectural blueprint shares similarities with distributed coordinated search and collective adaptive systems. Models that are independent of town size yield systemic performance improvements through reconfigurable embedded functionality. These AI techniques feature anytime planners that maintain near-optimal performance as model complexity varies. Sustainable models minimize congestion during peak periods, flooding, and emergencies while adhering to area-specific regulations. Security-aware and fail-safe traffic management systems must provide reasonable assurances of persistent operation under varied environmental conditions, accommodating metropolitan scale and complex traffic junctions. The chapter concludes by outlining challenges, research questions, and future research paths in the field of transportation management.
Review Article
Open Access December 27, 2021

Sustainability in Construction: Exploring the Development of Eco-Friendly Equipment

Abstract The equipment used in the construction industry is usually associated with a high impact on the environment. Although sustainable design has proven to be a main player among the initiatives focused on reducing environmental impact, it has been driven by the workers and processes, leaving equipment-related efforts to later, more restrictive stages. The equipment industry has been a constant target [...] Read more.
The equipment used in the construction industry is usually associated with a high impact on the environment. Although sustainable design has proven to be a main player among the initiatives focused on reducing environmental impact, it has been driven by the workers and processes, leaving equipment-related efforts to later, more restrictive stages. The equipment industry has been a constant target of environmental standards and economic pressure, but increasing technological development allows it to respond to sustainability and safety expectations while enhancing its performance. However, several limitations still make this sector one of the last to reach advanced levels of development. A study identified gaps in equipment design that require greater effort to effectively support workers and companies in the move towards sustainable construction. This chapter is based on a study aiming to understand the consolidated knowledge of technologically sustainable equipment design and to identify the challenges remaining for its full development. The findings support the development of innovative eco-friendly equipment, taking into consideration sustainable materials and product guidelines, as well as green economy initiatives. They also support complex-system approaches and safety-by-design specificities to establish a corporate knowledge of sustainable equipment and align it with the new regulations of the construction industry. The chapter introduces the context of construction equipment in terms of the new challenges it faces in providing construction work with a greater capacity for safety, from an environmental and energy-efficiency perspective, and within the paradigm of sustainability. It then presents the concept of sustainable equipment and its principles, followed by a characterization of the agents involved in its life cycle.
Review Article
Open Access December 27, 2021

Financial Implications of Predictive Analytics in Vehicle Manufacturing: Insights for Budget Optimization and Resource Allocation

Abstract Factory owners and vehicle manufacturers increasingly opt for predictive analytics to inform their decisions. While predictive analytics have been proven to provide insights into the initiation of maintenance measures before a machine actually fails, the right models and features could have a significant impact on the budget spent and resources allocated. This means that financially oriented [...] Read more.
Factory owners and vehicle manufacturers increasingly opt for predictive analytics to inform their decisions. While predictive analytics have been proven to provide insights that allow maintenance measures to be initiated before a machine actually fails, the choice of models and features can have a significant impact on the budget spent and the resources allocated. This means that financially oriented questions need to at least partially guide decisions in the planning phase of data science projects. Data-driven approaches will play an increasingly important role, yet only a few firms have reported using logistic regression models for predictive maintenance. Moreover, to the best of our knowledge, data-driven classification models connecting vehicle component failures to the occurrence of delays at the assembly line have not been published. This paper applies a real-world, data-driven approach using classification models in predictive analytics for vehicle manufacturers, thereby linking the financial implications of such data science projects to their results. We expand the existing literature on predictive maintenance, drawing on a unique dataset from a newly launched series of vehicles, presented as-is. Our research context is of interest to researchers and practitioners in the automotive industry who manage and plan final vehicle assembly under just-in-time principles, factoring in the consequences of component failures on the assembly process. Key findings highlight that while minor tweaking of the models is possible, their potential input into decision-making processes for budget optimization is limited.
Review Article
Open Access October 29, 2022

Neural Networks for Enhancing Rail Safety and Security: Real-Time Monitoring and Incident Prediction

Abstract The growth in demand for rail transportation systems within cities, together with high-speed and long-distance transportation running on a rail network, raises the issues of both rail safety and security. If an accident or an attack occurs, its consequences can be extremely severe. To mitigate the impact of these events, the real-time monitoring of a rail system is required. In that case, the [...] Read more.
The growth in demand for rail transportation systems within cities, together with high-speed and long-distance transportation running on a rail network, raises the issues of both rail safety and security. If an accident or an attack occurs, its consequences can be extremely severe. To mitigate the impact of these events, real-time monitoring of the rail system is required, and improvements in monitoring can be achieved using artificial intelligence algorithms such as neural networks. Neural networks have been used for real-time incident identification when monitoring track quality, for example by classifying the graphical outputs of an ultrasonic system inspecting the rails and track bed; to predict incidents on the rail infrastructure caused by blocked transmission channels; and to schedule preemptive and preventative maintenance. For forecasting incidents and accidents on board trains, neural networks have been used to model passenger behavior and optimize responses during a train station evacuation. To tackle incidents and accidents occurring in rail transport, we contribute two methodologies for detecting anomalies in real time and identifying the level of security risk: one at the maintenance level, with personnel operating along the railways, and one on board passenger trains. These methodologies were evaluated on real-world datasets and shown to achieve high accuracy. The results from these case studies also reveal the potential for network-wide applications, which could enhance security and safety on railway networks by enabling better management of network disruptions and more rapid identification of security issues. The speed and coverage of the information generated through these methodologies have implications for using prediction in decision support and for enhancing safety and security across the rail network.
Review Article
Open Access November 16, 2023

Innovations in Agricultural Machinery: Assessing the Impact of Advanced Technologies on Farm Efficiency

Abstract Progress in the development and adoption of technological innovations is instrumental in enhancing the efficiency of production systems across the globe. Through the introduction of cost-efficient and high-performing technologies, countries can both reduce the resource use intensity of their economies and boost the global supply of essential products. The focus of this study is to analyze the [...] Read more.
Progress in the development and adoption of technological innovations is instrumental in enhancing the efficiency of production systems across the globe. Through the introduction of cost-efficient and high-performing technologies, countries can both reduce the resource use intensity of their economies and boost the global supply of essential products. The focus of this study is to analyze the application of advanced machinery and mechanisms within the agricultural sector, a primary industry that acts as a major contributor to the gross domestic product (GDP) of many nations. Specifically, this paper provides an in-depth review of the latest impact assessments based on analytical and modeling tools conducted on agricultural machinery and production technologies. Our findings highlight the positive role played by scientific progress and innovation in driving the competitiveness, growth and improved sustainability of the agricultural sector. Over the years, advanced technologies have accelerated the development and modernization of machinery, equipment, and processes in farming. Typically, modern machinery and equipment have enabled large-scale production on farms, enhancing the cost-efficient use of both land and labor, as well as the capacity and timeliness in performing essential agricultural operations. The rapid diffusion of technical advancements has further contributed to resource savings, productivity growth, and the overall transformation of agricultural value chains. Accordingly, the implementation of appropriate enabling conditions is of vital importance in encouraging the widespread integration of technologies in agriculture, not only boosting productivity along the agri-food chain but also yielding widespread social, economic, and environmental benefits.
Review Article
Open Access October 30, 2022

Towards Autonomous Analytics: The Evolution of Self-Service BI Platforms with Machine Learning Integration

Abstract Self-service business intelligence (BI) platforms have become essential applications for exploring, analyzing, and visualizing business data in various domains. Here, we envisage that the business intelligence platform will perform automatic and autonomous data analytics with minimal to no user interaction. We aim to offer a data-driven, intelligent, and scalable infrastructure that amplifies the [...] Read more.
Self-service business intelligence (BI) platforms have become essential applications for exploring, analyzing, and visualizing business data in various domains. Here, we envisage that the business intelligence platform will perform automatic and autonomous data analytics with minimal to no user interaction. We aim to offer a data-driven, intelligent, and scalable infrastructure that amplifies the advantages of BI systems and discovers hidden and complex insights from very large business datasets, which a business analyst can miss during manual exploratory data analysis. Towards our future vision of autonomous analytics, we propose a collective machine learning model repository with an integration layer for user-defined analytical goals within the BI platform. The proposed architecture can effectively reduce the cognitive load on users for repetitive tasks, democratizing data science expertise across data workers and facilitating a less experienced user community to develop and use advanced machine learning and statistical algorithms.
Review Article
Open Access November 05, 2022

Application of Neural Networks in Optimizing Health Outcomes in Medicare Advantage and Supplement Plans

Abstract The growing complexity and variability in healthcare delivery and costs within Medicare Advantage (MA) and Medicare Supplement (Medigap) plans present significant challenges for improving health outcomes and managing expenditures. Neural networks, a subset of artificial intelligence (AI), have shown considerable promise in optimizing healthcare processes, particularly in predictive modeling, [...] Read more.
The growing complexity and variability in healthcare delivery and costs within Medicare Advantage (MA) and Medicare Supplement (Medigap) plans present significant challenges for improving health outcomes and managing expenditures. Neural networks, a subset of artificial intelligence (AI), have shown considerable promise in optimizing healthcare processes, particularly in predictive modeling, personalized treatment recommendations, and risk stratification. This paper explores the application of neural networks in enhancing health outcomes within the context of Medicare Advantage and Supplement plans. We review how deep learning models can be leveraged to predict patient risk, optimize resource allocation, and identify at-risk populations for preventive interventions. Additionally, we discuss the potential for neural networks to improve claims processing, reduce fraud, and streamline administrative burdens. By integrating various data sources, including medical records, claims data, and demographic information, neural networks enable more accurate and efficient decision-making processes. Ultimately, this approach can lead to better patient care, reduced healthcare costs, and improved satisfaction for beneficiaries of these programs. The paper concludes by highlighting the current limitations, ethical considerations, and future directions for AI adoption in the Medicare Advantage and Supplement sectors.
Review Article

Query parameters

Keyword:  Artificial Intelligence
