Open Access February 06, 2026

Predictive Modeling of Public Sentiment Using Social Media Data and Natural Language Processing Techniques

Abstract
Social media platforms like X (formerly Twitter) generate vast volumes of user-generated content that provide real-time insights into public sentiment. Despite the widespread use of traditional machine learning methods, their limitations in capturing contextual nuances in noisy social media text remain a challenge. This study leverages the Sentiment140 dataset, comprising 1.6 million labeled tweets, and develops predictive models for binary sentiment classification using Naive Bayes, Logistic Regression, and the transformer-based BERT model. Experiments were conducted on a balanced subset of 12,000 tweets after comprehensive NLP preprocessing. Evaluation using accuracy, F1-score, and confusion matrices revealed that BERT significantly outperforms traditional models, achieving an accuracy of 89.5% and an F1-score of 0.89 by effectively modeling contextual and semantic nuances. In contrast, Naive Bayes and Logistic Regression demonstrated reasonable but consistently lower performance. To support practical deployment, we introduce SentiFeel, an interactive tool enabling real-time sentiment analysis. While resource constraints limited the dataset size and training epochs, future work will explore full corpus utilization and the inclusion of neutral sentiment classes. These findings underscore the potential of transformer models for enhanced public opinion monitoring, marketing analytics, and policy forecasting.
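As a concrete illustration of the Naive Bayes baseline the abstract compares against BERT, a minimal multinomial Naive Bayes classifier with Laplace smoothing can be sketched in a few lines of plain Python. The toy tweets and labels below are hypothetical stand-ins, not drawn from Sentiment140:

```python
import math
from collections import Counter

def train_nb(docs):
    """Train multinomial Naive Bayes from (tokens, label) pairs (1 = positive)."""
    word_counts = {0: Counter(), 1: Counter()}
    class_counts = Counter()
    vocab = set()
    for tokens, label in docs:
        class_counts[label] += 1
        word_counts[label].update(tokens)
        vocab.update(tokens)
    return word_counts, class_counts, vocab

def predict_nb(model, tokens):
    """Pick the class with the highest log-posterior under Laplace smoothing."""
    word_counts, class_counts, vocab = model
    total_docs = sum(class_counts.values())
    best, best_lp = None, float("-inf")
    for label in class_counts:
        lp = math.log(class_counts[label] / total_docs)          # log prior
        denom = sum(word_counts[label].values()) + len(vocab)    # smoothed denominator
        for tok in tokens:
            lp += math.log((word_counts[label][tok] + 1) / denom)
        if lp > best_lp:
            best, best_lp = label, lp
    return best

# Hypothetical toy training data (1 = positive, 0 = negative)
docs = [("i love this great movie".split(), 1),
        ("what a great day happy".split(), 1),
        ("i hate this terrible film".split(), 0),
        ("awful bad sad day".split(), 0)]
model = train_nb(docs)
print(predict_nb(model, "love this happy day".split()))   # -> 1
print(predict_nb(model, "terrible awful movie".split()))  # -> 0
```

Bag-of-words models like this ignore word order entirely, which is precisely the contextual information a transformer such as BERT recovers, and why the gap the abstract reports is plausible.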
Article
Open Access January 10, 2025

Artificial Immune Systems: A Bio-Inspired Paradigm for Computational Intelligence

Abstract
Artificial Immune Systems (AIS) are bio-inspired computational frameworks that emulate the adaptive mechanisms of the human immune system, such as self/non-self discrimination, clonal selection, and immune memory. These systems have demonstrated significant potential in addressing complex challenges across optimization, anomaly detection, and adaptive system control. This paper provides a comprehensive exploration of AIS applications in domains such as cybersecurity, resource allocation, and autonomous systems, highlighting the growing importance of hybrid AIS models. Recent advancements, including integrations with machine learning, quantum computing, and bioinformatics, are discussed as solutions to scalability, high-dimensional data processing, and efficiency challenges. Core algorithms, such as the Negative Selection Algorithm (NSA) and Clonal Selection Algorithm (CSA), are examined, along with limitations in interpretability and compatibility with emerging AI paradigms. The paper concludes by proposing future research directions, emphasizing scalable hybrid frameworks, quantum-inspired approaches, and real-time adaptive systems, underscoring AIS's transformative potential across diverse computational fields.
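The Negative Selection Algorithm (NSA) named in the abstract can be sketched in miniature: random candidate detectors are kept only if they do not match any "self" (normal) pattern, and the surviving detectors then flag non-self samples as anomalous. The binary strings, Hamming-distance matching rule, and radius below are illustrative choices, not a specific published parameterization:

```python
import random

def hamming(a, b):
    """Number of positions at which two equal-length binary tuples differ."""
    return sum(x != y for x, y in zip(a, b))

def generate_detectors(self_set, n_detectors, length, radius, seed=0):
    """Negative selection: discard any candidate within `radius` of a self string."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        cand = tuple(rng.randint(0, 1) for _ in range(length))
        if all(hamming(cand, s) > radius for s in self_set):
            detectors.append(cand)
    return detectors

def is_anomalous(sample, detectors, radius):
    """A sample is flagged non-self if any detector matches it."""
    return any(hamming(sample, d) <= radius for d in detectors)

# Two hypothetical "normal" 8-bit patterns
self_set = [(0, 0, 0, 0, 0, 0, 0, 0), (0, 0, 0, 0, 1, 1, 1, 1)]
dets = generate_detectors(self_set, 20, length=8, radius=2)
print(is_anomalous(self_set[0], dets, radius=2))  # self is never flagged -> False
```

By construction every detector sits more than `radius` away from every self string, so self patterns can never be flagged; coverage of the non-self space depends on how many detectors are generated.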
Article
Open Access February 17, 2024

An Overview of Short- and Long-Term Adverse Outcomes and Complications of Perinatal Depression on Mother and Offspring

Abstract
According to the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition (DSM-5), an antenatal or postpartum major depressive episode (MDE) is defined as daily sustained sad mood or loss of enjoyment or desire for a minimum of two weeks, plus four associated manifestations (only three if both major symptoms are present), beginning during pregnancy or during the first 4 weeks postpartum, respectively: 1) unintentional, notable weight loss or gain; 2) sleepiness or sleeplessness; 3) a sensation of tiredness; 4) feelings of guilt or worthlessness; 5) declined capacity to concentrate; 6) frequent suicidal thoughts; 7) psychomotor excitation or retardation. Perinatal depression carries serious adverse consequences for the mother's psychosocial life, for pregnancy and delivery outcomes, and for her relationships, particularly with the newborn, whose overall health is poorer; it negatively affects the offspring from intrauterine life, through a complicated delivery and an unstable childhood, into unhealthy adolescence and adulthood. These negative consequences demand close attention to prevention, screening, and prompt treatment of antenatal and postnatal depression to avert such disastrous effects.
Brief Review
Open Access June 28, 2025

Development of a Hemodialysis Data Collection and Clinical Information System and Establishment of an Intradialytic Blood Pressure/Pulse Rate Predictive Model

Abstract
This research is a cross-disciplinary collaboration in the field of Artificial Intelligence of Things (AIoT) within the medical informatics domain, involving a university team, a partnering corporation, and a hemodialysis clinic. The research has two objectives: (1) the development of an Internet of Things (IoT)-based information system customized for the hemodialysis machines at the clinic, including transmission bridges, a dedicated web/app for clinical personnel, and a backend server; the system has been deployed at the clinic and is now officially operational. (2) The research also utilized de-identified, anonymous data (collected by the operational system) to train, evaluate, and compare deep-learning-based intradialytic blood pressure (BP) / pulse rate (PR) predictive models, with subsequent suggestions provided. Both objectives were executed under the supervision of the Institutional Review Board (IRB) at Mackay Memorial Hospital in Taiwan. The system completed for objective one introduced three significant services to the clinic: automated hemodialysis data collection, digitized data storage, and an information-rich human-machine interface with graphical data displays. These replace traditional paper-based clinical administrative operations, thereby enhancing healthcare efficiency. The graphical data presented through web and app interfaces aids real-time, intuitive comprehension of patients' conditions during hemodialysis. Moreover, the data stored in the backend database is available for physicians to conduct relevant analyses, unearth insights into medical practices, and provide precise medical care for individual patients.
The training and evaluation of the predictive models for objective two, along with related comparisons, analyses, and recommendations, suggest that in situations with limited computational resources and data, an Artificial Neural Network (ANN) model with six hidden layers, SELU activation function, and a focus on artery-related features can be employed for hourly intradialytic BP/PR prediction tasks. It is believed that this contributes to the collaborating clinic and relevant research communities.
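To make the recommended model shape concrete, a forward pass through a small fully connected network with SELU activations can be sketched in plain Python. The layer sizes below are illustrative; the paper's actual model uses six hidden layers and artery-related input features:

```python
import math

# SELU constants (self-normalizing networks, Klambauer et al.)
ALPHA, SCALE = 1.6732632423543772, 1.0507009873554805

def selu(x):
    """SELU activation: scaled identity for x > 0, scaled exponential otherwise."""
    return SCALE * (x if x > 0 else ALPHA * (math.exp(x) - 1))

def dense(inputs, weights, biases, activation):
    """One fully connected layer: y_j = act(sum_i w_ji * x_i + b_j)."""
    return [activation(sum(w * x for w, x in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

def forward(x, layers):
    """Stack of dense layers; the final layer is linear (regression output)."""
    for i, (W, b) in enumerate(layers):
        act = selu if i < len(layers) - 1 else (lambda v: v)
        x = dense(x, W, b, act)
    return x

# Tiny hypothetical demo: 2 inputs -> 2 SELU hidden units -> 1 linear output
layers = [([[0.5, -0.5], [0.25, 0.25]], [0.0, 0.0]),
          ([[1.0, 1.0]], [0.1])]
print(forward([1.0, 1.0], layers))
```

SELU's appeal in this setting is that, with appropriate initialization, it tends to keep activations roughly zero-mean and unit-variance through deep stacks, which can help small tabular models like an hourly BP/PR predictor train stably.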
Article
Open Access March 22, 2025

Enhancing Scalability and Performance in Analytics Data Acquisition through Spark Parallelism

Abstract
Data acquisition serves as a critical component of modern data architecture, with REST API integration emerging as one of the most common approaches for sourcing external data. This study evaluates the efficiency of various methodologies for collecting data via REST APIs and benchmarks their performance. It explores how leveraging the Spark distributed computing platform can optimize large-scale REST API calls, enabling enhanced scalability and improved processing speeds to meet the demands of high-volume data workflows.
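The core idea, overlapping many independent REST calls, can be sketched with a thread pool; Spark achieves the analogous effect by mapping the fetch function over partitions of a page-number RDD (e.g. `sc.parallelize(pages).flatMap(fetch_page)`). The `fetch_page` stub below stands in for a real HTTP call so the sketch stays self-contained:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_page(page):
    """Stand-in for a REST call such as requests.get(f"{BASE_URL}?page={page}").
    Returns fake records so the sketch runs without a live endpoint."""
    return [{"page": page, "id": i} for i in range(3)]

def acquire(pages, max_workers=8):
    """Fetch all pages concurrently and flatten the per-page record batches.
    REST calls are I/O-bound, so threads overlap well; in Spark the same
    fan-out happens across executor partitions instead of local threads."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        results = pool.map(fetch_page, pages)  # preserves page order
    return [rec for batch in results for rec in batch]

records = acquire(range(4))
print(len(records))  # 4 pages x 3 records -> 12
```

The trade-off the study benchmarks is essentially this fan-out factor: more concurrent fetchers raise throughput until the API's rate limits or network saturation dominate.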
Review Article
Open Access February 26, 2025

Innovations and Challenges in Pharmaceutical Supply Chain, Serialization and Regulatory Landscape

Abstract
The pharmaceutical supply chain has become increasingly complex and vulnerable to various risks, including counterfeit drugs, diversion, and fraud. As these challenges threaten patient safety and the integrity of global healthcare systems, serialization has emerged as a pivotal innovation in pharmaceutical logistics and regulatory compliance. Serialization involves assigning unique identifiers to individual drug packages, enabling precise tracking and authentication at every stage of the supply chain. This process provides unprecedented transparency, enhances product security, and facilitates real-time monitoring of pharmaceutical products as they move from manufacturers to end consumers. Despite its potential to revolutionize pharmaceutical traceability, the integration of serialization technologies faces numerous obstacles. These include high implementation costs, regulatory inconsistencies across regions, and the technological challenges of managing vast amounts of data. Moreover, the complex, multi-tiered nature of the global supply chain introduces additional risks related to data integrity, cybersecurity, and interoperability between systems. As pharmaceutical companies seek to navigate these challenges, innovations in serialization technology—such as blockchain, artificial intelligence (AI), the Internet of Things (IoT), and radio frequency identification (RFID)—are providing promising solutions to enhance efficiency, reduce fraud, and increase visibility. This manuscript explores both the innovative advancements and the key challenges associated with the integration of serialization in the pharmaceutical supply chain. It delves into the evolving regulatory landscape, highlighting the need for global harmonization of serialization standards, and examines the impact of serialization on securing pharmaceutical distribution networks. 
Additionally, the paper emphasizes the importance of collaboration among manufacturers, technology providers, and regulatory bodies in overcoming implementation barriers and realizing the full potential of serialization. As the pharmaceutical industry moves towards a more interconnected and data-driven future, serialization promises to play a central role in shaping the next generation of drug safety and supply chain management. By addressing the hurdles to adoption and leveraging emerging technologies, the pharmaceutical sector can create a more secure, transparent, and efficient supply chain that better serves public health and fosters greater trust among consumers and healthcare professionals alike.
Review Article
Open Access February 09, 2025

The Future of Longevity Medicine from the Lens of Digital Therapeutics

Abstract
Digital therapeutics (DTx) are emerging as a pivotal tool in promoting longevity by addressing non-communicable diseases (NCDs) such as diabetes, cardiovascular diseases, and mental health disorders. These software-driven interventions offer personalized, evidence-based treatments that can be accessed via digital devices, making healthcare more accessible and scalable. One of the key advancements in DTx is the integration of artificial intelligence (AI) and machine learning (ML) to tailor interventions based on individual health data. This personalization enhances the effectiveness of treatments and supports preventive care by identifying risk factors early. The need for digital therapeutics is underscored by the rising prevalence of NCDs, which are responsible for a significant portion of global mortality and healthcare costs. Traditional healthcare systems often struggle to provide timely and personalized care, especially in low-resource settings. DTx can bridge this gap by offering cost-effective solutions that are easily scalable. Moreover, digital therapeutics can address health inequities by providing low-cost interventions to underserved populations, thereby reducing the burden of NCDs and improving overall health outcomes. As technology continues to evolve, the potential for DTx to enhance longevity and quality of life becomes increasingly promising. Recent advancements in longevity medicine and technology have focused on extending both lifespan and healthspan, ensuring that people not only live longer but also maintain good health throughout their extended years. This review article highlights these advancements that are contributing to this compelling subject of Longevity.
Review Article
Open Access January 22, 2025

Tech Transformations: Modern Solutions for Obstructive Sleep Apnea

Abstract
Recent advancements in the screening, diagnosis, and management of obstructive sleep apnea (OSA) have significantly improved patient outcomes. For screening, the use of home sleep apnea testing (HSAT) has become more prevalent, offering a convenient and cost-effective alternative to traditional in-lab polysomnography. HSAT devices have shown good specificity and sensitivity, particularly in patients with a high pre-test probability of OSA. In terms of diagnosis, advancements in wearable technology and mobile health applications have enabled continuous monitoring of sleep patterns and respiratory parameters. These tools provide valuable data that can be used to identify OSA more accurately and promptly. Additionally, machine learning algorithms are being integrated into diagnostic processes to enhance the accuracy of OSA detection by analyzing large datasets and identifying patterns indicative of the condition. Management of OSA has also seen significant progress. Continuous positive airway pressure (CPAP) therapy remains the gold standard, but new developments include auto-adjusting CPAP devices that optimize pressure settings based on real-time feedback. Mandibular advancement devices and hypoglossal nerve stimulation are emerging as effective alternatives for patients who are CPAP-intolerant. Furthermore, lifestyle interventions such as weight management, positional therapy, and exercise have been shown to complement medical treatments, leading to better overall outcomes. This review article highlights these advancements that collectively contribute to improved patient adherence, reduced symptoms, and enhanced quality of life for individuals with OSA.
Review Article
Open Access November 16, 2024

Digital Therapeutics: A New Dimension to Diabetes Mellitus Management

Abstract
Digital therapeutics (DTx) play a transformative role in diabetes management by leveraging technology to provide personalized, data-driven medical interventions. These tools enhance self-management by offering continuous monitoring and real-time feedback on glucose levels, diet, and physical activity. This personalized approach helps patients adhere to treatment plans and make informed lifestyle changes, leading to improved clinical outcomes such as reduced HbA1c levels and better overall diabetes control. The importance of DTx lies in their ability to make diabetes care more accessible and convenient. Mobile apps and telemedicine platforms enable patients to receive support and guidance from anywhere, reducing the need for frequent in-person visits. Additionally, DTx often include behavioral support features like reminders, educational content, and motivational tools, which are crucial for maintaining healthy habits and managing stress. Currently, the dynamics of DTx in diabetes are rapidly evolving, with increasing integration of artificial intelligence and machine learning to further personalize and optimize care. As the adoption of these technologies grows, they hold the potential to significantly improve patient outcomes and revolutionize diabetes management on a global scale. This article will focus on the benefits of novel digital therapeutics for prevention and management of type II diabetes that are currently available in the market.
Article
Open Access July 10, 2024

Achieving Maintainability, Readability & Understandability of Software Projects using Code Smell Prediction

Abstract
Maintenance of large-scale software is difficult due to the large size and high complexity of the code: 80% of software development effort is spent on maintenance, and 60% of that on trying to understand the code. The severity of code smells must be measured, along with the fairness of those measurements, because this helps developers, especially in large-scale source code projects. A code smell is not a bug in the system, as it does not prevent the program from functioning, but it may increase the risk of software failure or performance slowdown. Therefore, this paper seeks to help developers with early prediction of the severity of code smells and to test the level of fairness of those predictions, especially in large-scale source code projects.
Technical Note
Open Access June 28, 2024

Nigeria Exchange Rate Volatility: A Comparative Study of Recurrent Neural Network LSTM and Exponential Generalized Autoregressive Conditional Heteroskedasticity Models

Abstract
Business merchants and investors in Nigeria are interested in the forecasting accuracy of foreign exchange volatility models because they need information on how volatile the exchange rate will be in the future. In this paper, we compared an Exponential Generalized Autoregressive Conditional Heteroskedasticity model of order p = 1, q = 1 (EGARCH(1,1)) and a Recurrent Neural Network (RNN) based on a long short-term memory (LSTM) model with p = 10 and q = 1 layers to model the volatility of Nigerian exchange rates. Our goal is to determine the preferred model for predicting the volatility of Nigeria's Naira against the Euro, Pound Sterling, and US Dollar. The dataset of monthly exchange rates of the Nigerian Naira to the US Dollar, Euro, and Pound Sterling for the period December 2001 to August 2023 was extracted from the Central Bank of Nigeria Statistical Bulletin. Model efficiency and performance were measured with the Mean Squared Error (MSE) criterion. The results indicated that Nigeria's exchange rate volatility is asymmetric, with leverage effects evident in the results of the EGARCH(1,1) model. We also observed a steady increase in the Naira exchange rate against the Euro, Pound Sterling, and US Dollar from 2016 to its highest peak in 2023. The comparative analysis indicated that EGARCH(1,1) performed better than the LSTM model, providing smaller MSE values of 224.7, 231.3, and 138.5 for Euros, Pounds Sterling, and US Dollars, respectively.
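The EGARCH(1,1) recursion behind the asymmetry and leverage findings can be written out explicitly. The filter below uses assumed, not fitted, parameter values and Gaussian innovations (for which E|z| = sqrt(2/pi)); the gamma term is what lets negative shocks raise volatility more than positive ones:

```python
import math

def egarch_variance(returns, omega, alpha, gamma, beta):
    """Filter an EGARCH(1,1) conditional-variance path from a return series:
        ln s2_t = omega + beta * ln s2_{t-1} + alpha * (|z| - E|z|) + gamma * z,
    where z = r_{t-1} / s_{t-1}. Because the recursion is in log-variance,
    the variance stays positive without parameter constraints."""
    e_abs_z = math.sqrt(2 / math.pi)                 # E|z| for standard normal z
    log_s2 = math.log(sum(r * r for r in returns) / len(returns))  # init: sample variance
    path = [math.exp(log_s2)]
    for r in returns[:-1]:
        z = r / math.sqrt(math.exp(log_s2))          # standardized innovation
        log_s2 = omega + beta * log_s2 + alpha * (abs(z) - e_abs_z) + gamma * z
        path.append(math.exp(log_s2))
    return path

# Hypothetical monthly returns and parameters, purely for illustration
rets = [0.01, -0.03, 0.02, -0.05, 0.01]
var_path = egarch_variance(rets, omega=-0.2, alpha=0.1, gamma=-0.05, beta=0.95)
print(len(var_path))  # one conditional variance per observation
```

A negative gamma, as sketched here, encodes the leverage effect the abstract reports: negative returns push next-period log-variance up relative to positive returns of the same size.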
Article
Open Access November 15, 2023

Predictive Failure Analytics in Critical Automotive Applications: Enhancing Reliability and Safety through Advanced AI Techniques

Abstract
Failure prediction can be achieved through prognostics, which provides timely warnings before failure. Failure prediction is crucial in an effective prognostic system, allowing preventive maintenance actions to avoid downtime. The prognostics problem involves estimating the remaining useful life (RUL) of a system or component at any given time. The RUL is defined as the time from the current time to the time of failure. The goal is to make accurate predictions close to the failure time to provide early warnings. J. S. Grewal and J. Grewal provide a comprehensive definition of RUL in their paper "The Kalman Filter approach to RUL estimation." A process is a quadruple (X, U, f, P), where X is the state space, U is the control space, P is the set of possible paths, and f represents the transition between states. The process involves applying control values to change the system's state over time.
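As a minimal numeric illustration of the RUL definition (time from now until the projected failure time), and not the Kalman-filter approach the cited paper develops, one can fit a linear trend to a degradation signal and extrapolate it to a failure threshold:

```python
def estimate_rul(times, health, threshold):
    """Least-squares line through (time, health) readings, extrapolated to the
    failure threshold: RUL = t_fail - t_now. Assumes a roughly linear decline."""
    n = len(times)
    mt, mh = sum(times) / n, sum(health) / n
    slope = (sum((t - mt) * (h - mh) for t, h in zip(times, health))
             / sum((t - mt) ** 2 for t in times))
    if slope >= 0:
        return float("inf")  # no degradation trend observed yet
    intercept = mh - slope * mt
    t_fail = (threshold - intercept) / slope  # where the fitted line crosses the threshold
    return max(t_fail - times[-1], 0.0)

# Hypothetical health index dropping 0.05 per hour from 1.0; failure declared at 0.2
times = [0, 1, 2, 3, 4]
health = [1.0, 0.95, 0.90, 0.85, 0.80]
print(estimate_rul(times, health, threshold=0.2))  # -> 12.0 hours remaining
```

Real prognostic systems replace the straight line with a state-space model so the estimate (and its uncertainty) updates as each new sensor reading arrives, which is the role the Kalman filter plays in the cited formulation.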
Article
Open Access April 11, 2024

5V’s of Big Data Shifted to Suite the Context of Software Code: Big Code for Big Software Projects

Abstract
Data is the collection of facts and observations of events; it is continuously growing, getting denser and more varied by the minute across different disciplines and fields. Hence Big Data emerged and is evolving rapidly. The various types of data being processed are huge, yet little thought is given to where this data resides: much of it resides in software, and software codebases are themselves growing, in the size of their modules, functionalities, classes, and so on. Since data is growing so rapidly, codebases are growing as well. Therefore, this paper discusses the 5V's of big data in the context of software code and how to optimize and manage "big code." When we speak of "Big Code for Big Software," we refer to the specific challenges and considerations involved in developing, managing, and maintaining code in large-scale software systems.
Article
Open Access March 06, 2024

The Advantages of Cloud ERP in the Global Business Landscape

Abstract
Among the most significant systems that organizations of all stripes, whether public or private, use is the Enterprise Resource Planning (ERP) system. Due in large part to the rapid growth of Internet services and the growing reliance on the infrastructure of cloud service providers, ERP design has advanced, and numerous types of Internet-service-dependent ERP systems have emerged. In addition to the traditional ERP system, the most significant ERP types are Web-based ERP and Cloud ERP. As a result, ERP system vendors and designers, including Oracle and SAP, are relying on cloud-based ERP system design and offering the ERP system as a service under monthly or annual subscriptions, where the system is external to the organization and does not need to exist within it.
Review Article
Open Access March 06, 2024

Embedded Architecture of SAP S/4 HANA ERP Application

Abstract
The SAP HANA platform is designed to handle transactionally consistent operational workloads while also supporting intricate business analytics operations. Technically speaking, the SAP HANA database is made up of several data processing engines that work together within a distributed query processing environment to provide the entire range of data processing capabilities. This includes graph and text processing for managing semi-structured and unstructured data within the same system, as well as classical relational data supporting both row- and column-oriented physical representations in a hybrid engine. SAP S/4HANA is the next-generation SAP Business Suite, designed specifically for the SAP HANA Platform. Its key features are an intuitive, contemporary user interface (SAP Fiori); planning and simulation options in many conventional transactions; simplification of business processes; significantly improved transaction efficiency; and faster analytics.
Review Article
Open Access February 19, 2024

The use of contemporary Enterprise Resource Planning (ERP) technologies for digital transformation

Abstract
Our lives are becoming more and more digital, and this has an impact on how we work, study, communicate, and interact. Businesses are currently digitally altering their information systems, procedures, culture, and strategy. Existing businesses and economies are severely disrupted by the digital revolution. The Internet of Things, microservices, and mobile services are examples of IT systems with numerous, dispersed, and very small structures that are made possible by digitization. Utilizing the possibilities of cloud computing, mobile systems, big data and analytics, services computing, Internet of Things, collaborative networks, and decision support, numerous new business prospects have emerged throughout the years. The logical basis for robust and self-optimizing run-time environments for intelligent business services and adaptable distributed information systems with service-oriented enterprise architectures comes from biological metaphors of living, dynamic ecosystems. This has a significant effect on how digital services and products are designed from a value- and service-oriented perspective. The evolution of enterprise architectures and the shift from a closed-world modeling environment to a more flexible open-world composition establish the dynamic framework for highly distributed and adaptive systems, which are crucial for enabling the digital transformation. This study examines how enterprise architecture has changed over time, taking into account newly established, value-based relationships between digital business models, digital strategies, and enhanced enterprise architecture.
Review Article
Open Access February 18, 2024

An Appraisal of Challenges in Developing Information Literacy Skills in the Colleges of Education of Ghana

Abstract
The purpose of this study was to examine the challenges faced by students of Colleges of Education (CoEs) in developing their Information Literacy skills. The study adopted the post-positivism paradigm and a descriptive survey research design. The population for this study comprised all Level 200 students at Wiawso CoE, Enchi CoE, and Bia Lamplighter CoE in the Western North Region. Purposive, stratified, and convenience sampling techniques were used to select the colleges of education and the Level 200 students: the three (3) colleges of education were stratified and purposively selected, while 256 Level 200 students were stratified and conveniently sampled. The study employed questionnaires (open- and closed-ended questions) to collect data from the sampled students, focusing on the challenges they face in developing their Information Literacy (IL) skills. The quantitative data were captured, analysed, and presented using descriptive statistics such as percentages and frequency tables to address the objective of the study. It is recommended that, to improve digital literacy and academic pursuits, college management improve access to desktop computers and the Internet in the library and computer centre. It is also recommended that the management and librarians of the Colleges of Education ensure that students have access to these devices at the library and can use them to develop their IL skills and manage their references more effectively.
Article
Open Access February 17, 2024

Universal Evaluation of SAP S/4 Hana ERP Cloud System

Abstract
Regardless of their traditional ERP Systems, it is essential for every business to acquire a universal advantage in the contemporary international market. When everything is considered, end users in these kinds of businesses have to deal with poorly designed interfaces and unusable technologies. Despite the claims of significant benefits from using S4 Hana cloud ERP software, the possibility of achieving maximum productivity is not fully utilized. One of the causes of this reality is the underfunding of ergonomic measures and the newest technologies. Through the design of S4 Hana cloud ERP software applications, we will demonstrate how important and highly recommended ergonomic research is in order to minimize the financial and human costs that enterprises are currently facing.
Review Article
Open Access February 15, 2024

Stock Closing Price and Trend Prediction with LSTM-RNN

Abstract
The stock market is very volatile and hard to predict accurately due to the uncertainties affecting stock prices. Accurate predictive models, however, can help investors and stock traders make informed decisions about buying, holding, or investing in stocks, and financial institutions can use such models to manage risk and optimize their customers' investment portfolios. In this paper, we use a Long Short-Term Memory (LSTM) Recurrent Neural Network (RNN) to predict the daily closing price of the Amazon Inc. stock (ticker symbol: AMZN). We study the influence of various hyperparameters on the model to see which factors affect its predictive power. The root mean squared error (RMSE) on the training set was 2.51, with a mean absolute percentage error (MAPE) of 1.84%.
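The two error metrics reported above can be reproduced with a short sketch; the price arrays here are hypothetical stand-ins for actual and predicted closing prices, not the paper's data:

```python
import numpy as np

# Hypothetical actual and predicted closing prices (USD)
actual = np.array([134.0, 135.2, 133.8, 136.1, 137.0])
predicted = np.array([133.5, 135.9, 134.4, 135.2, 136.2])

# Root mean squared error: penalizes large deviations quadratically
rmse = np.sqrt(np.mean((actual - predicted) ** 2))

# Mean absolute percentage error: scale-free, expressed in percent
mape = np.mean(np.abs((actual - predicted) / actual)) * 100

print(round(rmse, 3), round(mape, 3))
```

RMSE keeps the units of the price series, while MAPE is a percentage, which is why the paper can report both 2.51 (dollars) and 1.84% side by side.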
Article
Open Access January 07, 2024

Critical Success Factors of Cloud ERP in the Enterprise Business

Abstract
Both critical success and critical failure factors are included in the current review work. The method relies on creating surveys to collect secondary data. It describes the terms used to obtain research papers on ERP deployment in Enterprise Business from databases and scholarly research, along with the inclusion and exclusion criteria applied to enhance the quality of the papers. A thorough audit of the available papers is then conducted to determine the impact of ERP use in Enterprise Business. Important elements are identified that determine whether ERP deployments are successful or unsuccessful, as well as how they actually affect Enterprise Business (insert actual success and failure variables here aside from impact). The scope of the study presented in this paper is limited by the time span during which research publications were evaluated. One implicit drawback is that it only considers the state of the art in the field of study, without an empirical investigation. Nevertheless, its findings may prove advantageous, and the directions for future research aid in expanding the field of study. This work advances the body of knowledge regarding the potential benefits and drawbacks of ERP adoption for small and medium-sized enterprises. It uses a secondary data collection strategy to identify important success factors, important failure factors, and their impact. The insights will assist Enterprise Business, its stakeholders, and ERP service providers in understanding the causes of success or failure and in taking the appropriate action.
Review Article
Open Access December 11, 2023

How Digital Technologies Improve Business Enterprise Applications

Abstract
The review article presents how emerging technologies improve business enterprise applications for process management. The paper considers certain technologies of enterprise applications and justifies the updated methodological and analytical tools for assessing, selecting, and regulating business processes in a single enterprise resource planning (ERP) system. Information technology must be used to identify products, track their movement into and out of the warehouse using code scanning technology, and streamline the product management procedure. To increase the dependability of management techniques, guarantee that the business operates flawlessly, and maintain a regular management mode, the process management form should be implemented in the enterprise management process. The implementation of digital information technology is essential for achieving effective corporate management. In addition to providing ideal operational circumstances for businesses, it is essential to analyse information technology and manage businesses economically. The foundation for implementing the enterprise applications method strategy is the creation of a process management system and an in-depth, methodical review of the enterprise as a collection of processes. Process-oriented enterprise applications should be the foundation of contemporary novel technologies for modelling business processes, which share a tight relationship with workflow management (WFM) systems, enterprise resource planning (ERP), and total quality management (TQM).
Review Article
Open Access December 06, 2023

Success Factors of Adopting Cloud Enterprise Resource Planning

Abstract
The technologies for cloud ERP (Enterprise Resource Planning) have revolutionized the field of information technologies. Any kind of business can benefit from their flexibility, affordability, scalability, adaptation, availability, and customizable data. An advancement of classic ERP, cloud enterprise resource planning (C-ERP) provides the benefits of cloud computing (CC), including resource elasticity and ease of use. The rise of cloud computing affects on-premise ERP systems in terms of architecture and cost. Cloud-based ERP systems make the claim to be appropriate for digital corporate settings. System quality, security, vendor lock-in, and data accessibility are recognized as the technological issues. Industry 4.0 refers to the re-engineering and revitalization of modern factories through the integration of cloud-based operations, industrial internet connectivity, additive manufacturing, and cybersecurity platforms. One of the four main pillars of Industry 4.0, cloud-based Enterprise Resource Planning (Cloud ERP), is a component of cloud operations that aids in achieving greater standards of sustainable performance.
Review Article
Open Access December 03, 2023

Evolution of Enterprise Applications through Emerging Technologies

Abstract
The extensive globalization of services and rapid technological advancements driven by IT have heightened the competitiveness of organizations in introducing innovative products and services. Among the noteworthy innovations is enterprise resource planning (ERP). An integral field in computer science, known as artificial intelligence (AI), is undergoing a transformative integration into various industries. Grasping the concept of artificial intelligence and its application in diverse business applications is crucial, given its broad and intricate nature. The primary focus of this paper is to delve into the realm of artificial intelligence and its utilization within enterprise resource planning. The study not only explores artificial intelligence but also delves into related concepts such as machine learning, deep learning, and neural networks in greater detail. Drawing upon existing literature, this research examines various books and online resources discussing the intersection of artificial intelligence and ERP. The findings reveal that the impact of AI is evident as businesses attain heightened levels of analytical efficiency across different ERP domains, thanks to remarkable advancements in AI, machine learning, and deep learning. Artificial intelligence is extensively employed in numerous ERP areas, with a particular emphasis on customer support, predictive analysis, operational planning, and sales projections.
Review Article
Open Access February 23, 2023

Substituting Intelligence

Abstract The development of ChatGPT is a topical subject of reflection. This short paper focuses on the (possible) use of ChatGPT in academia and some of its (possible) ramifications for users’ cognitive abilities and, dramatically put, their existence.
Communication
Open Access December 14, 2022

Applying Artificial Intelligence (AI) for Mitigation Climate Change Consequences of the Natural Disasters

Abstract
Climate change and weather-related disasters have accelerated rapidly in recent decades, bringing insecurity, destruction of ecological systems, increasing poverty, human casualties, and economic losses everywhere on the planet. The innovative methods applied to mitigate the magnitude of natural disasters and to combat their negative impact effectively consist of constant remote and Earth monitoring, data collection, creation of models for big data extrapolation, prediction, timely warning for prevention, and others. Artificial intelligence (AI) is used to deal with big data; to perform calculations, forecasts, and predictions of natural disasters in the near future; to establish possibilities for escaping hazards or risky situations; to prepare human beings for adverse changes; and to outline the different choices available so that the right decision can be made. Many projects, programs, and frameworks have been adopted and carried out by individual governments and business leaders toward common goals and actions for creating a friendly environment and measures for reducing undesired climate alterations and cataclysms. The aim of the article is to review the latest programs and innovations applied in the mitigation of climate change using AI.
Brief Review
Open Access July 10, 2022

Digital Therapeutics in Oncology: A Better Outlook for Cancer Patients in the Future

Abstract
Digital therapeutics (DTx) is an evidence-based treatment that makes use of high-quality software. As many healthcare systems confront increasing expectations for quality results, the need for digital medications is steadily growing in the clinical arena. To ensure that patients are supported during chemotherapy and that needless hospital visits are avoided, digital therapeutics must be integrated into the cancer care pathway. Oncology patients are usually immunocompromised due to their disease and treatment, rendering them more susceptible to infection than the general population. As a result, visiting a hospital might endanger their health. In addition, when cancer patients and survivors return home after treatment, digital health interventions provide them with the tools they need to manage their illness and its side effects in the privacy of their own homes. Considering the increasing prevalence of cancer patients and the solution that digital therapeutics has to offer in oncology, its future looks promising. This review article aims to summarize the existing companies in this domain, while evaluating the prospects as well.
Review Article
Open Access December 27, 2021

A Comparative Study for Recommended Triage Accuracy of AI Based Triage System MayaMD with Indian HCPs

Abstract
Artificial intelligence (AI) based triage and diagnostic systems are increasingly being used in healthcare. Although these online tools can improve patient care, their reliability and accuracy remain variable. We hypothesized that an artificial intelligence (AI) powered triage and diagnostic system (MayaMD) would compare favorably with human doctors with respect to triage and diagnostic accuracy. We performed a prospective validation study of the accuracy and safety of an AI powered triage and diagnostic system. Identical cases were evaluated by the AI system and by individual Indian healthcare practitioners (HCPs) to compare accuracy and safety. The same cases were validated against the consensus of an expert panel of 3 doctors. These cases, in the form of clinical vignettes, were provided by an expert medical team. Overall, the study showed that the MayaMD AI based platform for virtual triage was able to recommend the most appropriate triage while ensuring patient safety. In fact, with the panel consensus used as the standard, the accuracy of triage recommendation by MayaMD was significantly better than that provided by individual HCPs (91.67% vs. 74%, p=0.04).
Article
Open Access October 19, 2021

A Lightweight Wayfinding Assistance System for IoT Applications

Abstract
In this paper, we propose an indoor sign detection system for Industry 4.0. To implement the proposed system, we designed a lightweight deep learning architecture based on MobileNet that can run on embedded devices and is used to detect and recognize indoor landmark signs in order to assist blind and sighted people during indoor navigation. We apply various operations to minimize the network size as well as the computational complexity. The Internet of Things (IoT) connects the Internet with surrounding objects: it links physical objects with their digital identities and enables them to communicate with each other, creating a kind of bridge between the physical world and the virtual world. The paper provides a comprehensive overview of a new method for recognizing a set of indoor landmark sign objects based on a deep convolutional neural network (DCNN) for Internet of Things applications.
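MobileNet's suitability for embedded devices comes from replacing standard convolutions with depthwise-separable ones. A back-of-the-envelope parameter comparison illustrates the saving; the kernel size and channel counts below are illustrative assumptions, not the paper's configuration:

```python
# Parameter counts for a single convolutional layer, ignoring biases
k = 3          # kernel size (3x3)
c_in = 128     # input channels
c_out = 256    # output channels

# Standard convolution: one k x k x c_in filter per output channel
standard = k * k * c_in * c_out

# Depthwise separable: one k x k filter per input channel (spatial filtering),
# followed by a 1x1 pointwise convolution that mixes channels
separable = k * k * c_in + c_in * c_out

print(standard, separable, round(standard / separable, 1))
```

For these illustrative sizes the separable layer needs roughly an order of magnitude fewer parameters, which is the kind of reduction that makes on-device inference feasible.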
Article
Open Access October 17, 2021

Understanding Traffic Signs by an Intelligent Advanced Driving Assistance System for Smart Vehicles

Abstract
Recent technologies have made life smarter. Vehicles are vital components of daily life that are getting smarter for a safer environment. Advanced Driving Assistance Systems (ADAS) are widely used in today's vehicles and have been a revolutionary approach to making roads safer by assisting the driver in difficult situations such as imminent collisions or compliance with road rules. ADAS is composed of a huge number of sensors and processing units that provide the driver with a complete overview of the surrounding objects. In this paper, we introduce a road sign classifier for an ADAS to recognize and understand traffic signs. This classifier is based on a deep learning technique, in particular Convolutional Neural Networks (CNN). The proposed approach is composed of two stages. The first stage is a data preprocessing technique to filter and enhance the quality of the input images, reducing processing time and improving recognition accuracy. The second stage is a CNN model with a skip connection that passes semantic features to the top of the network to allow for better recognition of traffic signs. Experiments have proved the performance of the CNN model for traffic sign classification, with a correct recognition rate of 99.75% on the German Traffic Sign Recognition Benchmark (GTSRB) dataset.
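The skip connection described above, which forwards lower-level features past intermediate layers, can be sketched in miniature. The shapes and the linear-plus-ReLU stand-in for a convolutional block are illustrative assumptions, not the authors' architecture:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 16))  # a batch of 4 feature vectors

def block(t):
    """Stand-in for a conv block: random linear map followed by ReLU."""
    w = rng.standard_normal((16, 16)) * 0.1
    return np.maximum(t @ w, 0.0)

# Skip (residual) connection: the identity path is added back in,
# so early semantic features survive to the top of the network
y = block(x) + x
print(y.shape)
```

Because the block's ReLU output is non-negative, the identity path guarantees the input features are never erased, which is the intuition behind passing semantic features upward for better sign recognition.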
Article
Open Access September 04, 2021

Active Fault Tolerant Control of Faulty Uncertain Neutral Time-Delay Systems

Abstract
The present paper investigates the problem of Fault Tolerant Control for a class of uncertain neutral time-delay systems. First, we consider an additive control based on adding a term to the nominal law when a fault occurs. This approach is designed in three steps: the first step is fault detection and the second is fault estimation, and for these two steps we use an adaptive observer to guarantee the detection and estimation of the fault; the third step is fault compensation. Lyapunov methods and Linear Matrix Inequality (LMI) techniques are used to establish the main results. Second, we propose a Pseudo Inverse Method (PIM) and determine the error between the closed-loop and the nominal system. Finally, simulation results are presented to illustrate the theoretical development on an example of an uncertain neutral time-delay system.
Article
Open Access July 23, 2021

Behavioral Economics and Energy Consumption: Behavioral Data Analysis the Role of Attitudes and Beliefs on Household Electricity Consumption in Iran

Abstract
The average electricity consumption in Iranian households is higher than the world average. This can be due to price factors (such as cheap electricity in the country) and non-price factors (such as socio-demographic variables and psychological factors). In this study, non-price factors such as socio-demographic variables and psychological factors in the electricity consumption of urban households in Tehran were investigated, drawing on the theoretical foundations of behavioral economics and the theory of planned behavior. Information on household electricity consumption behavior was collected through a questionnaire and fieldwork from 2,560 Tehran households. Using econometric techniques, a linear regression was estimated with electricity consumption (over 45 days in winter 2019) as the dependent variable and, as independent variables, socio-demographic characteristics (age, sex, number of household members, income) and the theory of planned behavior variables (attitude, subjective norms, and perceived behavioral control). The results showed that income and the number of household members have a significant positive effect on electricity consumption, while gender has no significant effect. Of the psychological variables, only perceived behavioral control has a significant effect on electricity consumption. These results show that consumers do not have a positive attitude towards saving, and that mental and social norms neither encourage them to reduce electricity consumption nor are effective in controlling it. Finally, the study results were analyzed using behavioral biases that may cause attitudes and beliefs not to lead to action.
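A regression of the form used in the study can be sketched on synthetic data. The variable names, coefficients, and sample are hypothetical stand-ins, not the survey's values; the point is only the shape of the model (consumption on income, household size, and perceived behavioral control):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

# Synthetic stand-ins for the independent variables
income = rng.normal(50, 10, n)        # household income
members = rng.integers(1, 7, n)       # household size (1..6)
pbc = rng.normal(0, 1, n)             # perceived behavioral control

# Synthetic consumption: positive effects of income and household size,
# negative effect of perceived behavioral control, plus noise
kwh = 100 + 1.5 * income + 20 * members - 8 * pbc + rng.normal(0, 5, n)

# Ordinary least squares with an intercept column in the design matrix
X = np.column_stack([np.ones(n), income, members, pbc])
beta, *_ = np.linalg.lstsq(X, kwh, rcond=None)
print(np.round(beta, 1))  # approximately [100, 1.5, 20, -8]
```

Recovering coefficients close to the generating values is what "significant positive effect of income and household members" means operationally; in the actual study, significance would be judged from the coefficients' standard errors.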
Article
Open Access December 27, 2021

Leveraging AI and ML for Enhanced Efficiency and Innovation in Manufacturing: A Comparative Analysis

Abstract
The manufacturing industry has embraced modern technologies such as big data, machine learning, and artificial intelligence. This paper examines AI and machine learning developments in the manufacturing industry, comparing current practices and data-driven projects. It aims to better understand these technologies and their potential benefits and challenges. The research identifies opportunities for innovative business solutions and explores industry practices and research results. The paper focuses on implementation rather than technical aspects, aiming to enhance knowledge in this area.
Review Article
Open Access August 20, 2022

Advancing Predictive Failure Analytics in Automotive Safety: AI-Driven Approaches for School Buses and Commercial Trucks

Abstract
The recent evidence on AI in automotive safety shows the potential to reduce crashes and improve efficiency. Studies used AI techniques like machine learning and predictive analytics models to develop predictive collision avoidance systems. The studies collected data from various sources, such as traffic collision data and shapefiles. They utilized deep learning neural networks and 3D visualization techniques to analyze the data. However, there needs to be more research on AI in school bus and commercial truck safety. This paper explores the importance of AI-driven predictive failure analytics in enhancing automotive safety for these vehicles. It will discuss challenges, required data, technologies involved in predictive failure analytics, and the potential benefits and implications for the future. The conclusion will summarize the findings and emphasize the significance of AI in improving driver safety. Overall, this paper contributes to the field of automotive safety and aims to attract more research in this area.
Review Article
Open Access August 29, 2022

From Deterministic to Data-Driven: AI and Machine Learning for Next-Generation Production Line Optimization

Abstract
The advancement of modern manufacturing is synonymous with the growth of automation. Automation replaces human operators, improves productivity and quality, and reduces costs. However, the initial financial cost and knowledge requirements can be barriers to embracing automation. Manufacturers are now seeking smart manufacturing, known as the fourth industrial revolution. Smart manufacturing goes beyond automation and utilizes IoT, AI, and big data for optimized production. In a smart factory, production can be linked and controlled innovatively, leading to increased performance, agility, and reduced costs.
Review Article
Open Access December 27, 2020

Exploring AI Algorithms for Cancer Classification and Prediction Using Electronic Health Records

Abstract
Uncontrolled cell division leads to cancer, an incurable condition. An early diagnosis has the potential to lower death rates from breast cancer, the most frequent cancer in women worldwide, and imaging studies of the breast may help doctors find and diagnose the disease. This study explores the effectiveness of deep learning (DL) and machine learning (ML) models in the classification of mammography images for breast cancer detection, utilizing the publicly available CBIS-DDSM dataset, which comprises 5,000 images evenly divided between benign and malignant cases. To improve diagnostic accuracy, models such as Gaussian Naïve Bayes (GNB), CNNs, KNN, and MobileNetV2 were assessed using performance measures including F1-score, recall, accuracy, and precision. The methodology involved data preprocessing techniques, including transfer learning and feature extraction, followed by data splitting for robust model training and evaluation. Findings indicate that MobileNetV2 achieved the highest accuracy (99.4%), significantly outperforming GNB (87.2%), CNN (96.7%), and KNN (91.2%). The investigation, which also made use of confusion matrices and ROC curves to evaluate model performance, demonstrated the outstanding capacity of MobileNetV2 to discriminate between benign and malignant instances.
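The F1-score, recall, accuracy, and precision used above all derive from the confusion matrix. A minimal sketch of the computation follows; the counts are hypothetical, not the CBIS-DDSM results:

```python
# Hypothetical confusion-matrix counts, treating "malignant" as the positive class
tp, fp, fn, tn = 480, 10, 8, 502

accuracy  = (tp + tn) / (tp + tn + fp + fn)   # fraction of all cases correct
precision = tp / (tp + fp)                    # how many flagged cases are truly malignant
recall    = tp / (tp + fn)                    # sensitivity: malignant cases caught
f1        = 2 * precision * recall / (precision + recall)  # harmonic mean

print(round(accuracy, 3), round(precision, 3), round(recall, 3), round(f1, 3))
```

In a screening setting, recall (few missed malignancies) is usually weighted more heavily than precision, which is why the F1-score's balance of the two is a common summary metric.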
Review Article
Open Access December 27, 2020

An Effective Predicting E-Commerce Sales & Management System Based on Machine Learning Methods

Abstract
Due to the influence of the Internet, the e-commerce sector has developed rapidly. Most online retail businesses are seeking ways to predict demand for their products. Sales forecasting may help retailers develop a sales strategy that will enhance sales and attract more revenue and investment. The current research work puts forward a machine learning framework to forecast e-commerce sales for strategic management using a dataset of e-commerce transactions. With 70 percent of the data used for training and 30 percent for testing, three models were produced, namely Random Forest, Decision Tree, and XGBoost. To evaluate the models, performance measures including R-squared (R²) and root mean squared error (RMSE) were employed. The XGBoost model was the most accurate at predicting e-commerce sales, with an R² score of 96.3%. This demonstrates the superior capability of the XGBoost algorithm to forecast monthly e-commerce sales relative to the other models, and it can assist decision makers in managing inventory and making fast, informed decisions in this rapidly growing e-commerce market. The findings reiterate the importance of using advanced analytics to drive effectiveness and customer experience within the e-commerce sector.
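The abstract's evaluation rests on two measures, R² and RMSE. As a hedged sketch of how they are computed (the monthly-sales figures below are invented, not the paper's data):

```python
# Hedged sketch: R-squared and RMSE for regression predictions.
import math

def r_squared(actual, predicted):
    # 1 - (residual sum of squares / total sum of squares)
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

def rmse(actual, predicted):
    # Square root of the mean squared prediction error.
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))

# Illustrative monthly sales (actual vs. a model's test-set predictions).
actual = [120.0, 150.0, 90.0, 200.0, 170.0]
predicted = [118.0, 155.0, 95.0, 190.0, 172.0]
print(r_squared(actual, predicted))  # close to 1 for a good fit
print(rmse(actual, predicted))       # in the same units as sales
```

An R² near 1 (such as the reported 96.3%) means the model explains nearly all of the variance in the held-out sales data.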
Review Article
Open Access October 15, 2022

Big Data and AI/ML in Threat Detection: A New Era of Cybersecurity

The unrelenting proliferation of data, entwined with the prevalence of mobile devices, has given birth to an unprecedented growth of information obscured by noise. With the Internet of Things and myriad endpoint devices generating vast volumes of sensitive and critical data, organizations are tasked with extracting actionable intelligence from this deluge. Governments and enterprises alike, even under pressure from regulatory boards, have strived to harness the power of data and leverage it to enhance safety and security, maximize performance, and mitigate risks. However, the adversaries themselves have capitalized on the unequal battle of big data and artificial intelligence to inflict widespread chaos. Therefore, the demand for big data analytics and AI/ML for high-fidelity intelligence, surveillance, and reconnaissance is at its highest. Today, in the cybersecurity realm, the detection of adverse incidents poses substantial challenges due to the sheer variety, volume, and velocity of deep packet inspection data. State-of-the-art detection techniques have fallen short of detecting the latest attacks after a big data breach incident. On the other hand, computational intelligence techniques such as machine learning have reignited the search for solutions for diverse monitoring problems. Recent advancements in AI/ML frameworks have the potential to analyze IoT/edge-generated big data in near real-time and assist risk assessment and mitigation through automated threat detection and modeling in the big data and AI/ML domain. Industry best practices and case studies are examined that endeavor to showcase how big data coupled with AI/ML unlocks new dimensions and capabilities in improved vigilance and monitoring, prediction of adverse incidents, intelligent modeling, and future uncertainty quantification by data resampling correction. 
All of these avenues lead to enhanced robustness, security, safety, and performance of industrial processes, computing, and infrastructures. A view of the future is offered, discussing the potential threats posed by the misuse of new technologies across bandwidth, IoT/edge, blockchain, AI, quantum, and autonomous fields. Cybersecurity is again playing out at a pace set by adversaries with low entry barriers and debilitating tools. The need for innovative solutions for defense against the emerging threat landscape, harnessing the power of new technologies and collaboration, is emphasized.
Article
Open Access December 27, 2021

Leveraging AI in Urban Traffic Management: Addressing Congestion and Traffic Flow with Intelligent Systems

Traffic congestion across the globe is a multimodal problem, intertwining vehicular, pedestrian, and bicycle traffic. The relationship between the multimodal traffic flow is a key factor in understanding urban traffic dynamics. The impact of excessive congestion extends to the excessive cost spent on traffic maintenance, as well as the inherent transportation inefficiency and delayed travel times. From an urban transportation standpoint, an immediate consideration on one hand is monitoring traffic conditions and demand cycles, while on the other hand inducing flow modifications that benefit the traffic network and mitigate congestion. Embedded and centralized control systems that characterize modern traffic management systems extract traffic conditions specific to their regions but lack communication between networks. Moreover, innovative methods are required to provide more accurate up-to-date traffic forecasts that characterize real-world traffic dynamics and facilitate optimal traffic management decisions. In this chapter, we briefly outline the main difficulties and complexities in modeling, managing, and forecasting traffic dynamics. We also compare various conventional and modern Intelligent Transportation Strategies in terms of accuracy and applicability, their performance, and potential opportunities for optimization of multimodal traffic flow and congestion reduction. This chapter introduces various proposed data-driven models and tools employed for traffic flow prediction and management, investigating specific strategies' strengths, weaknesses, and benefits in addressing various real-world traffic management problems. We describe that the design phase of dependable Intelligent Transportation Systems bears unique requirements in terms of the robustness, safety, and response times of their components and the encompassing system model. 
Furthermore, this architectural blueprint shares similarities with distributed coordinated search and collective adaptive systems. Town-size-independent models induce systemic performance improvements through reconfigurable embedded functionality. These AI techniques feature anytime planners that ensure near-optimal performance and unbiased behavior as model complexity varies. Sustainable models minimize congestion during peaks, flooding, and emergencies while adhering to area-specific regulations. Security-aware and fail-safe traffic management systems provide reasonable assurances of persistent operation under various environmental settings, accommodating metropolitan areas and complex traffic junctions. The chapter concludes by outlining challenges, research questions, and future research paths in the field of transportation management.
Review Article
Open Access December 27, 2021

Sustainability in Construction: Exploring the Development of Eco-Friendly Equipment

The equipment used in the construction industry is usually associated with a high impact on the environment. Although sustainable design has been shown to be a main player among the initiatives focused on reducing environmental impact, it has been driven by workers and processes, leaving equipment efforts to more restrictive and later stages. The equipment industry has been a constant target of environmental standards and economic pressure, but increasing technological development allows it to respond to sustainability and safety expectations while enhancing its performance. However, several limitations still make this sector one of the last to reach advanced levels of development. A study identified gaps in equipment design that require greater effort to effectively support workers and companies in moving towards sustainable construction. This chapter is based on a study aiming to understand the consolidated knowledge of technologically sustainable equipment design and to identify the challenges that remain for its full development. The findings support the development of innovative eco-friendly equipment, taking into consideration sustainable materials and product guidelines, as well as green economy initiatives. They also support complex-system approaches and safety-by-design specificities to establish corporate knowledge of sustainable equipment and align it with the new regulations of the construction industry. The chapter introduces the context of construction equipment in terms of the new challenges it faces in providing construction work with a greater capacity for safety, from an environmental and energy efficiency perspective, and within the paradigm of sustainability. It then presents the concept of sustainable equipment and its principles, followed by a characterization of the agents involved in its life cycle.
Review Article
Open Access December 27, 2021

Financial Implications of Predictive Analytics in Vehicle Manufacturing: Insights for Budget Optimization and Resource Allocation

Factory owners and vehicle manufacturers increasingly opt for predictive analytics to inform their decisions. While predictive analytics has been proven to provide insights into the initiation of maintenance measures before a machine actually fails, the choice of models and features can have a significant impact on the budget spent and the resources allocated. This means that financially oriented questions need to at least partially guide decisions in the planning phase of data science projects. Data-driven approaches will play an increasingly important role, yet only a few firms have confidently applied logistic regression models for predictive maintenance. Moreover, to the best of available knowledge, data-driven classification models connecting vehicle component failures to the occurrence of delays at the assembly line have not been published. This paper applies a real-world, data-driven approach using classification models in predictive analytics for vehicle manufacturers and thereby links the financial implications of such data science projects to their results. We expand the existing literature on predictive maintenance and draw on a unique dataset from a newly launched series of vehicles, presented as-is. Our research context is of interest to researchers and practitioners in the automotive industry who manage and plan final vehicle assembly with just-in-time principles, factoring in the consequences of component failures on the assembly process. Key findings highlight that while minor tweaking of the models is possible, their potential input into decision-making processes for budget optimization is limited.
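The failure-to-delay classification described above can be sketched, under stated assumptions, as a logistic model scoring the probability of an assembly-line delay. The feature names, coefficients, and bias below are hypothetical illustrations, not values fitted to the paper's dataset.

```python
# Hedged sketch: logistic-regression-style scoring of delay risk.
# Weights and bias are invented for illustration only.
import math

def delay_probability(failures_last_shift, component_criticality,
                      w=(0.8, 1.2), bias=-3.0):
    # Linear combination of features, squashed through the sigmoid
    # to yield a probability in (0, 1).
    z = bias + w[0] * failures_last_shift + w[1] * component_criticality
    return 1.0 / (1.0 + math.exp(-z))

p = delay_probability(failures_last_shift=2, component_criticality=1.5)
print(round(p, 3))  # a moderate delay risk for this toy input
```

In practice, the weights would be fitted on labeled failure/delay records, and the resulting probabilities could feed the budget trade-off analysis the paper discusses.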
Review Article
Open Access October 29, 2022

Neural Networks for Enhancing Rail Safety and Security: Real-Time Monitoring and Incident Prediction

The growth in demand for rail transportation systems within cities, together with high-speed and long-distance transportation running on a rail network, raises the issues of both rail safety and security. If an accident or an attack occurs, its consequences can be extremely severe. To mitigate the impact of these events, real-time monitoring of a rail system is required. Improvements in monitoring can be achieved using artificial intelligence algorithms such as neural networks. Neural networks have been used to achieve real-time incident identification when monitoring track quality, classifying the graphical outputs of an ultrasonic system inspecting the rails and track bed; to predict incidents on the rail infrastructure due to blocked transmission channels; and to schedule preemptive and preventative maintenance. In terms of forecasting incidents and accidents on board trains, neural networks have been used to model passenger behavior and optimize responses during a train station evacuation. In tackling incidents and accidents on rail transport, we contribute two methodologies to detect anomalies in real time and identify the level of security risk: one at the maintenance level, with personnel operating along the railways, and one onboard passenger trains. These methodologies were evaluated on real-world datasets and shown to achieve high accuracy. The results generated from these case studies also reveal the potential for network-wide applications, which could enhance security and safety on railway networks by offering the possibility of better managing network disruptions and more rapidly identifying security issues. The speed and coverage of the information generated through these methodologies have implications for using prediction for decision support and enhancing safety and security on the rail network.
Review Article
Open Access November 16, 2023

Innovations in Agricultural Machinery: Assessing the Impact of Advanced Technologies on Farm Efficiency

Progress in the development and adoption of technological innovations is instrumental in enhancing the efficiency of production systems across the globe. Through the introduction of cost-efficient and high-performing technologies, countries can both reduce the resource use intensity of their economies and boost the global supply of essential products. The focus of this study is to analyze the application of advanced machinery and mechanisms within the agricultural sector, a primary industry that acts as a major contributor to the gross domestic product (GDP) of many nations. Specifically, this paper provides an in-depth review of the latest impact assessments based on analytical and modeling tools conducted on agricultural machinery and production technologies. Our findings highlight the positive role played by scientific progress and innovation in driving the competitiveness, growth and improved sustainability of the agricultural sector. Over the years, advanced technologies have accelerated the development and modernization of machinery, equipment, and processes in farming. Typically, modern machinery and equipment have enabled large-scale production on farms, enhancing the cost-efficient use of both land and labor, as well as the capacity and timeliness in performing essential agricultural operations. The rapid diffusion of technical advancements has further contributed to resource savings, productivity growth, and the overall transformation of agricultural value chains. Accordingly, the implementation of appropriate enabling conditions is of vital importance in encouraging the widespread integration of technologies in agriculture, not only boosting productivity along the agri-food chain but also yielding widespread social, economic, and environmental benefits.
Review Article
Open Access October 30, 2022

Towards Autonomous Analytics: The Evolution of Self-Service BI Platforms with Machine Learning Integration

Self-service business intelligence (BI) platforms have become essential applications for exploring, analyzing, and visualizing business data in various domains. Here, we envisage that the business intelligence platform will perform automatic and autonomous data analytics with minimal to no user interaction. We aim to offer a data-driven, intelligent, and scalable infrastructure that amplifies the advantages of BI systems and discovers hidden and complex insights from very large business datasets, which a business analyst can miss during manual exploratory data analysis. Towards our future vision of autonomous analytics, we propose a collective machine learning model repository with an integration layer for user-defined analytical goals within the BI platform. The proposed architecture can effectively reduce the cognitive load on users for repetitive tasks, democratizing data science expertise across data workers and facilitating a less experienced user community to develop and use advanced machine learning and statistical algorithms.
Review Article
Open Access November 05, 2022

Application of Neural Networks in Optimizing Health Outcomes in Medicare Advantage and Supplement Plans

The growing complexity and variability in healthcare delivery and costs within Medicare Advantage (MA) and Medicare Supplement (Medigap) plans present significant challenges for improving health outcomes and managing expenditures. Neural networks, a subset of artificial intelligence (AI), have shown considerable promise in optimizing healthcare processes, particularly in predictive modeling, personalized treatment recommendations, and risk stratification. This paper explores the application of neural networks in enhancing health outcomes within the context of Medicare Advantage and Supplement plans. We review how deep learning models can be leveraged to predict patient risk, optimize resource allocation, and identify at-risk populations for preventive interventions. Additionally, we discuss the potential for neural networks to improve claims processing, reduce fraud, and streamline administrative burdens. By integrating various data sources, including medical records, claims data, and demographic information, neural networks enable more accurate and efficient decision-making processes. Ultimately, this approach can lead to better patient care, reduced healthcare costs, and improved satisfaction for beneficiaries of these programs. The paper concludes by highlighting the current limitations, ethical considerations, and future directions for AI adoption in the Medicare Advantage and Supplement sectors.
Review Article
Open Access November 16, 2023

Zero Carbon Manufacturing in the Automotive Industry: Integrating Predictive Analytics to Achieve Sustainable Production

This charge-ahead paper suggests that transitioning the automotive industry towards a zero-carbon ecosystem, from materials to end-of-life, can be accomplished through disruptive zero-carbon manufacturing in the broad area of all-electric vehicle production technology. To accomplish zero-carbon-emission automotive manufacturing in the vehicle assembly domain, future paradigms must converge on decoupling carbon dioxide emissions from automobile manufacturing and use through design, processing, and manufacturing conditions. The envisioned zero-carbon-emission vehicle manufacturing domain consists of two complementary components: (a) making more efficient use of energy and (b) reducing carbon in the energy used. This paper presents the status of key scientific and technological advancements needed to bring today's manufacturing model to a zero-carbon ecosystem for the entire automotive industry of tomorrow. It suggests the groundbreaking application of dynamic and distributed predictive scheduling algorithms and open sensing and visualization technology to meet zero-carbon-emission vehicle manufacturing goals. Power-aware high-performance computing clusters have recently become a viable solution for sustainable production. Advances in scalable and self-adaptive monitoring, predictive analytics, timeline-based machine learning, and digital replicas of cyber-physical systems are also seen co-evolving in the zero-carbon manufacturing future. These methods are inspired by initiatives to decouple gross domestic product growth from energy-related carbon dioxide emissions. Stakeholders could co-design and implement shared roadmaps to transition the automotive manufacturing sector, with relevant societal and environmental benefits. The automated mobility sector offers an industry-leading example of transforming an automotive production facility to carbon-neutral status.
The conclusions from this paper challenge automotive manufacturers to engage in industry offsetting and carbon tax programs to drive continuous improvement and circular vehicle flows via a multi-directional zero-carbon smart grid.
Review Article
Open Access December 27, 2020

Enhancing Pharmaceutical Supply Chain Efficiency with Deep Learning-Driven Insights

The growing complexity of the operating environment urges pharmaceutical innovation. This essay addresses the need for the integration of advanced technologies in the pharmaceutical supply chain. It justifies the value proposition and presents a concrete use case for the integration of deep learning insights to make data-driven decisions. The supply chain has always been a priority for the pharmaceutical industry; research and development reflects companies' increasing investment in big data strategies, with big data tool adoption planned to grow at a compound annual growth rate. The work presented herein has a preliminary, explorative character, aiming to recover and integrate evidence from partly overlooked practical experience and know-how. The practical relevance of the essay is directed toward practitioners in pharmaceutical production, supply chain management, logistics, and regulatory agencies. The literature has shown a long-term concern for enhanced performance in the pharmaceutical supply chain network. This essay demonstrates the application of deep learning-driven insights to reveal non-evident flow dependencies. The main aim is to present a comprehensive view of deep learning-driven decision support. The supply chain is portrayed in a holistic manner, seeking end-to-end visibility. Implications for public policy are discussed, such as data equity: many countries are protecting their populations and economic growth by building resilience and efficiency to ensure the capacity to move goods across supply chains. The implementation strategy is also covered. The combined reduction of variability and system noise (dampened through the inclusiveness of all stakeholders), together with improved efficiency and reliability (of stochastic flows, understood through deep learning and data), results in increased responsiveness of supply chains for pharmaceutical products.
Future work involves the integration of external data, closing the loop between planning and its application in reality.
Review Article
Open Access December 27, 2021

Predictive Analytics and Deep Learning for Logistics Optimization in Supply Chain Management

Managing supply chains efficiently has become a major concern for organizations. One of the important factors to optimize in supply chain management is logistics. The advent of technology and the increase in data availability allow for enhancing the efficiency of logistics in a supply chain. This discussion focuses on blending analytics with innovation in logistics to improve the operations of a supply chain. An approach is presented for how predictive analytics can be used to improve logistics operations. To analyze big data in logistics effectively, an artificial intelligence computational technique, specifically deep learning, is employed. Two case studies are illustrated to demonstrate the practical employability of the proposed technique. This discussion reveals the power and potential of using predictive analytics in logistics to project various KPI values into the future based on contemporary data from logistics operations; sheds light on the innovative technique of employing deep learning-based predictive analytics in logistics; and suggests incorporating innovative techniques like deep learning with predictive analytics to develop an accurate forecasting technique for logistics, optimize operations, and prevent disruption in the supply chain. The network of supply chains has become more complex, necessitating the latest technological advancements. The sectors that have gained the most attention for applying technology to optimize their operations are manufacturing, healthcare, aerospace, and the automotive industry. Little attention has been paid to the logistics sector, although many have described how analytics and artificial intelligence can be used there to achieve higher optimization. Currently, significant research has been done on optimizing logistics operations.
Nevertheless, with the explosive volume of historical data being produced by an organization's logistics operations, there is a great opportunity to learn valuable insights from the data accumulated over time for longer-term strategic planning. To develop the logistics operations of an organization, the use of historical data is essential to understand trends in the operations. For example, regular maintenance planning and resource allocation based on trends are long-term activities that will not affect logistics operations immediately but can affect the business's strategic planning in the long run. A predictive analysis technique applied to historical logistics data can draw conclusions about future trends in logistics operations. Thus, the technique can be used to prevent disruption of the supply chain.
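A minimal sketch of the kind of trend-based KPI projection described above, assuming a simple ordinary-least-squares line fitted to an invented monthly KPI history (not data from the chapter):

```python
# Hedged sketch: project a logistics KPI forward from historical data
# using a least-squares linear trend. All numbers are illustrative.

def fit_trend(values):
    # Ordinary least squares on (period index, value).
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

def forecast(values, periods_ahead):
    # Extrapolate the fitted trend beyond the last observed period.
    slope, intercept = fit_trend(values)
    n = len(values)
    return [intercept + slope * (n + k) for k in range(periods_ahead)]

# Monthly on-time-delivery rate, in percent (invented history).
history = [90.0, 91.0, 93.0, 92.0, 94.0, 95.0]
print(forecast(history, 2))  # projections for the next two months
```

A deep learning model, as the discussion proposes, would replace this linear trend with a learned nonlinear function, but the planning use of the projected KPI values is the same.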
Review Article
Open Access December 27, 2022

Advancing Pain Medicine with AI and Neural Networks: Predictive Analytics and Personalized Treatment Plans for Chronic and Acute Pain Management

There is a growing body of evidence that the number of individuals suffering from chronic and acute pain is under-reported and that the burden on the veteran, aging, athletic, and working populations is rising. Current pain management is limited by our capacity to collaborate with individuals who continue normal daily functions and self-administer pain treatments outside of traditional healthcare appointments and hospital settings. In this review, the current gap in clinical care for real-time feedback and guidance in pain management decision-making for chronic and post-operative pain treatment is defined. We examine recent and future applications of predictive analytics for opioid use after surgery and the implementation of real-time neural networks for setting personalized pain management goals for individuals on the path to discharge to normal function. Integration of personalized neural networks with longitudinal data may enable the development of future treatment personalizations paired with electrical stimulation.
Review Article
Open Access December 27, 2023

Leveraging Artificial Intelligence to Enhance Supply Chain Resilience: A Study of Predictive Analytics and Risk Mitigation Strategies

The management of supply chains is increasingly complex. This study provides a comparative cost-benefit analysis of managing various risks. It identifies the financial implications of leveraging artificial intelligence in supply chains to better address risk. Empirical results show a business case for managing some sources of risk more proactively, facilitated through the predictive modeling techniques offered by AI. Across investigation streams, the use of AI results in an average total cost saving ranging from 41,254 to 4,099,617. Findings from our research can be used to inform managers and theorists about the implications of integrating AI technologies to manage risks in the supply chain. Our work also highlights areas for future research. Given the growing interest in studying sub-second forecasting, our research could be a point of departure for future investigations of the impact of forecasting horizons, such as an intra-day basis. We formulate a conceptual framework that considers how and to what extent performance evaluation metrics vary with differences in the fidelity of predictive models and the importance of factors for identifying risks. We also use a mixed-method approach to demonstrate the applicability of our ideas in practice. Our results illustrate the financial implications of integrating AI predictive tools with business processes. They suggest that real-world companies can circumvent inefficiencies associated with trying to manage many classes of risk via AI-enhanced predictive analytics. As managers need to justify investment to top management, our work supports decision-making by providing a means of conducting trade-off analysis at the tactical level.

Query parameters

Keyword:  Big data
