Open Access February 06, 2026

Predictive Modeling of Public Sentiment Using Social Media Data and Natural Language Processing Techniques

Social media platforms like X (formerly Twitter) generate vast volumes of user-generated content that provide real-time insights into public sentiment. Despite the widespread use of traditional machine learning methods, their limitations in capturing contextual nuances in noisy social media text remain a challenge. This study leverages the Sentiment140 dataset, comprising 1.6 million labeled tweets, and develops predictive models for binary sentiment classification using Naive Bayes, Logistic Regression, and the transformer-based BERT model. Experiments were conducted on a balanced subset of 12,000 tweets after comprehensive NLP preprocessing. Evaluation using accuracy, F1-score, and confusion matrices revealed that BERT significantly outperforms traditional models, achieving an accuracy of 89.5% and an F1-score of 0.89 by effectively modeling contextual and semantic nuances. In contrast, Naive Bayes and Logistic Regression demonstrated reasonable but consistently lower performance. To support practical deployment, we introduce SentiFeel, an interactive tool enabling real-time sentiment analysis. While resource constraints limited the dataset size and training epochs, future work will explore full corpus utilization and the inclusion of neutral sentiment classes. These findings underscore the potential of transformer models for enhanced public opinion monitoring, marketing analytics, and policy forecasting.
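As an illustration of the kind of traditional baseline the study compares against (a sketch, not the authors' code; the toy documents and labels below are invented), a multinomial Naive Bayes classifier with Laplace smoothing fits in a few lines of pure Python:

```python
import math
from collections import Counter

def train_nb(docs):
    """Count words per class and build the vocabulary from (tokens, label) pairs."""
    word_counts = {0: Counter(), 1: Counter()}
    class_counts = Counter()
    vocab = set()
    for tokens, label in docs:
        class_counts[label] += 1
        word_counts[label].update(tokens)
        vocab.update(tokens)
    return word_counts, class_counts, vocab

def predict_nb(model, tokens):
    """Pick the class maximizing log prior + smoothed log likelihoods."""
    word_counts, class_counts, vocab = model
    total_docs = sum(class_counts.values())
    best_label, best_score = None, float("-inf")
    for label in class_counts:
        score = math.log(class_counts[label] / total_docs)
        denom = sum(word_counts[label].values()) + len(vocab)  # Laplace smoothing
        for tok in tokens:
            score += math.log((word_counts[label][tok] + 1) / denom)
        if score > best_score:
            best_label, best_score = label, score
    return best_label

# Toy illustration (1 = positive, 0 = negative); real training would use the tweets.
train = [
    ("great movie loved it".split(), 1),
    ("awful boring waste".split(), 0),
    ("loved the acting great fun".split(), 1),
    ("boring and awful plot".split(), 0),
]
model = train_nb(train)
print(predict_nb(model, "loved it great".split()))  # → 1 (positive)
```

Unlike BERT, such a model treats tokens independently, which is exactly the contextual limitation the abstract highlights.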
Article
Open Access January 11, 2025

Exploring LiDAR Applications for Urban Feature Detection: Leveraging AI for Enhanced Feature Extraction from LiDAR Data

The integration of LiDAR and Artificial Intelligence (AI) has revolutionized feature detection in urban environments. LiDAR systems, which utilize pulsed laser emissions and reflection measurements, produce detailed 3D maps of urban landscapes. When combined with AI, this data enables accurate identification of urban features such as buildings, green spaces, and infrastructure. This synergy is crucial for enhancing urban development, environmental monitoring, and advancing smart city governance. LiDAR, known for its high-resolution 3D data capture capabilities, paired with AI, particularly deep learning algorithms, facilitates advanced analysis and interpretation of urban areas. This combination supports precise mapping, real-time monitoring, and predictive modeling of urban growth and infrastructure. For instance, AI can process LiDAR data to identify patterns and anomalies, aiding in traffic management, environmental oversight, and infrastructure maintenance. These advancements not only improve urban living conditions but also contribute to sustainable development by optimizing resource use and reducing environmental impacts. Furthermore, AI-enhanced LiDAR is pivotal in advancing autonomous navigation and sophisticated spatial analysis, marking a significant step forward in urban management and evaluation. The reviewed paper highlights the geometric properties of LiDAR data, derived from spatial point positioning, and underscores the effectiveness of machine learning algorithms in object extraction from point clouds. The study also covers concepts related to LiDAR imaging, feature selection methods, and the identification of outliers in LiDAR point clouds. Findings demonstrate that AI algorithms, especially deep learning models, excel in analyzing high-resolution 3D LiDAR data for accurate urban feature identification and classification. 
These models leverage extensive datasets to detect patterns and anomalies, improving the detection of buildings, roads, vegetation, and other elements. Automating feature extraction with AI minimizes the need for manual analysis, thereby enhancing urban planning and management efficiency. Additionally, AI methods continually improve with more data, leading to increasingly precise feature detection. The results indicate that the pulse emitted by continuous wave LiDAR sensors changes when encountering obstacles, causing discrepancies in measured physical parameters.
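The outlier-identification step mentioned for LiDAR point clouds can be illustrated with a simple global statistical filter. This is a sketch under assumed data, not the reviewed paper's method; real pipelines typically evaluate local neighborhoods rather than the whole cloud:

```python
import math

def flag_outliers(heights, k=2.0):
    """Flag point heights more than k standard deviations from the global mean."""
    n = len(heights)
    mean = sum(heights) / n
    std = math.sqrt(sum((h - mean) ** 2 for h in heights) / n)
    return [abs(h - mean) > k * std for h in heights]

# Ground returns near 0 m, a building roof near 12 m, one spurious return at 80 m.
heights = [0.1, 0.3, 0.2, 12.1, 11.9, 12.0, 80.0]
print(flag_outliers(heights))  # only the 80 m return is flagged
```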
Article
Open Access January 10, 2025

Artificial Immune Systems: A Bio-Inspired Paradigm for Computational Intelligence

Artificial Immune Systems (AIS) are bio-inspired computational frameworks that emulate the adaptive mechanisms of the human immune system, such as self/non-self discrimination, clonal selection, and immune memory. These systems have demonstrated significant potential in addressing complex challenges across optimization, anomaly detection, and adaptive system control. This paper provides a comprehensive exploration of AIS applications in domains such as cybersecurity, resource allocation, and autonomous systems, highlighting the growing importance of hybrid AIS models. Recent advancements, including integrations with machine learning, quantum computing, and bioinformatics, are discussed as solutions to scalability, high-dimensional data processing, and efficiency challenges. Core algorithms, such as the Negative Selection Algorithm (NSA) and Clonal Selection Algorithm (CSA), are examined, along with limitations in interpretability and compatibility with emerging AI paradigms. The paper concludes by proposing future research directions, emphasizing scalable hybrid frameworks, quantum-inspired approaches, and real-time adaptive systems, underscoring AIS's transformative potential across diverse computational fields.
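The Negative Selection Algorithm (NSA) named above can be sketched as follows: every candidate detector that matches any "self" sample is discarded, and any input matched by a surviving detector is flagged as non-self (anomalous). The bit-string representation and agreement-count matching rule here are illustrative assumptions, not the paper's formulation:

```python
from itertools import product

def matches(detector, sample, threshold):
    """A detector matches a sample if they agree in at least `threshold` bits."""
    return sum(d == s for d, s in zip(detector, sample)) >= threshold

def train_detectors(self_set, length, threshold):
    """Negative selection: keep every candidate that matches no self sample."""
    return [cand for cand in product([0, 1], repeat=length)
            if not any(matches(cand, s, threshold) for s in self_set)]

def is_anomalous(detectors, sample, threshold):
    return any(matches(d, sample, threshold) for d in detectors)

self_set = [(0, 0, 0, 0, 0, 0), (0, 0, 0, 0, 0, 1)]  # "normal" patterns
dets = train_detectors(self_set, length=6, threshold=5)
print(is_anomalous(dets, (1, 1, 1, 1, 1, 0), threshold=5))  # True: far from self
print(is_anomalous(dets, (0, 0, 0, 0, 0, 0), threshold=5))  # False: self is never flagged
```

By construction, detectors never match self samples, so false alarms on known-normal patterns are impossible; practical NSA variants sample the detector space randomly instead of enumerating it.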
Article
Open Access September 13, 2023

A Comparative Study of Attention-Based Transformer Networks and Traditional Machine Learning Methods for Toxic Comments Classification

With the rapid growth of online communication platforms, the identification and management of toxic comments have become crucial in maintaining a healthy online environment. Various machine learning approaches have been employed to tackle this problem, ranging from traditional models to more recent attention-based transformer networks. This paper aims to compare the performance of attention-based transformer networks with several traditional machine learning methods for toxic comments classification. We present an in-depth analysis and evaluation of these methods using a common benchmark dataset. The experimental results demonstrate the strengths and limitations of each approach, shedding light on the suitability and efficacy of attention-based transformers in this domain.
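The abstract does not specify the feature representation used by the traditional methods; such baselines commonly start from a bag-of-words weighting such as TF-IDF, which (under that assumption) can be sketched as:

```python
import math
from collections import Counter

def tfidf(docs):
    """Weight each term by raw count times log(N / document frequency)."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))
    return [{t: c * math.log(n / df[t]) for t, c in Counter(doc).items()}
            for doc in docs]

docs = [
    "you are a toxic troll".split(),
    "thanks for the helpful comment".split(),
    "what a helpful and kind reply".split(),
]
weights = tfidf(docs)
# "toxic" appears in a single document, so it carries the full log(3) weight there
print(weights[0]["toxic"])
```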
Article
Open Access November 30, 2022

A Review of Application of LiDAR and Geospatial Modeling for Detection of Buildings Using Artificial Intelligence Approaches

Three-dimensional models of real-world features are now very important and widely used, and they have attracted the attention of researchers in various fields, including surveying and spatial information systems, as well as those interested in the three-dimensional reconstruction of buildings. Buildings are the key informational component of a three-dimensional city model, so extracting and modeling buildings from remote sensing data is an important step in constructing a digital model of a city. LiDAR technology, with its ability to map in one-, two-, and three-dimensional modes, is a suitable solution for providing comprehensive, high-resolution images of buildings in an urban environment. This review article provides a comprehensive survey of the methods used to identify buildings from the past to the present and discusses appropriate directions for the future.
Review Article
Open Access November 29, 2022

The Application of Machine Learning in the Corona Era, With an Emphasis on Economic Concepts and Sustainable Development Goals

The aim of this article is to examine the impact of Coronavirus Disease 2019 (COVID-19) vaccines on economic conditions and the sustainable development goals; in other words, we study economic conditions during the COVID-19 pandemic. We examine the economic costs of the pandemic; the benefits of vaccination in terms of gross domestic product (GDP), public finances, and employment; investment in vaccines around the world; vaccination progress; the overall economic impact of vaccines; and the influence of emerging markets (EM) on achieving the sustainable development goals (SDGs), including no poverty, good health and well-being, zero hunger, and reduced inequality. The importance of emerging economies in reducing the harmful effects of the pandemic is also noted. As a case study, we forecast daily new death cases in Iran from February 2020 to August 2021 using an Artificial Neural Network (ANN) optimized with the Beetle Antennae Search (BAS) algorithm, alongside econometric models and regression analysis. The findings show that COVID-19 has had devastating economic and health effects on the world, and that vaccines can be very helpful in eliminating these effects, especially in the long term. We observed inequality in the distribution of COVID-19 vaccines between rich and poor countries, a gap that emerging markets can help narrow. The results show that the two approaches (artificial intelligence and econometric models) yield broadly similar results, but AI optimization makes the model and its predictions more robust. The main contribution of this article is that we survey the impacts of vaccination from a socio-economic viewpoint rather than merely reporting facts; we examine the impacts of vaccines on the sustainable development goals and the role of emerging markets in achieving them. In addition to the theoretical framework, we also present quantitative and empirical results that are rarely seen in other articles.
Article
Open Access April 10, 2025

Advancements in Pharmaceutical IT: Transforming the Industry with ERP Systems

The pharmaceutical industry is undergoing a profound transformation driven by advancements in Information Technology (IT), with Enterprise Resource Planning (ERP) systems playing a pivotal role in reshaping operations. These systems offer integrated solutions that streamline key business processes, such as production, inventory management, supply chain optimization, regulatory compliance, and data integration, contributing significantly to operational efficiency and organizational agility. This paper explores the evolution and impact of ERP systems within the pharmaceutical sector, highlighting their contributions to overcoming the industry’s inherent challenges, including complex regulatory requirements, the need for accurate and real-time data, and the demand for supply chain resilience. The integration of cloud-based ERP solutions, the incorporation of emerging technologies like Artificial Intelligence (AI), Machine Learning (ML), and the Internet of Things (IoT), and enhanced data analytics capabilities have revolutionized pharmaceutical IT. These advancements not only reduce operational costs, improve forecasting accuracy, and enhance collaboration but also ensure compliance with stringent global regulations, such as Good Manufacturing Practices (GMP) and FDA guidelines. Moreover, ERP systems have been instrumental in managing the pharmaceutical supply chain, ensuring product traceability, and improving inventory control and order fulfillment processes. This manuscript examines how ERP systems enable pharmaceutical companies to maintain high standards of product quality, improve decision-making, and ensure the safety and efficacy of drugs through robust tracking and auditing mechanisms. A case study of a pharmaceutical company that implemented an ERP system demonstrates the tangible benefits, including increased operational efficiency, improved compliance rates, and enhanced customer satisfaction. 
However, despite the clear advantages, challenges such as customization complexities, data integration issues, and resistance to change remain. As the pharmaceutical industry continues to evolve, ERP systems will remain a cornerstone of digital transformation, facilitating smarter decision-making, better resource management, and enhanced collaboration across global operations. This paper also identifies future trends, including the potential of AI and blockchain technologies in further strengthening ERP systems and transforming the pharmaceutical landscape.
Review Article
Open Access February 09, 2025

The Future of Longevity Medicine from the Lens of Digital Therapeutics

Digital therapeutics (DTx) are emerging as a pivotal tool in promoting longevity by addressing non-communicable diseases (NCDs) such as diabetes, cardiovascular diseases, and mental health disorders. These software-driven interventions offer personalized, evidence-based treatments that can be accessed via digital devices, making healthcare more accessible and scalable. One of the key advancements in DTx is the integration of artificial intelligence (AI) and machine learning (ML) to tailor interventions based on individual health data. This personalization enhances the effectiveness of treatments and supports preventive care by identifying risk factors early. The need for digital therapeutics is underscored by the rising prevalence of NCDs, which are responsible for a significant portion of global mortality and healthcare costs. Traditional healthcare systems often struggle to provide timely and personalized care, especially in low-resource settings. DTx can bridge this gap by offering cost-effective solutions that are easily scalable. Moreover, digital therapeutics can address health inequities by providing low-cost interventions to underserved populations, thereby reducing the burden of NCDs and improving overall health outcomes. As technology continues to evolve, the potential for DTx to enhance longevity and quality of life becomes increasingly promising. Recent advancements in longevity medicine and technology have focused on extending both lifespan and healthspan, ensuring that people not only live longer but also maintain good health throughout their extended years. This review article highlights these advancements that are contributing to this compelling subject of Longevity.
Review Article
Open Access January 22, 2025

Tech Transformations: Modern Solutions for Obstructive Sleep Apnea

Recent advancements in the screening, diagnosis, and management of obstructive sleep apnea (OSA) have significantly improved patient outcomes. For screening, the use of home sleep apnea testing (HSAT) has become more prevalent, offering a convenient and cost-effective alternative to traditional in-lab polysomnography. HSAT devices have shown good specificity and sensitivity, particularly in patients with a high pre-test probability of OSA. In terms of diagnosis, advancements in wearable technology and mobile health applications have enabled continuous monitoring of sleep patterns and respiratory parameters. These tools provide valuable data that can be used to identify OSA more accurately and promptly. Additionally, machine learning algorithms are being integrated into diagnostic processes to enhance the accuracy of OSA detection by analyzing large datasets and identifying patterns indicative of the condition. Management of OSA has also seen significant progress. Continuous positive airway pressure (CPAP) therapy remains the gold standard, but new developments include auto-adjusting CPAP devices that optimize pressure settings based on real-time feedback. Mandibular advancement devices and hypoglossal nerve stimulation are emerging as effective alternatives for patients who are CPAP-intolerant. Furthermore, lifestyle interventions such as weight management, positional therapy, and exercise have been shown to complement medical treatments, leading to better overall outcomes. This review article highlights these advancements that collectively contribute to improved patient adherence, reduced symptoms, and enhanced quality of life for individuals with OSA.
Review Article
Open Access January 20, 2025

Deep Learning-Based Sentiment Analysis: Enhancing IMDb Review Classification with LSTM Models

Sentiment analysis, a vital aspect of natural language processing, involves the application of machine learning models to discern the emotional tone conveyed in textual data. Typical use cases include businesses making informed decisions based on customer feedback, gauging the sentiments of their employees to inform hiring or retention decisions, and classifying a text by topic (for example, whether it is about physics or chemistry), as is useful in search engines. The model leverages a sequential architecture: an Embedding layer transforms words into dense 50-dimensional vectors, and two Long Short-Term Memory (LSTM) layers capture intricate sequential patterns, with 20% dropout layers for regularization. Rectified linear unit (ReLU) activations enhance non-linearity, while the softmax activation in the output layer aligns with the multi-class nature of sentiment analysis. Both training and test accuracy were well over 80%.
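The two activations the model relies on, ReLU and softmax, are straightforward to state explicitly (an illustrative sketch, not the paper's code):

```python
import math

def relu(xs):
    """Rectified linear unit: max(0, x), applied elementwise."""
    return [max(0.0, x) for x in xs]

def softmax(xs):
    """Numerically stable softmax: shift by the max, exponentiate, normalize."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, -1.0, 0.5]
probs = softmax(relu(logits))
print(probs)  # the class probabilities sum to 1
```

ReLU feeds only non-negative activations forward, while softmax turns the final-layer scores into a probability distribution over the sentiment classes.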
Article
Open Access January 09, 2025

Advances in the Synthesis and Optimization of Pharmaceutical APIs: Trends and Techniques

The synthesis and optimization of Active Pharmaceutical Ingredients (APIs) is fundamental to pharmaceutical drug development, directly influencing drug efficacy, safety, and cost-effectiveness. Over recent years, significant advancements in synthetic methodologies and manufacturing technologies have transformed API production. This manuscript provides an overview of the latest innovations in API synthesis, focusing on key techniques such as green chemistry, continuous flow chemistry, biocatalysis, and automation. Green chemistry principles, including solvent substitution and catalytic reactions, have enhanced sustainability by reducing waste and energy consumption. Continuous flow chemistry offers improved reaction control, scalability, and safety, while biocatalysis provides an eco-friendly alternative for synthesizing complex and chiral APIs. Additionally, the integration of automation and advanced process control using machine learning and real-time monitoring has optimized production efficiency and consistency. The manuscript also discusses the challenges associated with regulatory compliance and quality assurance, highlighting the role of advanced analytical techniques such as HPLC, NMR, and mass spectrometry in ensuring API purity. Looking ahead, personalized medicine and smart manufacturing technologies, including blockchain for traceability, are expected to drive further innovation in API production. This review concludes by emphasizing the need for continued advancements in sustainability, efficiency, and scalability to meet the evolving demands of the pharmaceutical industry, ultimately enabling the development of safer, more effective, and environmentally responsible medicines.
Review Article
Open Access November 16, 2024

Digital Therapeutics: A New Dimension to Diabetes Mellitus Management

Digital therapeutics (DTx) play a transformative role in diabetes management by leveraging technology to provide personalized, data-driven medical interventions. These tools enhance self-management by offering continuous monitoring and real-time feedback on glucose levels, diet, and physical activity. This personalized approach helps patients adhere to treatment plans and make informed lifestyle changes, leading to improved clinical outcomes such as reduced HbA1c levels and better overall diabetes control. The importance of DTx lies in their ability to make diabetes care more accessible and convenient. Mobile apps and telemedicine platforms enable patients to receive support and guidance from anywhere, reducing the need for frequent in-person visits. Additionally, DTx often include behavioral support features like reminders, educational content, and motivational tools, which are crucial for maintaining healthy habits and managing stress. Currently, the dynamics of DTx in diabetes are rapidly evolving, with increasing integration of artificial intelligence and machine learning to further personalize and optimize care. As the adoption of these technologies grows, they hold the potential to significantly improve patient outcomes and revolutionize diabetes management on a global scale. This article will focus on the benefits of novel digital therapeutics for prevention and management of type II diabetes that are currently available in the market.
Article
Open Access May 14, 2024

A Review of Reliability Techniques for the Evaluation of Programmable Logic Controllers

Programmable logic controllers (PLCs) are essential components of contemporary industrial automation systems, responsible for controlling and monitoring a wide variety of operations. PLC reliability is therefore critical to the continuous and secure operation of industrial systems. This article systematically examines the wide range of strategies that have been used to improve the reliability of PLCs. After a thorough review of the literature, the evaluation classified PLC reliability techniques into Root Cause Analysis (RCA), Reliability Centered Maintenance (RCM), Hazard Analysis (HA), Reliability Block Diagram (RBD), Fault Tree Analysis (FTA), Physics of Failure (PoF), and FMEA/FMECA. Among the reviewed papers, RCA has been used most often to increase the reliability of PLCs, appearing in 20% of the publications, followed by HA, RCM, RBD, FTA, PoF, and FMEA/FMECA, which account for 17%, 16%, 16%, 13%, 10%, and 8% of the articles reviewed, respectively. The paper also discusses new developments and trends in PLC reliability, such as the application of machine learning (ML) and artificial intelligence (AI) to fault detection and predictive maintenance.
Review Article
Open Access April 16, 2024

Revolutionizing Automotive Supply Chain: Enhancing Inventory Management with AI and Machine Learning

Consumer behavior is evolving, demanding a wide range of products with fast shipping and reliable service. The automotive aftermarket industry, worth billions, requires efficient distribution systems to stay competitive. Manufacturers strive to balance growth with product and service excellence. Distributors and retailers face the challenge of maintaining competitive pricing while keeping inventory levels low. An adequate supply chain and accurate product data are crucial for product availability and reducing stock issues. This ultimately increases profits and customer satisfaction.
Article
Open Access November 15, 2023

Predictive Failure Analytics in Critical Automotive Applications: Enhancing Reliability and Safety through Advanced AI Techniques

Failure prediction can be achieved through prognostics, which provides timely warnings before failure. Failure prediction is crucial in an effective prognostic system, allowing preventive maintenance actions to avoid downtime. The prognostics problem involves estimating the remaining useful life (RUL) of a system or component at any given time. The RUL is defined as the time from the current time to the time of failure. The goal is to make accurate predictions close to the failure time to provide early warnings. J S Grewal and J. Grewal provide a comprehensive definition of RUL in their paper "The Kalman Filter approach to RUL estimation." A process is a quadruple (X, U, f, P), where X is the state space, U is the control space, P is the set of possible paths, and f represents the transition between states. The process involves applying control values to change the system's state over time.
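The RUL definition above (the time from the current time to the time of failure) can be illustrated with a deliberately simple estimator that fits a line to a degrading health indicator and extrapolates to a failure threshold. The data and threshold are invented, and this linear fit is a stand-in for the Kalman-filter approach the abstract cites:

```python
def remaining_useful_life(times, health, failure_threshold):
    """Fit a least-squares line to a degrading health indicator and
    extrapolate to the failure threshold; RUL = failure time - current time."""
    n = len(times)
    mean_t = sum(times) / n
    mean_h = sum(health) / n
    slope = (sum((t - mean_t) * (h - mean_h) for t, h in zip(times, health))
             / sum((t - mean_t) ** 2 for t in times))
    intercept = mean_h - slope * mean_t
    failure_time = (failure_threshold - intercept) / slope
    return failure_time - times[-1]

# Health degrades linearly from 1.0; failure is declared at 0.2.
print(remaining_useful_life([0, 1, 2, 3], [1.0, 0.9, 0.8, 0.7], 0.2))  # ≈ 5.0
```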
Article
Open Access February 15, 2024

Stock Closing Price and Trend Prediction with LSTM-RNN

The stock market is very volatile and hard to predict accurately due to the many uncertainties affecting stock prices. Accurate predictive models would nevertheless let investors and stock traders make informed decisions about buying, holding, or investing in stocks, and financial institutions could use such models to manage risk and optimize their customers' investment portfolios. In this paper, we use a Long Short-Term Memory Recurrent Neural Network (LSTM-RNN) to predict the daily closing price of Amazon Inc. stock (ticker symbol: AMZN). We study the influence of various hyperparameters to see which factors affect the predictive power of the model. The root mean squared error (RMSE) on the training set was 2.51, with a mean absolute percentage error (MAPE) of 1.84%.
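The two error metrics reported, RMSE and MAPE, can be computed as follows (the toy price series is illustrative, not AMZN data):

```python
import math

def rmse(actual, predicted):
    """Root mean squared error."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted))
                     / len(actual))

def mape(actual, predicted):
    """Mean absolute percentage error, expressed in percent."""
    return (100.0 / len(actual)) * sum(abs(a - p) / abs(a)
                                       for a, p in zip(actual, predicted))

actual = [100.0, 102.0, 101.0, 105.0]
predicted = [101.0, 101.0, 103.0, 104.0]
print(round(rmse(actual, predicted), 3))  # 1.323
print(round(mape(actual, predicted), 3))  # 1.228
```

RMSE is in the same units as the price, while MAPE is scale-free, which is why papers often report both.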
Article
Open Access December 03, 2023

Evolution of Enterprise Applications through Emerging Technologies

The extensive globalization of services and rapid technological advancements driven by IT have heightened the competitiveness of organizations in introducing innovative products and services. Among the noteworthy innovations is enterprise resource planning (ERP). An integral field in computer science, known as artificial intelligence (AI), is undergoing a transformative integration into various industries. Grasping the concept of artificial intelligence and its application in diverse business applications is crucial, given its broad and intricate nature. The primary focus of this paper is to delve into the realm of artificial intelligence and its utilization within enterprise resource planning. The study not only explores artificial intelligence itself but also examines related concepts such as machine learning, deep learning, and neural networks in greater detail. Drawing upon existing literature, this research surveys books and online resources that discuss the intersection of artificial intelligence and ERP. The findings reveal that the impact of AI is evident as businesses attain heightened levels of analytical efficiency across different ERP domains, owing to remarkable advancements in AI, machine learning, and deep learning. Artificial intelligence is extensively employed across ERP, with particular emphasis on customer support, predictive analysis, operational planning, and sales projections.
Review Article
Open Access June 21, 2022

Create a Book Recommendation System using Collaborative Filtering

Abstract One of the most important applications of data science is the recommendation system. Every organization needs a good recommendation system to present a large range of items to its users. This research focuses on building a book recommender system using collaborative filtering.
One of the most important applications of data science is the recommendation system. Every organization needs a good recommendation system to present a large range of items to its users. This research focuses on building a book recommender system using collaborative filtering.
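The collaborative-filtering idea this abstract names can be sketched compactly: predict a user's rating for an unseen book as a similarity-weighted average of other users' ratings. The ratings, user names, and book titles below are invented, and this is a minimal sketch of user-based collaborative filtering in general, not the paper's implementation.

```python
import math

ratings = {  # user -> {book: rating}; invented data
    "alice": {"Dune": 5, "Emma": 3, "It": 1},
    "bob":   {"Dune": 4, "Emma": 3, "It": 2, "Hobbit": 5},
    "carol": {"Dune": 1, "Emma": 5, "Hobbit": 2},
}

def cosine(u, v):
    """Cosine similarity over the books both users have rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[b] * v[b] for b in common)
    nu = math.sqrt(sum(u[b] ** 2 for b in common))
    nv = math.sqrt(sum(v[b] ** 2 for b in common))
    return dot / (nu * nv)

def predict(user, book):
    """Similarity-weighted average of other users' ratings for book."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or book not in r:
            continue
        s = cosine(ratings[user], r)
        num += s * r[book]
        den += s
    return num / den if den else None

print(round(predict("alice", "Hobbit"), 2))
```

Because alice's tastes align with bob's (who loved "Hobbit") more than with carol's (who did not), the prediction lands closer to bob's rating; production systems refine this with rating normalization and neighborhood selection.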
Mini Review
Open Access May 06, 2022

Movie Recommendation System Modeling Using Machine Learning

Abstract Recommending products to customers based on their interests is an important task in business, and one that machine learning can accomplish. The movie recommendation system is designed to assist movie aficionados by proposing movies that match the user's interests efficiently and effectively, without wasting much time in pointless browsing. This work focuses on [...] Read more.
Recommending products to customers based on their interests is an important task in business, and one that machine learning can accomplish. The movie recommendation system is designed to assist movie aficionados by proposing movies that match the user's interests efficiently and effectively, without wasting much time in pointless browsing. This work focuses on developing a movie recommender system using a model that combines cosine similarity and sentiment analysis. Cosine similarity is a standard measure of how similar two items are to one another. Analyzing the emotions expressed in a movie review can determine how positive or negative the review is and, consequently, the overall rating for a film. Because the machine learns by training on and evaluating the data, determining whether a review is favorable or adverse can be automated. Compared with purely content-based approaches, such a combined system is expected to produce increasingly precise results over time.
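The two ingredients this abstract names, cosine similarity between movies and sentiment scoring of reviews, can each be shown in miniature. The genre vectors, word lists, and titles below are invented; a real system would use learned embeddings and a trained sentiment classifier rather than this toy lexicon.

```python
import math

def cosine_sim(a, b):
    """Cosine similarity between two numeric feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Toy sentiment lexicon (invented); counts positive vs. negative words.
POSITIVE = {"great", "excellent", "fun", "moving"}
NEGATIVE = {"boring", "bad", "weak", "dull"}

def review_polarity(text):
    words = text.lower().split()
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return score / max(1, len(words))  # normalize by review length

# Genre vectors: [action, drama, comedy] (hypothetical encoding)
inception = [1, 1, 0]
interstellar = [1, 1, 0]
print(round(cosine_sim(inception, interstellar), 6))       # 1.0
print(review_polarity("great fun but weak plot") > 0)      # True
```

A recommender along the abstract's lines would rank candidate movies by feature similarity and then weight or filter them by the aggregated polarity of their reviews.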