Open Access June 28, 2025

Development of a Hemodialysis Data Collection and Clinical Information System and Establishment of an Intradialytic Blood Pressure/Pulse Rate Predictive Model

Abstract
This research is a cross-disciplinary Artificial Intelligence of Things (AIoT) initiative in the medical informatics domain, carried out as a collaboration among a university team, a partnering corporation, and a hemodialysis clinic. The research has two objectives: (1) to develop an Internet of Things (IoT)-based information system customized for the hemodialysis machines at the clinic, comprising transmission bridges, a dedicated web/app for clinical personnel, and a backend server; the system has been deployed at the clinic and is now officially operational; (2) to use de-identified, anonymous data collected by the operational system to train, evaluate, and compare deep-learning-based intradialytic blood pressure (BP)/pulse rate (PR) predictive models, with subsequent suggestions provided. Both objectives were executed under the supervision of the Institutional Review Board (IRB) at Mackay Memorial Hospital in Taiwan. The system completed for objective one introduced three significant services to the clinic: automated hemodialysis data collection, digitized data storage, and an information-rich human-machine interface with graphical data displays. These services replace traditional paper-based clinical administrative operations, thereby enhancing healthcare efficiency. The graphical data presented through the web and app interfaces aids real-time, intuitive comprehension of patients' conditions during hemodialysis. Moreover, the data stored in the backend database is available for physicians to conduct relevant analyses, unearth insights into medical practices, and provide precise care for individual patients.
The training and evaluation of the predictive models for objective two, along with related comparisons, analyses, and recommendations, suggest that in situations with limited computational resources and data, an Artificial Neural Network (ANN) model with six hidden layers, the SELU activation function, and a focus on artery-related features can be employed for hourly intradialytic BP/PR prediction tasks. We believe this contributes to the collaborating clinic and the relevant research communities.
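The SELU activation mentioned above has a fixed closed form; a minimal Python sketch follows, using the standard SELU constants from the deep learning literature (assumed here, since the abstract does not state them):

```python
import math

# Standard SELU constants (assumed; not listed in the abstract).
ALPHA = 1.6732632423543772
SCALE = 1.0507009873554805

def selu(x: float) -> float:
    """Scaled Exponential Linear Unit: scale * x for x > 0,
    scale * alpha * (exp(x) - 1) otherwise."""
    return SCALE * x if x > 0 else SCALE * ALPHA * (math.exp(x) - 1.0)

# Identity-like for positive inputs, smoothly saturating for negative ones,
# which helps keep activations self-normalizing in deep ANNs.
print(round(selu(1.0), 4))   # 1.0507
print(round(selu(-1.0), 4))  # -1.1113
```

The saturation for negative inputs bounds activations below, which is why SELU-based networks can stay stable without explicit batch normalization.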
Article
Open Access February 15, 2024

Stock Closing Price and Trend Prediction with LSTM-RNN

Abstract
The stock market is volatile and hard to predict accurately because of the many uncertainties affecting stock prices. Accurate predictive models would nevertheless benefit investors and stock traders by informing decisions about buying, holding, or investing in stocks, and financial institutions can use such models to manage risk and optimize their customers' investment portfolios. In this paper, we use a Long Short-Term Memory (LSTM) Recurrent Neural Network (RNN) to predict the daily closing price of Amazon Inc. stock (ticker symbol: AMZN). We study the influence of various hyperparameters to see which factors affect the predictive power of the model. The root mean squared error (RMSE) on the training set was 2.51, with a mean absolute percentage error (MAPE) of 1.84%.
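The two reported error metrics are standard; a minimal sketch of how RMSE and MAPE are computed (the price values below are illustrative, not AMZN data from the study):

```python
import math

def rmse(actual, predicted):
    """Root mean squared error."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# Illustrative closing prices (not from the paper).
actual = [100.0, 102.0, 101.0, 105.0]
predicted = [101.0, 101.0, 102.0, 104.0]
print(round(rmse(actual, predicted), 2))  # 1.0
print(round(mape(actual, predicted), 2))  # 0.98
```

RMSE is in the same units as the price, while MAPE is scale-free, which is why the paper reports both.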
Article
Open Access December 03, 2023

Evolution of Enterprise Applications through Emerging Technologies

Abstract
The extensive globalization of services and rapid technological advancements driven by IT have heightened the competitiveness of organizations in introducing innovative products and services. Among the noteworthy innovations is enterprise resource planning (ERP). An integral field in computer science, known as artificial intelligence (AI), is undergoing a transformative integration into various industries. Grasping the concept of artificial intelligence and its application in diverse business applications is crucial, given its broad and intricate nature. The primary focus of this paper is to delve into the realm of artificial intelligence and its utilization within enterprise resource planning. The study not only explores artificial intelligence but also delves into related concepts such as machine learning, deep learning, and neural networks in greater detail. Drawing upon existing literature, this research examines various books and online resources discussing the intersection of artificial intelligence and ERP. The findings reveal that the impact of AI is evident as businesses attain heightened levels of analytical efficiency across different ERP domains, thanks to remarkable advancements in AI, machine learning, and deep learning. Artificial intelligence is extensively employed in numerous ERP areas, with a particular emphasis on customer support, predictive analysis, operational planning, and sales projections.
Review Article
Open Access December 15, 2022

Effective Parameters to Design an Automatic Parking System

Abstract
The automated parking system is an extensive branch of smart transport systems. The smartness of such systems is determined by different parameters, such as parking maneuver planning. Coding this control system involves both vehicle parking and understanding the environment. A high-quality classification mask was applied to each sample to analyze the automated vehicle parking parameters. A Mask region-based convolutional neural network (Mask R-CNN) was trained using a low-computational-workload variant, Faster R-CNN, which operates at five frames per second. In this paper, the rapidly-exploring random tree (RRT) method was used for routing within the parking space, and a nonlinear model predictive control (NMPC) controller was added to develop the system. We added line-detection commands to the Mask R-CNN algorithm. The results can be useful for designing a secure automatic parking system with a powerful perception system.
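As a rough illustration of the RRT routing idea, here is a minimal obstacle-free 2D sketch with a fixed step size (the workspace bounds, step, and tolerances are illustrative assumptions, not parameters from the paper):

```python
import math
import random

def rrt(start, goal, step=0.5, iters=500, goal_tol=0.5, seed=0):
    """Grow a rapidly-exploring random tree from start toward goal.
    Returns the node list and the parent index of each node."""
    rng = random.Random(seed)
    nodes, parents = [start], [None]
    for _ in range(iters):
        sample = (rng.uniform(0, 10), rng.uniform(0, 10))
        # Find the existing node nearest to the random sample.
        i = min(range(len(nodes)), key=lambda k: math.dist(nodes[k], sample))
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        if d == 0:
            continue
        # Step a fixed distance from the nearest node toward the sample.
        new = (nx + step * (sample[0] - nx) / d, ny + step * (sample[1] - ny) / d)
        nodes.append(new)
        parents.append(i)
        if math.dist(new, goal) < goal_tol:
            break
    return nodes, parents

nodes, parents = rrt((0.0, 0.0), (9.0, 9.0))
print(len(nodes))  # number of tree nodes grown
```

A real parking planner would add collision checks against the perceived parking-space geometry before accepting each new node, and the NMPC controller would then track the extracted path.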
Article
Open Access November 04, 2022

An Artificial Intelligence Approach to Manage Crop Water Requirements in South Africa

Abstract
Estimation of crop water requirements is of paramount importance for the management of agricultural water resources, a major mitigation strategy against the effects of climate change on food security. South Africa's water shortage poses a threat to agricultural efficiency. Since irrigation uses about 60% of the available fresh water, it is important to optimise the use of irrigation water so as to maximize crop yield at the farm level and avoid wastage. In this study, the combined application of an artificial neural network (ANN) and a crop-growth simulation model was investigated for estimating crop irrigation water requirements and scheduling the irrigation of potatoes at the Winterton irrigation scheme, South Africa. The crop water demand from planting to harvest date, when to irrigate, the optimum stage in the drying cycle at which to apply water, and the amount of irrigation water to be applied each time were estimated. Five feed-forward back-propagation artificial neural network predictive models with varied numbers of neurons and hidden layers were developed and evaluated. The optimal ANN model, which has 5 inputs, 5 neurons, 1 hidden layer, and 1 output, was used to predict monthly reference evapotranspiration (ETo) in the Winterton area. It produced a root-mean-square error (RMSE) of 0.67, a Pearson correlation coefficient (r) of 0.97, and a coefficient of determination (R2) of 0.94. Validation of the model between measured and predicted ETo shows an r value of 0.9048. The predicted ETo was one of the input variables to a crop growth simulation model, CROPWAT. The results indicated that the total crop water requirement was 1259.2 mm/decade and the net irrigation water requirement was 1276.9 mm/decade, spread over a 5-day irrigation interval during the entire 140-day cropping season for potatoes.
The combination of artificial neural networks and crop growth simulation models has proved to be a robust technique for estimating crop irrigation water requirements in the face of limited or no daily meteorological datasets.
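The goodness-of-fit statistics reported above follow textbook formulas; a minimal sketch of Pearson's r and R2 (the ETo values below are illustrative, not from the Winterton dataset):

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def r_squared(observed, predicted):
    """Coefficient of determination of predictions against observations."""
    mo = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mo) ** 2 for o in observed)
    return 1.0 - ss_res / ss_tot

# Illustrative monthly ETo values (mm/day), observed vs. ANN-predicted.
obs = [3.1, 4.2, 5.0, 4.4, 3.6]
pred = [3.0, 4.1, 5.2, 4.5, 3.5]
print(round(pearson_r(obs, pred), 3))
print(round(r_squared(obs, pred), 3))
```

Note that r measures linear association while R2 penalizes any systematic offset between prediction and observation, which is why both are commonly reported together.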
Article
Open Access October 17, 2021

Understanding Traffic Signs by an Intelligent Advanced Driving Assistance System for Smart Vehicles

Abstract
Recent technologies have made life smarter. Vehicles are vital components of daily life and are getting smarter for a safer environment. Advanced Driving Assistance Systems (ADAS) are widely used in today's vehicles. ADAS has been a revolutionary approach to making roads safer by assisting the driver in difficult situations, such as avoiding a collision or respecting road rules. ADAS comprises a large number of sensors and processing units that provide the driver with a complete overview of the surrounding objects. In this paper, we introduce a road-sign classifier for an ADAS to recognize and understand traffic signs. The classifier is based on a deep learning technique, in particular Convolutional Neural Networks (CNNs). The proposed approach is composed of two stages. The first stage is a data preprocessing technique that filters and enhances the quality of the input images to reduce processing time and improve recognition accuracy. The second stage is a CNN model with a skip connection that passes semantic features to the top of the network, allowing for better recognition of traffic signs. Experiments demonstrate the performance of the CNN model for traffic sign classification, with a correct recognition rate of 99.75% on the German Traffic Sign Recognition Benchmark (GTSRB) dataset.
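The skip connection described above simply adds a block's input to its output so earlier features reach deeper layers unchanged; a minimal sketch (the transform below is a hypothetical stand-in, not the paper's CNN):

```python
def block(x):
    # Stand-in for a learned convolutional block (hypothetical transform).
    return [0.5 * v for v in x]

def with_skip(x):
    """Add the block's input to its output element-wise, so early semantic
    features pass unchanged to the top of the network."""
    return [b + s for b, s in zip(block(x), x)]

features = [1.0, -2.0, 3.0]
print(with_skip(features))  # [1.5, -3.0, 4.5]
```

Because the identity path bypasses the transform, gradients and low-level features both propagate directly, which is the property the paper exploits for sign recognition.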
Article
Open Access August 20, 2022

Advancing Predictive Failure Analytics in Automotive Safety: AI-Driven Approaches for School Buses and Commercial Trucks

Abstract
The recent evidence on AI in automotive safety shows the potential to reduce crashes and improve efficiency. Studies used AI techniques like machine learning and predictive analytics models to develop predictive collision avoidance systems. The studies collected data from various sources, such as traffic collision data and shapefiles. They utilized deep learning neural networks and 3D visualization techniques to analyze the data. However, there needs to be more research on AI in school bus and commercial truck safety. This paper explores the importance of AI-driven predictive failure analytics in enhancing automotive safety for these vehicles. It will discuss challenges, required data, technologies involved in predictive failure analytics, and the potential benefits and implications for the future. The conclusion will summarize the findings and emphasize the significance of AI in improving driver safety. Overall, this paper contributes to the field of automotive safety and aims to attract more research in this area.
Review Article
Open Access October 29, 2022

Neural Networks for Enhancing Rail Safety and Security: Real-Time Monitoring and Incident Prediction

Abstract
The growth in demand for rail transportation systems within cities, together with high-speed and long-distance transportation running on a rail network, raises the issues of both rail safety and security. If an accident or an attack occurs, its consequences can be extremely severe. To mitigate the impact of these events, real-time monitoring of a rail system is required, and improvements in monitoring can be achieved using artificial intelligence algorithms such as neural networks. Neural networks have been used to achieve real-time incident identification when monitoring track quality, classifying the graphical outputs of an ultrasonic system working on the rails and track bed; to predict incidents on the rail infrastructure due to transmission channels becoming blocked; and to schedule preemptive and preventative maintenance. In forecasting incidents and accidents on board trains, neural networks have been used to model passenger behavior and optimize responses during a train station evacuation. In tackling incidents and accidents on rail transport, we contribute two methodologies to detect anomalies in real time and identify the level of security risk: one at the maintenance level, with personnel operating along the railways, and one on board passenger trains. These methodologies were evaluated on real-world datasets and shown to achieve high accuracy. The results from these case studies also reveal the potential for network-wide applications, which could enhance security and safety on railway networks by offering the possibility of better managing network disruptions and more rapidly identifying security issues. The speed and coverage of the information generated through these methodologies have implications for using prediction for decision support and for enhancing safety and security across the rail network.
Review Article
Open Access November 05, 2022

Application of Neural Networks in Optimizing Health Outcomes in Medicare Advantage and Supplement Plans

Abstract
The growing complexity and variability in healthcare delivery and costs within Medicare Advantage (MA) and Medicare Supplement (Medigap) plans present significant challenges for improving health outcomes and managing expenditures. Neural networks, a subset of artificial intelligence (AI), have shown considerable promise in optimizing healthcare processes, particularly in predictive modeling, personalized treatment recommendations, and risk stratification. This paper explores the application of neural networks in enhancing health outcomes within the context of Medicare Advantage and Supplement plans. We review how deep learning models can be leveraged to predict patient risk, optimize resource allocation, and identify at-risk populations for preventive interventions. Additionally, we discuss the potential for neural networks to improve claims processing, reduce fraud, and streamline administrative burdens. By integrating various data sources, including medical records, claims data, and demographic information, neural networks enable more accurate and efficient decision-making processes. Ultimately, this approach can lead to better patient care, reduced healthcare costs, and improved satisfaction for beneficiaries of these programs. The paper concludes by highlighting the current limitations, ethical considerations, and future directions for AI adoption in the Medicare Advantage and Supplement sectors.
Review Article
Open Access December 27, 2021

Predictive Analytics and Deep Learning for Logistics Optimization in Supply Chain Management

Abstract
Managing supply chains efficiently has become a major concern for organizations. One of the important factors to optimize in supply chain management is logistics. The advent of technology and the increase in data availability allow the efficiency of logistics in a supply chain to be enhanced. This discussion focuses on blending analytics with innovation in logistics to improve the operations of a supply chain. An approach is presented for using predictive analytics to improve logistics operations. To analyze big data in logistics effectively, an artificial intelligence computational technique, specifically deep learning, is employed. Two case studies demonstrate the practical employability of the proposed technique. This work reveals the power and potential of using predictive analytics in logistics to project various KPI values into the future based on contemporary data from logistics operations; sheds light on the innovative technique of deep learning-based predictive analytics in logistics; and suggests incorporating techniques like deep learning with predictive analytics to develop accurate forecasting in logistics, optimize operations, and prevent disruption in the supply chain. The network of supply chains has become more complex, necessitating the latest technological advancements. The sectors that have gained a fair amount of attention for the application of technology to optimize their operations are manufacturing, healthcare, aerospace, and the automotive industry. Little attention has been paid to the logistics sector, although many describe how analytics and artificial intelligence can be used there to achieve higher optimization. Significant research has already been done on optimizing logistics operations.
Nevertheless, with the explosive volume of historical data produced by the logistics operations of an organization, there is a great opportunity to learn valuable insights from the data accumulated over time for longer-term strategic planning. To develop the logistics operations in an organization, the use of historical data is essential for understanding trends in the operations. For example, regular maintenance planning and resource allocation based on trends are long-term activities that will not affect logistics operations immediately but can affect the business's strategic planning in the long run. A predictive analysis technique applied to historical logistics data can draw conclusions about future trends in logistics operations, and can thus be used to prevent disruption of the supply chain.
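The idea of projecting a KPI forward from historical data can be sketched with an ordinary least-squares linear trend (the KPI series below is illustrative, and this simple fit is a stand-in for the deep learning models the paper actually employs):

```python
def linear_trend_forecast(series, steps_ahead):
    """Fit y = a + b*t by least squares and extrapolate steps_ahead points."""
    n = len(series)
    t = list(range(n))
    mt = sum(t) / n
    my = sum(series) / n
    b = sum((ti - mt) * (yi - my) for ti, yi in zip(t, series)) \
        / sum((ti - mt) ** 2 for ti in t)
    a = my - b * mt
    return [a + b * (n + k) for k in range(steps_ahead)]

# Illustrative monthly on-time-delivery KPI values (percent).
kpi = [90.0, 91.0, 92.0, 93.0, 94.0]
print(linear_trend_forecast(kpi, 2))  # [95.0, 96.0]
```

A deep learning forecaster replaces the fixed linear form with a learned nonlinear mapping over many lagged inputs, but the workflow (fit on history, extrapolate ahead, act on the projection) is the same.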
Review Article
Open Access December 27, 2022

Advancing Pain Medicine with AI and Neural Networks: Predictive Analytics and Personalized Treatment Plans for Chronic and Acute Pain Management

Abstract
There is a growing body of evidence that the number of individuals suffering from chronic and acute pain is under-reported and that the burden among the veteran, aging, athletic, and working populations is rising. Current pain management is limited by our capacity to collaborate with individuals who continue normal daily functions and self-administer pain treatments outside of traditional healthcare appointments and hospital settings. In this review, we define the current gap in clinical care for real-time feedback and guidance in pain management decision-making for chronic and post-operative pain treatment. We examine recent and future applications of predictive analytics for opioid use after surgery and the implementation of real-time neural networks for personalized pain management goal setting for individuals on the path to discharge and normal function. Integrating personalized neural networks with longitudinal data may enable future treatment personalization paired with electrical stimulation.
Review Article
Open Access December 27, 2023

Leveraging Machine Learning Techniques for Predictive Analysis in Merger and Acquisition (M&A)

Abstract
M&A is a strategic concept of business growth through consolidation, gaining market access, strengthening strategic positions, and increasing operational efficiency. To understand the dynamics of M&A, this paper looks at aspects such as target firm identification, evaluation, bidding for the target firm, and post-acquisition integration. All forms of M&A, including horizontal, vertical, and conglomerate mergers and acquisitions, are discussed in terms of goals and values, including synergy, cost reduction, competitive advantage, and access to better technology. Challenges such as cultural assimilation, regulatory compliance, and inaccurate valuation are also addressed. The paper then goes deeper to provide insight into how predictive analytics applies to M&A, using machine learning (ML) to improve decision-making through forecasting. Across the healthcare, education, and construction industries, the presented predictive models, built with regression analysis, neural networks, and ensemble techniques, help to inform decisions. Through time series and real-time data, predictive analytics enables sound M&A strategies, effective risk management, and smooth integration.
Review Article
Open Access November 16, 2022

AI-Driven Automation in Monitoring Post-Operative Complications Across Health Systems

Abstract
Artificial intelligence systems have been previously used to predict post-operative complications in small studies and single institutions. Here we developed a robust artificial intelligence model that predicts the risk of having cardiac, pulmonary, thromboembolic, or septic complications after elective, non-cardiac, non-ambulatory surgery. We combined structured and unstructured electronic health record data from 3.5 million surgical encounters from 25 medical centers between 2009 and 2017. Our neural network model predicted postoperative comorbidities 15 to 80 times faster than classical models. As such, our model can be used to assess the risk of having a specific complication postoperatively in a fraction of a second. With our model, we believe clinicians will be able to identify high-risk surgical patients and use their good judgment to mitigate upcoming risks, ultimately improving patient outcomes [1].
Case Report
Open Access December 29, 2020

Deep Learning Architectures for Enhancing Cyber Security Protocols in Big Data Integrated ERP Systems

Abstract
Deep learning approaches are very useful to enhance cybersecurity protocols for industry-integrated big data enterprise resource planning systems. This research study develops deep learning architectures of variational autoencoder, sparse autoencoder, and deep belief network for detecting anomalies, fraud, and preventing cybersecurity attacks. These cybersecurity issues occur in finance, human resources, supply chain, and marketing in the big data integrated ERP systems or cloud-based ERP systems. The main objectives of this creative research work are to identify the vulnerabilities in various ERP systems, databases, and the interconnected domains; to introduce a conceptual cybersecurity network model that incorporates variational autoencoders, sparse autoencoders, and deep belief networks; to evaluate the performance of the proposed cybersecurity model by employing the appropriate parameters with real-time and synthetic databases and simulated scenarios; and to validate the model performance by comparing it with traditional algorithms. A big data platform with an integrated business management system is known as an integrated ERP system, which plays an instrumental role in conducting business for various organizations in society. In recent times, as uncertainty and disparity increase, the cyber ecosystem becomes more complex, volatile, dynamic, and unpredictable. In particular, the number of cyber-attacks is increasing at an alarming rate; the resultant security breaches have a disruptive and disturbing effect on businesses around the world, with a loss of billions of dollars. To combat these threats, it is essential to develop a conceptual cybersecurity network model to secure systems by functioning as a mutually supporting and strengthening network model rather than working in isolation. 
In this dynamic and fluid environment, introducing a deep learning approach helps to support and prevent fraud and other illicit activities related to human resources and the supply chain, among others. Some cybersecurity vulnerabilities include, for example, database vulnerabilities, service level vulnerabilities, and system vulnerabilities, among others. The proposed methodology focuses only on database vulnerabilities, with the main aim of detecting and mitigating new potential vulnerabilities in other dependent domains as a future initiative.
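Autoencoder-based anomaly detection of the kind described above typically flags records whose reconstruction error is unusually large; a minimal sketch of that thresholding step (the error values and the mean-plus-k-sigma rule are illustrative assumptions, not the paper's trained model):

```python
import statistics

def flag_anomalies(errors, k=2.0):
    """Flag records whose autoencoder reconstruction error exceeds
    mean + k * standard deviation of the error distribution."""
    mu = statistics.fmean(errors)
    sigma = statistics.pstdev(errors)
    threshold = mu + k * sigma
    return [i for i, e in enumerate(errors) if e > threshold]

# Illustrative per-record reconstruction errors from an ERP transaction log.
errors = [0.10, 0.12, 0.09, 0.11, 0.10, 0.95, 0.10, 0.11]
print(flag_anomalies(errors))  # [5]
```

In the full architecture, a variational or sparse autoencoder is trained on normal ERP traffic so that fraudulent or attack records reconstruct poorly; the flagged indices would then be escalated for review.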
Review Article
Open Access December 27, 2019

The Role of Neural Networks in Advancing Wearable Healthcare Technology Analytics

Abstract
Neural networks are bringing a transformation in wearable healthcare technology analytics. These networks are able to analyze vast amounts of data to help in making decisions concerning patient care. Advancements in deep learning have brought neural networks to the forefront, making data analytics a straightforward process. This study helps unveil the use of ICT and AI in medical healthcare technology, surveying some industry giants. Wearable healthcare technologies are becoming more popular every day. These technologies facilitate collecting, monitoring, and sharing every vital aspect of the human body necessary for diagnosing and treating an ailment. With the advent of global digitization, health data storage and systematic analysis are taking shape to ensure better diagnostic, preventive, and predictive healthcare. Healthcare analytics powered by neural networks can significantly improve health outcomes, maximizing individuals' potential and quality of life. The breadth and possibilities of connected devices are getting wider. From personal activity monitoring to quantifying every bit of health statistics, connected devices are making an impact in measurement, management, and manipulation. In healthcare, early diagnosis can be a lifesaver, and data analytics can help in a big way to make predictions that save lives. We are in another phase of the digitization era: "Neural Network and Wearable Healthcare Technology Analytics." A neural network can be conceived as an adaptive system made up of a large number of neurons connected in multiple layers, processing data in a way similar to the human brain. Using a collection of algorithms, a neural network is composed of 'input' and 'output' layers along with intermediate hidden layers.
Review Article
Open Access December 26, 2021

Deep Learning Applications for Computer Vision-Based Defect Detection in Car Body Paint Shops

Abstract Major automated plants produce large volumes of high-quality products at low cost by introducing various technologies, including robotics and artificial intelligence. Surface defects carry economic loss and sometimes loss of functionality, yet defective products are rarely found, so most production is managed based on [...] Read more.
Major automated plants produce large volumes of high-quality products at low cost by introducing various technologies, including robotics and artificial intelligence. Surface defects carry economic loss and sometimes loss of functionality, yet defective products are rarely found, so most production is managed based on prediction and is subject to invisible fluctuations. Detecting hidden defect images is costly and needs better support for process improvement and quality enhancement. Paint shop defects should be analyzed from color changes so that defects can be detected effectively despite the variability of product demand over time. Because painted surfaces are glossy, it is not easy to capture visible-light images without noise, and some illumination and shadow artifacts remain even in larger, high-resolution images. Computer vision models must also reflect both color and texture information across the various painted surfaces to classify defects precisely. Several automated detection systems have been applied to paint shop inspection using laser, infrared, X-ray, electrical, magnetic, and acoustic sensors. The chance of a paint shop defect can be low, but even rare defects impact functionality, and such cases become "lessons learned." Lately, artificial intelligence has been introduced to factory automation, and many defect detection efforts have found footing in machine learning and deep learning. Recent attempts at deep learning-based defect detection propose simple techniques using specific neural network architectures with big data. However, such big data are still in the early stages, and significant challenges remain in normalizing and annotating them.
To obtain cost-efficient and timely solutions tailored to automotive paint shops, it may be better to combine deep learning solutions with traditional computer vision and more elaborate machine learning methods.
Review Article
Open Access December 27, 2022

Advance of AI-Based Predictive Models for Diagnosis of Alzheimer's Disease (AD) in Healthcare

Abstract Alzheimer's disease (AD), one of the most prevalent and chronic types of dementia, disproportionately affects the elderly. AD, a fatal illness that can harm brain structures and cells long before symptoms appear, is currently incurable. Using brain MRI images from a publicly accessible Kaggle dataset, this study proposes a prediction model based [...] Read more.
Alzheimer's disease (AD), one of the most prevalent and chronic types of dementia, disproportionately affects the elderly. AD, a fatal illness that can harm brain structures and cells long before symptoms appear, is currently incurable. Using brain MRI images from a publicly accessible Kaggle dataset, this study proposes a prediction model based on Convolutional Neural Networks (CNNs) to help with the early detection of Alzheimer's disease. The 6,400 images in the collection are labeled with four levels of dementia: non-demented, very mildly demented, mildly demented, and moderately demented. A thorough data-preparation workflow included pixel normalization, class balancing using data augmentation techniques, and image scaling to 128×128 pixels. A 3D convolutional neural network (CNN) architecture was used to better capture spatial dependencies in volumetric MRI data. Key performance measures, including F1-score, recall, accuracy, precision, and log loss, were used to gauge the model's effectiveness. Evaluation shows an overall F1-score and accuracy of 99.0% and a precision of 99.38%. The findings demonstrate the model's potential for practical use in early AD diagnosis and establish its robustness through confusion matrix analysis and performance curves.
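The preprocessing steps named above (pixel normalization and scaling to 128×128) can be sketched as follows. The nearest-neighbour resize and the dummy slice dimensions are simplifying assumptions for illustration, not the study's actual pipeline:

```python
import numpy as np

def preprocess(img, size=128):
    """Normalize pixel values to [0, 1] and resize to size x size
    using a simple nearest-neighbour index lookup."""
    img = img.astype(np.float32)
    lo, hi = img.min(), img.max()
    img = (img - lo) / (hi - lo + 1e-8)          # pixel normalization
    rows = np.arange(size) * img.shape[0] // size
    cols = np.arange(size) * img.shape[1] // size
    return img[np.ix_(rows, cols)]               # nearest-neighbour resize

raw = np.random.randint(0, 256, size=(176, 208))  # a dummy MRI slice
x = preprocess(raw)
print(x.shape)                                    # (128, 128), values in [0, 1]
```

A production pipeline would typically use a proper interpolation routine (e.g., from an imaging library) rather than index sampling, but the normalization step is the same idea.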
Article
Open Access December 27, 2022

Big Data-Driven Time Series Forecasting for Financial Market Prediction: Deep Learning Models

Abstract Financial markets have become more and more complex, and so has the number of data sources, making stock price prediction a difficult but important task. Traditional models tend to miss the time dependencies in stock price movements. In this work, a hybrid ARIMA-LSTM model is proposed to improve the accuracy of stock price forecasts. Based on time series indicators such as the adjusted closing [...] Read more.
Financial markets have become more and more complex, and so has the number of data sources, making stock price prediction a difficult but important task. Traditional models tend to miss the time dependencies in stock price movements. In this work, a hybrid ARIMA-LSTM model is proposed to improve the accuracy of stock price forecasts. Based on time series indicators such as the adjusted closing prices of S&P 500 stocks over a decade (2010–2019), the ARIMA-LSTM model combines autoregressive time series forecasting with the strong sequence-learning capability of LSTM. Data preprocessing covering missing-value interpolation, outlier detection, and Min-Max data scaling guarantees data quality. The model is trained on a 90/10 training/testing split and evaluated with the main performance metrics: MAE, MSE, and RMSE. The results show that the proposed ARIMA-LSTM model achieves an MAE of 0.248, an MSE of 0.101, and an RMSE of 0.319, indicating high accuracy in stock price prediction. A comparative analysis against machine learning reference models, namely Artificial Neural Networks (ANN) and Back-Propagation Neural Networks (BPNN), further illustrates the suitability and superiority of the ARIMA-LSTM approach, which achieves the lowest MAE and strong predictive capability. This work demonstrates the efficiency of integrating classical time series models with deep learning methods for financial forecasting.
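The Min-Max scaling and the evaluation metrics named in the abstract can be illustrated with a small Python sketch. The price series and the stand-in "forecast" below are dummy values, not the study's data:

```python
import numpy as np

def min_max_scale(x):
    """Min-Max scaling to [0, 1], as used in the preprocessing step."""
    return (x - x.min()) / (x.max() - x.min())

def metrics(y_true, y_pred):
    """The three evaluation metrics named in the abstract."""
    err = y_true - y_pred
    mae = np.mean(np.abs(err))
    mse = np.mean(err ** 2)
    rmse = np.sqrt(mse)
    return mae, mse, rmse

prices = np.array([310.0, 312.5, 311.8, 315.2, 318.0])  # dummy closing prices
scaled = min_max_scale(prices)
preds = scaled + 0.05          # a stand-in "forecast", offset by a fixed error
mae, mse, rmse = metrics(scaled, preds)
print(round(mae, 3), round(rmse, 3))
```

Note that when metrics are computed on scaled values, as the 0.248/0.101/0.319 figures suggest, they must be interpreted on the [0, 1] scale rather than in price units.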
Article
Open Access December 20, 2024

AI for Time Series and Anomaly Detection

Abstract Time series data are increasingly prevalent across domains such as finance, healthcare, manufacturing, and IoT, making accurate forecasting and anomaly detection critical for decision-making and system reliability. Traditional statistical methods (e.g., ARIMA, Holt-Winters) often fail to capture complex temporal dependencies and high-dimensional interactions inherent in modern time series. Recent [...] Read more.
Time series data are increasingly prevalent across domains such as finance, healthcare, manufacturing, and IoT, making accurate forecasting and anomaly detection critical for decision-making and system reliability. Traditional statistical methods (e.g., ARIMA, Holt-Winters) often fail to capture complex temporal dependencies and high-dimensional interactions inherent in modern time series. Recent advances in artificial intelligence, particularly deep learning architectures such as recurrent neural networks (RNNs), convolutional neural networks (CNNs), temporal convolutional networks (TCNs), graph neural networks (GNNs), and Transformers, have demonstrated marked improvements in modeling both univariate and multivariate series, as well as in detecting anomalies that deviate from learned norms (Darban, Webb, Pan, Aggarwal, & Salehi, 2022; Chiranjeevi, Ramya, Balaji, Shashank, & Reddy, 2024) [1,2]. Moreover, ensemble techniques and hybrid signal-processing + deep-learning pipelines show enhanced sensitivity and adaptability in real-world anomaly detection scenarios (Iqbal, Amin, Alsubaei, & Alzahrani, 2024) [3]. In this work, we provide a unified survey and comparative analysis of AI-driven time series forecasting and anomaly detection methods, highlight key industrial application domains, evaluate performance trade-offs (e.g., accuracy vs. latency, supervised vs. unsupervised learning), and discuss emerging challenges including interpretability, data drift, real-time deployment on edge devices, and integration of causal reasoning. Our findings suggest that while AI approaches significantly outperform classical techniques in many settings, careful consideration of data characteristics, evaluation metrics, and deployment environment remains essential for effective adoption.
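As a point of contrast with the AI methods surveyed above, a classical rolling z-score detector, the kind of statistical baseline such comparisons start from, can be sketched in a few lines. The window size, threshold, and injected anomaly are illustrative assumptions:

```python
import numpy as np

def rolling_zscore_anomalies(series, window=20, threshold=3.0):
    """Flag points whose deviation from the trailing-window mean exceeds
    `threshold` standard deviations of that window."""
    flags = np.zeros(len(series), dtype=bool)
    for i in range(window, len(series)):
        w = series[i - window:i]
        mu, sigma = w.mean(), w.std()
        if sigma > 0 and abs(series[i] - mu) > threshold * sigma:
            flags[i] = True
    return flags

rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, size=200)
data[150] += 10.0                      # inject an obvious point anomaly
flags = rolling_zscore_anomalies(data)
print(bool(flags[150]))                # the injected spike is flagged
```

A detector like this captures point anomalies in roughly stationary data but misses the contextual and collective anomalies that the deep architectures above are designed to model.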
Query parameters

Keyword:  Neural Networks
