Open Access March 18, 2023

The Efficiency of the Proposed Smoothing Method over the Classical Cubic Smoothing Spline Regression Model with Autocorrelated Residual

Abstract
Spline smoothing is a technique used to filter out noise in time series observations when fitting nonparametric regression models. Its performance depends on the choice of the smoothing parameter. Most of the existing smoothing methods applied to time series data tend to overfit in the presence of autocorrelated errors. This study aims to determine the optimum performance value, goodness of fit, and model overfitting properties of the proposed Smoothing Method (PSM) and of the Generalized Maximum Likelihood (GML), Generalized Cross-Validation (GCV), and Unbiased Risk (UBR) smoothing parameter selection methods. A Monte Carlo experiment of 1,000 trials was carried out at three sample sizes (20, 60, and 100) and three levels of autocorrelation (0.2, 0.5, and 0.8). The four smoothing methods' performances were estimated and compared using the Predictive Mean Squared Error (PMSE) criterion. The findings of the study revealed that, for time series observations with autocorrelated errors: the PSM provides the best-fit smoothing method for the model; the PSM does not overfit the data at any of the autocorrelation levels considered; the optimum value of the PSM was attained at the weighted value of 0.04 when there is autocorrelation in the error term; and the PSM performed better than the GCV, GML, and UBR smoothing methods at all time series sizes considered (T = 20, 60, and 100). For the real-life data employed in the study, the PSM proved to be the most efficient of the GCV, GML, PSM, and UBR smoothing methods compared. The study concluded that the PSM provides the best fit as a smoothing method, works well at all autocorrelation levels considered (ρ = 0.2, 0.5, and 0.8), and does not overfit time series observations. The study recommended the proposed smoothing method for time series observations with autocorrelation in the error term and for real-life econometric data. This study can be applied to nonparametric regression, nonparametric forecasting, and spatial, survival, and econometric observations.
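The abstract's baseline methods (GCV, UBR) pick the smoothing parameter by minimizing a criterion built from the smoother's hat matrix. A minimal sketch of that idea, using a discrete analogue of the cubic smoothing spline (a Whittaker-type second-difference smoother, not the authors' PSM) with GCV selection on AR(1)-contaminated data — all function names and simulation settings here are illustrative assumptions:

```python
import numpy as np

def whittaker_hat(n, lam):
    # Hat matrix H = (I + lam * D'D)^{-1}, where D is the
    # (n-2) x n second-difference matrix; a discrete stand-in
    # for the cubic smoothing spline's linear smoother.
    D = np.diff(np.eye(n), n=2, axis=0)
    return np.linalg.inv(np.eye(n) + lam * D.T @ D)

def gcv(y, lam):
    # Generalized Cross-Validation: n * RSS / (n - tr(H))^2
    n = len(y)
    H = whittaker_hat(n, lam)
    resid = y - H @ y
    return n * (resid @ resid) / (n - np.trace(H)) ** 2

# Simulate y = f(x) + AR(1) noise (rho = 0.5), as in the study's design
rng = np.random.default_rng(0)
n, rho = 100, 0.5
x = np.linspace(0, 1, n)
f = np.sin(2 * np.pi * x)
e = np.zeros(n)
for t in range(1, n):
    e[t] = rho * e[t - 1] + rng.normal(scale=0.3)
y = f + e

# Grid-search the smoothing parameter by GCV, then score with PMSE
lams = np.logspace(-2, 4, 30)
best = lams[np.argmin([gcv(y, l) for l in lams])]
fit = whittaker_hat(n, best) @ y
pmse = float(np.mean((fit - f) ** 2))
```

Under autocorrelated errors the GCV criterion tends to choose too small a penalty (undersmoothing), which is exactly the failure mode the study's PSM is designed to avoid.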
Article
Open Access December 27, 2023

Leveraging Artificial Intelligence to Enhance Supply Chain Resilience: A Study of Predictive Analytics and Risk Mitigation Strategies

Abstract
The management of supply chains is increasingly complex. This study provides a comparative cost-benefit analysis of managing various risks. It identifies the financial implications of leveraging artificial intelligence in supply chains to better address risk. Empirical results make a business case for managing some sources of risk more proactively, facilitated through the predictive modeling techniques offered by AI. Across the investigation streams, the use of AI results in an average total cost saving ranging from 41,254 to 4,099,617. Findings from our research can inform managers and theorists about the implications of integrating AI technologies to manage risks in the supply chain. Our work also highlights areas for future research. Given the growing interest in sub-second forecasting, our research could be a point of departure for future investigations into the impact of shorter forecasting horizons, such as intra-day forecasting. We formulate a conceptual framework that considers how, and to what extent, performance evaluation metrics vary with differences in the fidelity of predictive models and in factor importance for identifying risks. We also use a mixed-method approach to demonstrate the applicability of our ideas in practice. Our results illustrate the financial implications of integrating AI predictive tools with business processes. They suggest that real-world companies can circumvent the inefficiencies associated with trying to manage many classes of risk by using AI-enhanced predictive analytics. As managers need to justify investment to top management, our work supports decision-making by providing a means of conducting a trade-off analysis at the tactical level.
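The trade-off analysis the abstract describes amounts to comparing the expected cost of reacting to a disruption against the cost of mitigating it proactively, guided by model-predicted risk probabilities. A minimal sketch with entirely hypothetical figures (the costs, probabilities, and the 80% impact-reduction assumption are illustrative, not from the study):

```python
import numpy as np

# Hypothetical per-supplier figures: cost if a disruption hits,
# cost of proactive mitigation, and a predictive model's
# estimated disruption probabilities.
disruption_cost = np.array([90_000, 250_000, 40_000, 500_000, 120_000], dtype=float)
mitigation_cost = np.array([15_000, 30_000, 12_000, 45_000, 20_000], dtype=float)
p_disruption = np.array([0.30, 0.15, 0.05, 0.40, 0.10])

# Expected cost of doing nothing vs. mitigating
# (assume mitigation cuts disruption impact by 80%).
expected_reactive = p_disruption * disruption_cost
expected_proactive = mitigation_cost + 0.2 * p_disruption * disruption_cost

# Act only where proactive management is strictly cheaper.
mitigate = expected_proactive < expected_reactive
saving = float(np.where(mitigate, expected_reactive - expected_proactive, 0.0).sum())
```

The point of the sketch is the decision rule, not the numbers: predictive analytics makes `p_disruption` estimable per risk source, which turns a blanket mitigation policy into a selective, cost-justified one.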
Review Article
Open Access December 27, 2022

Integrating generative AI into financial reporting systems for automated insights and decision support

Abstract
Generative AI refers to deep learning technology that can automatically produce original text, images, audio, video, and other outputs. With its emerging capabilities, Generative AI can radically change the dynamics of key operational processes in most industries. In this article, we illustrate how Generative AI technologies can be integrated into the Financial Reporting System (FRS) of a corporation. The integration will allow the FRS to deliver, on demand, concise and lucid insights to its users on what is happening in the corporation and on different aspects of the corporation's performance, such as its liquidity, solvency, profitability, organizational structure, and share buyback decisions. The integration will also facilitate the delivery of what-if analyses associated with different strategic and tactical decisions taken by the corporation's management, such as capital budgeting and profit distribution decisions. The unique added value of these insightful analytics lies in automating the responses to the ongoing requests of the FRS users. Generative AI capabilities are rapidly expanding. The integration can be applied not only to a corporate FRS but to any FRS at the national or global level delivered by a central bank or an accounting standards setter. Any of these FRSs can be made into a unique "hub" for the integrated Generative AI technologies. An equally innovative, generalized integration could put any corporate process at the center, with its supporting FRS tasks and deliverables at the periphery.
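Before a generative model can narrate liquidity, solvency, or profitability, the FRS must first derive the underlying ratios and package them as context for the model. A minimal sketch of that preprocessing step, with hypothetical statement figures and field names (the prompt text and the choice of ratios are illustrative assumptions, not the article's design):

```python
# Hypothetical financial-statement extract (all figures illustrative).
statement = {
    "current_assets": 480_000, "current_liabilities": 300_000,
    "total_liabilities": 700_000, "total_equity": 900_000,
    "net_income": 140_000, "revenue": 1_600_000,
}

# One standard ratio per dimension the abstract mentions:
# liquidity, solvency, profitability.
ratios = {
    "current_ratio": statement["current_assets"] / statement["current_liabilities"],
    "debt_to_equity": statement["total_liabilities"] / statement["total_equity"],
    "net_margin": statement["net_income"] / statement["revenue"],
}

# Context string an FRS could hand to a generative model for a
# plain-language, on-demand summary.
prompt = ("Summarize the firm's liquidity, solvency, and profitability "
          "for a non-specialist, given: "
          + ", ".join(f"{k}={v:.2f}" for k, v in ratios.items()))
```

A what-if analysis then becomes a matter of perturbing `statement` (say, a proposed share buyback reducing equity and current assets), recomputing `ratios`, and asking the model to contrast the two scenarios.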
Review Article
Open Access December 27, 2021

An Analysis of Crime Prediction and Classification Using Data Mining Techniques

Abstract
Crime is a serious and widespread problem in society, and preventing it is an essential task. A significant number of crimes are committed every day. One tool for modeling and dealing with crime is data mining. Crimes are costly to society in many ways, and they are also a major source of frustration for its members. A major area of machine learning research is crime detection. This paper analyzes crime prediction and classification using data mining techniques on a crime dataset spanning 2006 to 2016. The approach begins with cleaning the raw data and extracting features from it. Then, machine learning and deep learning models, including RNN-LSTM, ARIMA, and Linear Regression, are applied. The performance of these models is evaluated using metrics such as Root Mean Squared Error (RMSE) and Mean Absolute Percentage Error (MAPE). The RNN-LSTM model achieved the lowest RMSE of 18.42, demonstrating superior predictive accuracy among the evaluated models. Data visualization techniques further unveiled crime patterns, offering actionable insights for crime prevention.
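The two evaluation metrics the paper uses are standard and easy to state exactly. A minimal sketch of RMSE and MAPE on hypothetical monthly crime counts (the sample values are illustrative, not from the paper's dataset):

```python
import numpy as np

def rmse(actual, pred):
    # Root Mean Squared Error: sqrt of the mean squared residual.
    actual, pred = np.asarray(actual, float), np.asarray(pred, float)
    return float(np.sqrt(np.mean((actual - pred) ** 2)))

def mape(actual, pred):
    # Mean Absolute Percentage Error, in percent;
    # undefined where actual == 0, so counts here are all nonzero.
    actual, pred = np.asarray(actual, float), np.asarray(pred, float)
    return float(np.mean(np.abs((actual - pred) / actual)) * 100)

# Hypothetical monthly crime counts vs. a model's forecasts.
actual = [120, 135, 150, 160, 155]
pred = [118, 140, 145, 162, 150]
```

RMSE penalizes large misses quadratically and is in the units of the target (here, crime counts), which is why a single RMSE figure like the paper's 18.42 is directly interpretable; MAPE is scale-free, which helps when comparing models across districts with very different crime volumes.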