Abstract
In the constantly evolving world of cloud computing, appropriate resource allocation is essential both for keeping costs down and for ensuring the uninterrupted operation of applications and services. Because of its adaptability to specific tasks and to human behavior, machine learning (ML) is a natural fit for these needs. To improve the precision of resource allocation, this study investigates the use of Long Short-Term Memory (LSTM) networks. The LSTM model achieved 97% accuracy, 97.5% precision, 98% recall, and a 97.8% F1-score (the harmonic mean of precision and recall) on the experimental data. The confusion matrix demonstrates strong classification performance across several resource classes, while the accuracy and loss curves confirm stable learning with minimal overfitting. A comparative study shows that the proposed LSTM model outperforms conventional ML models such as Gradient Boosting (GB) and Logistic Regression (LR). These findings underscore the LSTM model's robustness and suitability for dynamic cloud environments, enabling more accurate forecasting and more efficient resource management.
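As a concrete illustration of the kind of pipeline the abstract describes, the sketch below trains a small LSTM classifier and reports the same four metrics plus a confusion matrix. This is a minimal sketch, not the authors' implementation: the layer sizes, the sequence shape (20 timesteps of 8 features), the number of resource classes, and the synthetic stand-in data are all illustrative assumptions.

```python
# Minimal sketch (not the paper's actual architecture) of an LSTM
# classifier for multi-class resource-allocation prediction, evaluated
# with the metrics reported in the abstract. All shapes, layer sizes,
# and hyperparameters are illustrative assumptions.
import numpy as np
import tensorflow as tf
from sklearn.metrics import (accuracy_score, precision_score,
                             recall_score, f1_score, confusion_matrix)

TIMESTEPS, FEATURES, CLASSES = 20, 8, 4  # assumed dimensions

# Synthetic stand-in data: sequences of resource-usage measurements.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, TIMESTEPS, FEATURES)).astype("float32")
y = rng.integers(0, CLASSES, size=1000)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(TIMESTEPS, FEATURES)),
    tf.keras.layers.LSTM(64),                              # sequence encoder
    tf.keras.layers.Dense(CLASSES, activation="softmax"),  # class probabilities
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=32, validation_split=0.2, verbose=0)

# Predict and compute the four reported metrics plus the confusion matrix.
pred = model.predict(X, verbose=0).argmax(axis=1)
print("accuracy :", accuracy_score(y, pred))
print("precision:", precision_score(y, pred, average="macro"))
print("recall   :", recall_score(y, pred, average="macro"))
print("F1-score :", f1_score(y, pred, average="macro"))
print(confusion_matrix(y, pred))
```

Macro averaging is one assumed convention for aggregating per-class precision, recall, and F1 in a multi-class setting; the abstract does not specify which averaging the reported figures use.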