Current Research in Public Health
Volume 1, Issue 1, 2016
Open Access, December 27, 2021, 8 pages

A Comparative Study for Recommended Triage Accuracy of AI-Based Triage System MayaMD with Indian HCPs

Current Research in Public Health 2021, 1(1), 198. DOI: 10.31586/jaibd.2021.198
Abstract
Artificial intelligence (AI) based triage and diagnostic systems are increasingly being used in healthcare. Although these online tools can improve patient care, their reliability and accuracy remain variable. We hypothesized that an AI-powered triage and diagnostic system (MayaMD) would compare favorably with human doctors with respect to triage and diagnostic accuracy. We performed a prospective validation study of the accuracy and safety of an AI-powered triage and diagnostic system. Identical cases were evaluated by the AI system and by individual Indian healthcare practitioners (HCPs) to compare accuracy and safety. The same cases were validated against the consensus of an expert panel of 3 doctors. The cases, in the form of clinical vignettes, were provided by an expert medical team. Overall, the study showed that the MayaMD AI-based platform for virtual triage was able to recommend the most appropriate triage while ensuring patient safety. In fact, with the panel consensus used as the reference standard, the accuracy of triage recommendations by MayaMD was significantly better than that of individual HCPs (91.67% vs. 74%, p=0.04).
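The significance claim attached to the two accuracy rates (74% and 91.67%) can be illustrated with a standard two-proportion z-test. The sketch below is hypothetical: the abstract does not state the number of vignettes or which test the authors used, so the counts here are invented purely for illustration.

```python
from math import sqrt, erf

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-sided two-proportion z-test (normal approximation)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value via the standard-normal CDF, Phi(x) = (1 + erf(x/sqrt(2))) / 2
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical counts: 36 vignettes, HCPs correct on 27 (75%), AI on 33 (~91.7%)
z, p = two_proportion_z(success_a=27, n_a=36, success_b=33, n_b=36)
```

With these invented counts the test produces a positive z (the AI arm scores higher) and a p-value near the conventional significance threshold; the paper's actual counts would be needed to reproduce p=0.04.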
Article
Open Access, October 19, 2021, 9 pages

A Lightweight Wayfinding Assistance System for IoT Applications

Current Research in Public Health 2021, 1(1), 147. DOI: 10.31586/jaibd.2021.147
Abstract
In this paper, we design an indoor sign detection system for Industry 4.0. To implement the proposed system, we develop a lightweight deep learning architecture based on MobileNet that can run on embedded devices to detect and recognize indoor landmark signs, assisting both blind and sighted people during indoor navigation. We apply various operations to minimize the network size as well as its computational complexity. The Internet of Things (IoT) connects the Internet with surrounding objects: it links physical objects to their digital identities and enables them to communicate with each other, creating a bridge between the physical and virtual worlds. The paper provides a comprehensive overview of a new method for detecting a set of indoor landmark sign objects based on a deep convolutional neural network (DCNN) for Internet of Things applications.
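MobileNet's efficiency comes largely from replacing standard convolutions with depthwise separable convolutions, which is one way to "minimize the network size" as the abstract puts it. A rough, self-contained illustration of the parameter reduction (the kernel and channel sizes below are arbitrary, not taken from the paper):

```python
def conv_params(k, c_in, c_out):
    """Parameters in a standard k x k convolution layer (biases omitted)."""
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    """Depthwise k x k filter per input channel + 1x1 pointwise projection."""
    return k * k * c_in + c_in * c_out

standard = conv_params(3, 128, 128)                   # 147,456 parameters
separable = depthwise_separable_params(3, 128, 128)   # 17,536 parameters
reduction = standard / separable                      # roughly 8x smaller
```

For a 3x3 layer with 128 input and output channels, the separable form needs about 8x fewer parameters, which is why such architectures fit on embedded devices.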
Article
Open Access, October 17, 2021, 8 pages

Understanding Traffic Signs by an Intelligent Advanced Driving Assistance System for Smart Vehicles

Current Research in Public Health 2021, 1(1), 148. DOI: 10.31586/jaibd.2021.148
Abstract
Recent technologies have made life smarter. Vehicles are vital components of daily life and are getting smarter for a safer environment. Advanced Driving Assistance Systems (ADAS) are widely used in today's vehicles. They have been a revolutionary approach to making roads safer by assisting the driver in difficult situations such as imminent collisions, or by helping the driver respect road rules. An ADAS is composed of a large number of sensors and processing units that provide the driver with a complete overview of the surrounding objects. In this paper, we introduce a road sign classifier for an ADAS to recognize and understand traffic signs. The classifier is based on a deep learning technique; in particular, it uses Convolutional Neural Networks (CNN). The proposed approach is composed of two stages. The first stage is a data preprocessing technique that filters and enhances the quality of the input images to reduce processing time and improve recognition accuracy. The second stage is a CNN model with a skip connection that passes semantic features to the top of the network, allowing for better recognition of traffic signs. Experiments demonstrate the performance of the CNN model for traffic sign classification, with a correct recognition rate of 99.75% on the German Traffic Sign Recognition Benchmark (GTSRB) dataset.
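The skip connection described in the abstract can be sketched in miniature: a residual block outputs f(x) + x, so input features reach later layers unchanged even when the learned transform contributes nothing. This toy example uses plain Python rather than a deep learning framework, with hypothetical sizes and weights; setting the weights to zero makes the identity path visible.

```python
def relu(v):
    return [max(0.0, x) for x in v]

def dense(v, weights, bias):
    """Plain fully connected layer: weights is a list of rows."""
    return [sum(w * x for w, x in zip(row, v)) + b
            for row, b in zip(weights, bias)]

def residual_block(v, weights, bias):
    """Skip connection: output is f(v) + v, so earlier (semantic)
    features pass unchanged to later layers."""
    transformed = relu(dense(v, weights, bias))
    return [t + x for t, x in zip(transformed, v)]

# Toy 3-feature input; zero weights make f(v) = 0, so the skip path
# alone carries the input through unchanged.
v = [0.5, -1.0, 2.0]
zero_w = [[0.0] * 3 for _ in range(3)]
out = residual_block(v, zero_w, [0.0, 0.0, 0.0])   # equals v
```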
Article
Open Access, September 04, 2021, 13 pages

Active Fault Tolerant Control of Faulty Uncertain Neutral Time-Delay Systems

Current Research in Public Health 2021, 1(1), 59. DOI: 10.31586/jaibd.2021.059
Abstract
The present paper investigates the problem of Fault Tolerant Control for a class of uncertain neutral time-delay systems. First, we consider an additive control law that adds a term to the nominal law when a fault occurs. This approach is designed in three steps. The first step is fault detection and the second is fault estimation; for these two steps, we use an adaptive observer to guarantee detection and estimation of the fault. The third step is fault compensation. The Lyapunov method and Linear Matrix Inequality (LMI) techniques are used to establish the main method. Second, we propose a Pseudo-Inverse Method (PIM) and determine the error between the closed-loop and the nominal system. Finally, simulation results are presented to validate the theoretical development on an example of an uncertain neutral time-delay system.
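The fault detection step can be illustrated with a drastically simplified scalar sketch: an observer tracks the plant, and the output residual stays at zero until an additive fault enters the plant. The system, gains, and fault size below are arbitrary choices for illustration; the paper's adaptive observer, time delays, and LMI design are not reproduced here.

```python
def simulate_residual(steps=40, fault_at=20, fault=0.8):
    """Scalar plant x+ = a*x + u + f with a Luenberger-style observer.
    The residual |y - y_hat| is exactly zero until the additive fault f
    appears, after which it grows and can be detected by thresholding."""
    a, gain, u = 0.9, 0.5, 0.1
    x, x_hat = 0.0, 0.0
    residuals = []
    for t in range(steps):
        y, y_hat = x, x_hat                # measured vs. predicted output
        residuals.append(abs(y - y_hat))   # residual used for detection
        f = fault if t >= fault_at else 0.0
        x, x_hat = (a * x + u + f,                        # true plant
                    a * x_hat + u + gain * (y - y_hat))   # observer
    return residuals

res = simulate_residual()
threshold = 0.1
detected = [t for t, r in enumerate(res) if r > threshold]
```

The residual crosses the threshold only after the fault occurs, which is the trigger for the estimation and compensation steps the abstract describes.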
Article
Open Access, July 23, 2021, 17 pages

Behavioral Economics and Energy Consumption: Behavioral Data Analysis of the Role of Attitudes and Beliefs on Household Electricity Consumption in Iran

Current Research in Public Health 2021, 1(1), 54. DOI: 10.31586/jaibd.2021.010101
Abstract
The average electricity consumption of Iranian households is higher than the world average. This can be due to price factors (such as cheap electricity in the country) and non-price factors (such as socio-demographic variables and psychological factors). In this study, non-price factors in the electricity consumption of urban households in Tehran were investigated, drawing on the theoretical foundations of behavioral economics and the theory of planned behavior. Information on household electricity consumption behavior was collected through a questionnaire and fieldwork from 2,560 Tehran households. Using econometric techniques, a linear regression was estimated whose dependent variable was electricity consumption (over 45 days in winter 2019) and whose independent variables included socio-demographic variables (age, sex, number of household members, income) and the variables of the theory of planned behavior (attitude, subjective norms, and perceived behavioral control). The results showed that income and the number of household members have a significant and positive effect on electricity consumption, while gender has no significant effect. Of the psychological variables, only perceived behavioral control has a significant effect on electricity consumption. These results show that consumers do not have a positive attitude toward saving, and that mental and social norms neither encourage them to reduce electricity consumption nor help control it. Finally, the study results were analyzed in terms of behavioral biases that may explain why attitudes and beliefs do not lead to action.
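The core estimation step, a linear regression of consumption on household covariates, can be sketched with ordinary least squares. The data below are invented to mimic the reported positive effect of household size; they are not the study's data, and the real model has several more covariates.

```python
def ols_slope_intercept(xs, ys):
    """Ordinary least squares fit for y = b0 + b1 * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b1 = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
          / sum((x - mx) ** 2 for x in xs))
    b0 = my - b1 * mx
    return b0, b1

# Synthetic illustration: household members vs. 45-day electricity use (kWh)
members = [1, 2, 2, 3, 4, 4, 5, 6]
kwh = [210, 260, 255, 310, 365, 350, 420, 470]
b0, b1 = ols_slope_intercept(members, kwh)   # b1 > 0: positive effect
```

A positive fitted slope (here roughly 50 kWh per additional member) is the shape of the "significant and positive effect" the study reports for household size.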
Article
Open Access, December 27, 2021, 9 pages

Leveraging AI and ML for Enhanced Efficiency and Innovation in Manufacturing: A Comparative Analysis

Current Research in Public Health 2021, 1(1), 943. DOI: 10.31586/jaibd.2021.943
Abstract
The manufacturing industry has embraced modern technologies such as big data, machine learning, and artificial intelligence. This paper examines AI and machine learning developments in the manufacturing industry, comparing current practices and data-driven projects, with the aim of better understanding these technologies and their potential benefits and challenges. The research identifies opportunities for innovative business solutions and explores industry practices and research results. The paper focuses on implementation rather than technical aspects, aiming to enhance knowledge in this area.
Review Article
Open Access, December 27, 2020, 10 pages

Exploring AI Algorithms for Cancer Classification and Prediction Using Electronic Health Records

Current Research in Public Health 2021, 1(1), 1109. DOI: 10.31586/jaibd.2020.1109
Abstract
Uncontrolled cell division leads to cancer, an incurable condition. Early diagnosis has the potential to lower death rates from breast cancer, the most frequent cancer in women worldwide. Imaging studies of the breast may help doctors find and diagnose the disease. This study explores the effectiveness of deep learning (DL) and machine learning (ML) models in the classification of mammography images for breast cancer detection, utilizing the publicly available CBIS-DDSM dataset, which comprises 5,000 images evenly divided between benign and malignant cases. To improve diagnostic accuracy, models such as Gaussian Naïve Bayes (GNB), convolutional neural networks (CNNs), k-nearest neighbors (KNN), and MobileNetV2 were assessed using performance measures including F1-score, recall, accuracy, and precision. The methodology involved data preprocessing, transfer learning, and feature extraction, followed by data splitting for robust model training and evaluation. Findings indicate that MobileNetV2 achieved the highest accuracy (99.4%), significantly outperforming GNB (87.2%), CNN (96.7%), and KNN (91.2%). The investigation, which also used confusion matrices and ROC curves to evaluate model performance, demonstrated the outstanding capacity of MobileNetV2 to discriminate between benign and malignant cases.
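The evaluation measures named above (precision, recall, F1-score, accuracy) all follow directly from a confusion matrix. A small sketch with hypothetical counts, not the study's results:

```python
def precision_recall_f1(tp, fp, fn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    precision = tp / (tp + fp)          # of predicted malignant, how many were
    recall = tp / (tp + fn)             # of true malignant, how many were found
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Hypothetical confusion counts for a balanced 1,000-image test split
tp, fp, fn, tn = 492, 6, 4, 498
precision, recall, f1 = precision_recall_f1(tp, fp, fn)
accuracy = (tp + tn) / (tp + fp + fn + tn)   # 0.99
```

For screening tasks, recall matters most: a false negative (fn) is a missed cancer, which is why the abstract reports recall alongside overall accuracy.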
Review Article
Open Access, December 27, 2020, 11 pages

An Effective E-Commerce Sales Prediction & Management System Based on Machine Learning Methods

Current Research in Public Health 2021, 1(1), 1110. DOI: 10.31586/jaibd.2020.1110
Abstract
Due to the influence of the Internet, the e-commerce sector has developed rapidly. Most online retailing businesses are seeking ways to predict demand for their products. Sales forecasting can help retailers develop a sales strategy that enhances sales and attracts more revenue and investment. The current research work puts forward a machine learning framework to forecast e-commerce sales for strategic management, using a dataset of e-commerce transactions. With 70 percent of the data used for training and 30 percent for testing, three models were produced: Random Forest, Decision Tree, and XGBoost. To evaluate the models, performance measures including R-squared (R²) and Root Mean Squared Error (RMSE) were employed. The XGBoost model was the most accurate in predicting e-commerce sales, with an R² score of 96.3%. This demonstrates the superior capability of the XGBoost algorithm to forecast e-commerce monthly sales compared with the other models, and it can assist decision makers in managing inventory and making quick, informed decisions in this rapidly growing e-commerce market. The findings reiterate the importance of using advanced analytics to drive effectiveness and customer experience within the e-commerce sector.
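The two evaluation metrics the study relies on, R² and RMSE, are straightforward to compute. The sketch below uses an invented monthly sales series and invented model predictions (not XGBoost output) purely to show the 70/30 split and the metric formulas:

```python
from math import sqrt

def r_squared(actual, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(actual) / len(actual)
    ss_res = sum((a - p) ** 2 for a, p in zip(actual, predicted))
    ss_tot = sum((a - mean) ** 2 for a in actual)
    return 1 - ss_res / ss_tot

def rmse(actual, predicted):
    """Root mean squared error."""
    return sqrt(sum((a - p) ** 2
                    for a, p in zip(actual, predicted)) / len(actual))

# Invented monthly sales; first 70% for training, last 30% held out
sales = [120, 135, 150, 160, 170, 185, 200, 210, 230, 250]
split = int(len(sales) * 0.7)
train, holdout = sales[:split], sales[split:]
# Hypothetical model predictions for the held-out months
predicted = [208, 228, 252]
score = r_squared(holdout, predicted)
error = rmse(holdout, predicted)
```

With these invented numbers the score lands near the high-90s range reported in the abstract; R² close to 1 and a small RMSE together indicate predictions tracking actual sales closely.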
Review Article
Open Access, December 27, 2021, 14 pages

Leveraging AI in Urban Traffic Management: Addressing Congestion and Traffic Flow with Intelligent Systems

Current Research in Public Health 2021, 1(1), 1151. DOI: 10.31586/jaibd.2021.1151
Abstract
Traffic congestion across the globe is a multimodal problem, intertwining vehicular, pedestrian, and bicycle traffic. The relationship between the multimodal traffic flows is a key factor in understanding urban traffic dynamics. The impact of excessive congestion extends to the excessive cost of traffic maintenance, as well as the inherent transportation inefficiency and delayed travel times. From an urban transportation standpoint, an immediate consideration is, on one hand, monitoring traffic conditions and demand cycles and, on the other hand, inducing flow modifications that benefit the traffic network and mitigate congestion. The embedded and centralized control systems that characterize modern traffic management extract traffic conditions specific to their regions but lack communication between networks. Moreover, innovative methods are required to provide more accurate, up-to-date traffic forecasts that characterize real-world traffic dynamics and facilitate optimal traffic management decisions. In this chapter, we briefly outline the main difficulties and complexities in modeling, managing, and forecasting traffic dynamics. We also compare various conventional and modern intelligent transportation strategies in terms of accuracy, applicability, and performance, and identify potential opportunities for optimizing multimodal traffic flow and reducing congestion. This chapter introduces various proposed data-driven models and tools employed for traffic flow prediction and management, investigating specific strategies' strengths, weaknesses, and benefits in addressing various real-world traffic management problems. We describe how the design phase of dependable Intelligent Transportation Systems imposes unique requirements on the robustness, safety, and response times of their components and the encompassing system model.
Furthermore, this architectural blueprint shares similarities with distributed coordinated search and collective adaptive systems. Models that are independent of town size can deliver systemic performance improvements through reconfigurable embedded functionality, and the AI techniques involved feature anytime planners that maintain near-optimal, unbiased behavior as model complexity varies. Sustainable models minimize congestion during peaks, flooding, and emergencies while adhering to area-specific regulations. Security-aware and fail-safe traffic management systems must provide reasonable assurances of persistent operation under varied environmental settings, including metropolitan areas and complex traffic junctions. The chapter concludes by outlining challenges, research questions, and future research paths in the field of transportation management.
Review Article
Open Access, December 27, 2021, 11 pages

Sustainability in Construction: Exploring the Development of Eco-Friendly Equipment

Current Research in Public Health 2021, 1(1), 1153. DOI: 10.31586/jaibd.2021.1153
Abstract
The equipment used in the construction industry is usually associated with a high environmental impact. Although sustainable design has proven to be a main player among initiatives focused on reducing environmental impact, it has been driven by workers and processes, leaving equipment efforts to more restrictive and later stages. The equipment industry has been a constant target of environmental standards and economic pressure, but increasing technological development allows it to respond to sustainability and safety expectations while enhancing its performance. However, several limitations still lead this sector to be one of the last to reach advanced levels of development. A study identified gaps in equipment design that require greater effort to effectively support workers and companies in moving toward sustainable construction. This chapter is based on a study aiming to understand the consolidated knowledge of technologically sustainable equipment design and to identify the challenges remaining for its full development. The findings support the development of innovative eco-friendly equipment, taking into consideration sustainable materials and product guidelines, as well as green economy initiatives. They also support complex-system approaches and safety-by-design specificities to establish corporate knowledge of sustainable equipment and align it with the new regulations of the construction industry. The chapter introduces the context of construction equipment and the new challenges it faces in providing construction work with greater capacity for safety, from an environmental and energy-efficiency perspective, within the paradigm of sustainability. It then presents the concept of sustainable equipment and its principles, followed by a characterization of the agents involved in its life cycle.
Review Article
Open Access, December 27, 2021, 15 pages

Financial Implications of Predictive Analytics in Vehicle Manufacturing: Insights for Budget Optimization and Resource Allocation

Current Research in Public Health 2021, 1(1), 1154. DOI: 10.31586/jaibd.2021.1154
Abstract
Factory owners and vehicle manufacturers increasingly opt for predictive analytics to inform their decisions. While predictive analytics has been proven to provide insights that allow maintenance measures to be initiated before a machine actually fails, the choice of models and features can have a significant impact on the budget spent and the resources allocated. This means that financially oriented questions need to at least partially guide decisions in the planning phase of data science projects. Data-driven approaches will play an increasingly important role, yet only a few firms have confidently applied logistic regression models to predictive maintenance. Moreover, to the best of available knowledge, data-driven classification models connecting vehicle component failures to delays at the assembly line have not been published. This paper applies a real-world, data-driven approach using classification models in predictive analytics for vehicle manufacturers, thereby linking the financial implications of such data science projects to their results. We expand the existing literature on predictive maintenance using a unique dataset from a newly launched series of vehicles, presented as-is. Our research context is of interest to researchers and practitioners in the automotive industry who manage and plan final vehicle assembly under just-in-time principles, factoring in the consequences of component failures on the assembly process. Key findings highlight that while minor tweaking of the models is possible, their potential input into decision-making processes for budget optimization is limited.
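A minimal sketch of the kind of classification model discussed above: a logistic model scoring the probability that a component failure causes an assembly-line delay. The feature names, weights, and bias are entirely hypothetical; a real model would be fitted to the manufacturer's failure and delay records.

```python
from math import exp

def delay_probability(weights, bias, features):
    """Logistic model: P(assembly delay | component-failure features)."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + exp(-z))

# Hypothetical features: [failure severity (0-1), failures this shift,
# hours since last maintenance / 100]; weights are illustrative only.
weights = [2.0, 0.8, 1.5]
bias = -3.0
p_low = delay_probability(weights, bias, [0.1, 0, 0.2])    # benign case
p_high = delay_probability(weights, bias, [0.9, 3, 1.2])   # risky case
```

Ranking failures by this probability is what would let planners allocate maintenance budget to the components most likely to stall a just-in-time line.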
Review Article
Open Access, December 27, 2020, 13 pages

Enhancing Pharmaceutical Supply Chain Efficiency with Deep Learning-Driven Insights

Current Research in Public Health 2021, 1(1), 1186. DOI: 10.31586/jaibd.2020.1186
Abstract
The growing complexity of the operating environment urges pharmaceutical innovation. This essay addresses the need to integrate advanced technologies into the pharmaceutical supply chain. It justifies the value proposition and presents a concrete use case for integrating deep learning insights to make data-driven decisions. The supply chain has always been a priority for the pharmaceutical industry, and companies are increasingly investing in big data strategies, with adoption of big data tools expected to keep growing. The work presented here has a preliminary, explorative character, aiming to recover and integrate evidence from partly overlooked practical experience and know-how. The essay's practical relevance is directed toward practitioners in pharmaceutical production, supply chain management, logistics, and regulatory agencies. The literature has shown a long-term concern for enhanced performance in the pharmaceutical supply chain network. This essay demonstrates the application of deep learning-driven insights to reveal non-evident flow dependencies, with the main aim of presenting a comprehensive view of deep learning-driven decision support. The supply chain is portrayed holistically, seeking end-to-end visibility. Implications for public policy are discussed, such as data equity: many countries are protecting their populations and economic growth by building resilience and efficiency to ensure the capacity to move goods across supply chains. The implementation strategy is also covered. The combined reduction of variability and system noise (dampened through the inclusion of all stakeholders), together with improved efficiency and reliability (of stochastic flows, understood through deep learning and data), results in more responsive supply chains for pharmaceutical products.
Future work involves the integration of external data, closing the loop between planning and its application in reality.
Review Article
Open Access, December 27, 2021, 12 pages

Predictive Analytics and Deep Learning for Logistics Optimization in Supply Chain Management

Current Research in Public Health 2021, 1(1), 1187. DOI: 10.31586/jaibd.2021.1187
Abstract
Managing supply chains efficiently has become a major concern for organizations, and logistics is one of the most important factors to optimize in supply chain management. The advent of technology and the increase in data availability allow the efficiency of logistics in a supply chain to be enhanced. This discussion focuses on blending analytics with innovation in logistics to improve supply chain operations. An approach is presented showing how predictive analytics can be used to improve logistics operations. To analyze big data in logistics effectively, an artificial intelligence technique, specifically deep learning, is employed. Two case studies illustrate the practical employability of the proposed technique. The discussion reveals the power and potential of using predictive analytics in logistics to project future KPI values from contemporary operational data; sheds light on the innovative technique of deep learning-based predictive analytics in logistics; and suggests incorporating techniques like deep learning with predictive analytics to develop accurate forecasting for logistics, optimize operations, and prevent disruption in the supply chain. The network of supply chains has become more complex, necessitating the latest technological advancements. The sectors that have received the most attention for applying technology to optimize operations are manufacturing, healthcare, aerospace, and the automotive industry. Comparatively little attention has been given to the logistics sector, although many authors describe how analytics and artificial intelligence can be used in logistics to achieve higher optimization, and significant research has been done on optimizing logistics operations.
Nevertheless, with the explosive volume of historical data produced by an organization's logistics operations, there is a great opportunity to learn valuable insights from data accumulated over time for long-term strategic planning. To develop an organization's logistics operations, historical data is essential for understanding operational trends. For example, regular maintenance planning and trend-based resource allocation are long-term activities that will not affect logistics operations immediately but can affect the business's strategic planning in the long run. A predictive analysis technique applied to historical logistics data can draw conclusions about future trends in logistics operations, and can thus be used to prevent disruption of the supply chain.
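One simple way to turn historical KPI data into a forward-looking signal, as the discussion above suggests, is exponential smoothing. The KPI series, smoothing factor, and action threshold below are invented for illustration; the chapter's deep learning models would replace this baseline forecaster.

```python
def exponential_smoothing(series, alpha=0.6):
    """Single exponential smoothing; the final level is the
    one-step-ahead forecast for the next period."""
    level = series[0]
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

# Invented weekly on-time-delivery KPI (%) drifting downward
kpi = [97.0, 96.5, 96.8, 95.9, 95.5, 95.0, 94.2]
forecast = exponential_smoothing(kpi)
needs_action = forecast < 95.0   # downward trend triggers a planning review
```

Here the forecast dips below a (hypothetical) 95% service-level target, the kind of early warning that lets planners intervene before the supply chain is disrupted.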
Review Article
Open Access, December 29, 2019, 12 pages

Explainable Analytics in Multi-Cloud Environments: A Framework for Transparent Decision-Making

Current Research in Public Health 2021, 1(1), 1228. DOI: 10.31586/jaibd.2019.1228
Abstract
The multitude of services and resources available in multi-cloud environments has increased the importance of analytics applications in cloud brokering. These applications can orchestrate services and resources that reside in different domains and require inputs that a single cloud provider could not easily acquire. Yet, despite their distinct characteristics, multi-cloud analytics users have no voice in the ranking of services in brokerage marketplaces. In this chapter, we introduce the concept of explainable analytics and propose its implementation to increase transparency and user satisfaction in multi-cloud environments. The criteria that we identify, measure, and summarize in explainable results allow cloud users to understand the ranking rules, a crucial requirement for trustworthy decision-making. Our proposal accounts for a set of regulations for intelligent systems and targets their specific adaptation and use in multi-cloud environments.
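A minimal sketch of explainable ranking as described above: each service's score is a weighted sum over criteria, and the per-criterion contributions are kept alongside the total so the user can see why a service ranked where it did. The criteria, weights, and scores below are hypothetical, not from the chapter.

```python
def explainable_rank(services, weights):
    """Rank services by weighted score, retaining each criterion's
    contribution so the ranking can be explained to the user."""
    ranked = []
    for name, scores in services.items():
        contributions = {c: weights[c] * scores[c] for c in weights}
        ranked.append((name, sum(contributions.values()), contributions))
    return sorted(ranked, key=lambda item: item[1], reverse=True)

# Hypothetical criteria weights and normalized scores (0-1)
weights = {"price": 0.4, "latency": 0.35, "compliance": 0.25}
services = {
    "cloud_a": {"price": 0.9, "latency": 0.6, "compliance": 0.8},
    "cloud_b": {"price": 0.5, "latency": 0.95, "compliance": 0.9},
}
ranking = explainable_rank(services, weights)
best, best_score, why = ranking[0]   # 'why' holds per-criterion contributions
```

The `why` breakdown is the explainability payload: the user can see, for instance, that price dominated the winner's score, rather than receiving an opaque rank.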
Review Article
Open Access, December 29, 2020, 17 pages

Deep Learning Architectures for Enhancing Cyber Security Protocols in Big Data Integrated ERP Systems

Current Research in Public Health 2021, 1(1), 1238. DOI: 10.31586/jaibd.2020.1238
Abstract
Deep learning approaches are very useful for enhancing cybersecurity protocols in industry-integrated big data enterprise resource planning (ERP) systems. This research study develops deep learning architectures, namely the variational autoencoder, sparse autoencoder, and deep belief network, for detecting anomalies and fraud and for preventing cybersecurity attacks. These cybersecurity issues occur in finance, human resources, supply chain, and marketing within big data-integrated or cloud-based ERP systems. The main objectives of this research are to identify the vulnerabilities in various ERP systems, databases, and the interconnected domains; to introduce a conceptual cybersecurity network model that incorporates variational autoencoders, sparse autoencoders, and deep belief networks; to evaluate the performance of the proposed cybersecurity model by employing the appropriate parameters with real-time and synthetic databases and simulated scenarios; and to validate the model's performance by comparing it with traditional algorithms. A big data platform with an integrated business management system is known as an integrated ERP system, and it plays an instrumental role in conducting business for various organizations. In recent times, as uncertainty and disparity increase, the cyber ecosystem has become more complex, volatile, dynamic, and unpredictable. In particular, the number of cyber-attacks is increasing at an alarming rate; the resultant security breaches have a disruptive effect on businesses around the world, with losses of billions of dollars. To combat these threats, it is essential to develop a conceptual cybersecurity network model that secures systems by functioning as a mutually supporting and strengthening network rather than working in isolation.
In this dynamic and fluid environment, introducing a deep learning approach helps to detect and prevent fraud and other illicit activities related to human resources, the supply chain, and other domains. Cybersecurity vulnerabilities include, for example, database vulnerabilities, service-level vulnerabilities, and system vulnerabilities. The proposed methodology focuses only on database vulnerabilities; detecting and mitigating new potential vulnerabilities in other dependent domains is left as a future initiative.
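The autoencoder idea described above can be sketched with a linear stand-in: encode records with the top principal components, decode them back, and flag records whose reconstruction error exceeds a threshold. This is a simplified illustration on synthetic data, with a linear encoder in place of the study's variational and sparse autoencoders and an assumed 99th-percentile threshold, not the authors' model.

```python
import numpy as np

# Illustrative sketch only: a linear "autoencoder" (PCA-style projection) that
# flags anomalous ERP-like transaction records by reconstruction error. The
# synthetic data and the 99th-percentile threshold are assumptions.

rng = np.random.default_rng(0)

# Synthetic "normal" transactions: 4 features, the first two strongly correlated.
normal = rng.normal(0, 1, size=(500, 4))
normal[:, 1] = 0.9 * normal[:, 0] + 0.1 * normal[:, 1]

# Fit a rank-2 linear encoder from the normal data (top principal components).
mean = normal.mean(axis=0)
_, _, vt = np.linalg.svd(normal - mean, full_matrices=False)
encoder = vt[:2]                       # encode: project onto 2 components

def reconstruction_error(x):
    """Squared error after encoding and decoding a batch of records."""
    z = (x - mean) @ encoder.T         # encode
    x_hat = z @ encoder + mean         # decode
    return ((x - x_hat) ** 2).sum(axis=1)

# Threshold at the 99th percentile of errors on normal data (an assumption).
threshold = np.percentile(reconstruction_error(normal), 99)

# A record that violates the learned correlation should exceed the threshold.
anomaly = np.array([[3.0, -3.0, 0.0, 0.0]])
print(reconstruction_error(anomaly)[0] > threshold)
```

A deep autoencoder replaces the linear projection with nonlinear encoder/decoder networks, but the detection logic, thresholding on reconstruction error, is the same.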
Review Article
Open Access December 27, 2019 12 pages 307 views 28 downloads

The Role of Neural Networks in Advancing Wearable Healthcare Technology Analytics

Current Research in Public Health 2019, 1(1), 1251. DOI: 10.31586/jaibd.2019.1251
Abstract
Neural networks are bringing a transformation to wearable healthcare technology analytics. These networks are able to analyze vast amounts of data to help in making decisions concerning patient care. Advancements in deep learning have brought neural networks to the forefront, making data analytics a far more straightforward process. This study helps unveil the use of ICT and AI in medical healthcare technology, surveying the offerings of several industry giants. Wearable healthcare technologies are becoming more popular every day. These technologies facilitate collecting, monitoring, and sharing every vital aspect of the human body necessary for diagnosing and treating an ailment. With the advent of global digitization, health data storage and systematic analysis are taking shape to ensure better diagnostic, preventive, and predictive healthcare. Healthcare analytics powered by neural networks can significantly improve health outcomes, maximizing individuals' potential and quality of life. The breadth and possibilities of connected devices are growing wider. From personal activity monitoring to quantifying every bit of health statistics, connected devices are making an impact on measurement, management, and manipulation. In healthcare, early diagnosis can be a lifesaver, and data analytics can help in a big way by making predictions that save lives. We are in another phase of the digitization era: "Neural Network and Wearable Healthcare Technology Analytics." A neural network can be conceived of as an adaptive system made up of a large number of neurons connected in multiple layers, and it processes data in a way similar to the human brain. Using a collection of algorithms, a neural network is typically composed of an 'input' layer and an 'output' layer, with one or more hidden layers in between.
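The layered structure described above can be made concrete with a tiny feed-forward network: an input layer, one hidden layer, and an output layer. The layer sizes, random weights, and example sensor reading are illustrative assumptions, not a model from the article.

```python
import numpy as np

# Minimal sketch of a layered neural network: 3 inputs (e.g. heart rate,
# step count, skin temperature, all hypothetical) -> 4 hidden units -> 1 output.
# Weights are random; a real wearable-analytics model would be trained.

rng = np.random.default_rng(42)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

w_hidden = rng.normal(size=(3, 4))   # input layer -> hidden layer
b_hidden = np.zeros(4)
w_out = rng.normal(size=(4, 1))      # hidden layer -> output layer
b_out = np.zeros(1)

def forward(x):
    """Propagate a batch of normalized readings through the layers."""
    hidden = sigmoid(x @ w_hidden + b_hidden)   # hidden-layer activations
    return sigmoid(hidden @ w_out + b_out)      # output score in (0, 1)

readings = np.array([[0.7, 0.2, 0.5]])          # one normalized sensor sample
score = forward(readings)
print(score.shape)   # (1, 1): a single score per sample
```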
Review Article
Open Access December 27, 2019 14 pages 437 views 42 downloads

Predictive Analytics in Biologics: Improving Production Outcomes Using Big Data

Current Research in Public Health 2019, 1(1), 1256. DOI: 10.31586/jaibd.2019.1256
Abstract
Biopharmaceuticals, or biologics, are a burgeoning sector of the pharmaceutical industry, predicted to reach $239.4 billion by 2025. This unparalleled growth is often attributed to the enhanced specificity offered by large molecules over small molecules. The large size of the constituent proteins necessitates the continuous use of big data predictive analytics to elucidate the most effective candidates in the lead optimization process. The same methodologies can also be applied to the augmentation and optimization of the downstream production processes that comprise the majority of the development cost of any biologic, and with the advent of machine learning and automated predictive analytics this is becoming an increasingly facile task. In this work, big data from cell line generation, product and process design, and large-scale lead validation studies have been used to compare the applicability of simple statistical models against black-box approaches for the rapid acceleration of enzymes to the pilot plant stage. This research can be expanded to exploit the big datasets generated as biologics progress through the development pipeline, further optimizing production outcomes. Over the coming months, data from the project will be used to probe which approaches are suited to which processes and, as a result, which are more amenable to various economic simulations. The computed optimization objective for a hit must include the cost of acquiring, storing, and analyzing data to construct these predictive models, alongside the expected commercial reward of choosing an optimally ranked candidate. In this vein, perspective must be taken on the probable future price, capability outputs, and ownership issues of increasingly sophisticated data analysis software as such superstructures become more frequent.
It is frequently stated that decisions made to reduce production costs are data-driven, but this is rarely because the economic or energetic costs of experiments and production methods are actually measured; to truly evaluate production steps, dynamic energy and economic models need to become more commonplace. Converting process quality approaches from large questionnaires, risk analysis, and expert-opinion-driven methods to statistical, and thus more reliable, approaches is an area of future research for the analytics used herein.
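The "simple statistical model" baseline mentioned above can be as plain as ordinary least-squares regression on one process variable. The variable names, synthetic data, and linear response below are assumptions for illustration; the article's comparison would use real cell-line and process data.

```python
import numpy as np

# Sketch of a simple statistical baseline: least-squares regression predicting
# a production outcome ("titer", hypothetical) from one process variable
# ("temperature", hypothetical). Data are synthetic.

rng = np.random.default_rng(1)

temperature = rng.uniform(30, 40, size=50)              # process variable
titer = 2.0 * temperature + rng.normal(0, 1, size=50)   # noisy linear response

# Fit y = a*x + b by least squares; polyfit returns coefficients high-to-low.
a, b = np.polyfit(temperature, titer, deg=1)

def predict(x):
    return a * x + b

print(a)   # the fitted slope should land near the underlying value of 2.0
```

A black-box comparison would swap `predict` for a trained model (gradient boosting, a neural network, etc.) while keeping the same train/validate split and cost-aware objective.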
Review Article
Open Access December 21, 2016 10 pages 259 views 31 downloads

Advanced Natural Language Processing (NLP) Techniques for Text-Data Based Sentiment Analysis on Social Media

Current Research in Public Health 2021, 1(1), 1293. DOI: 10.31586/jaibd.2016.1293
Abstract
The field of sentiment analysis is a crucial aspect of natural language processing (NLP) and is essential for discovering the emotional undertones within text data and, hence, capturing public sentiment on a variety of issues. In this regard, this study proposes a deep learning technique for sentiment categorization on a Twitter dataset based on Long Short-Term Memory (LSTM) networks. Preprocessing is done comprehensively, feature extraction is done through a bag-of-words method, and the data is split 80-20 into training and testing sets. The experimental findings demonstrate that the LSTM model outperforms conventional models such as SVM and Naïve Bayes, with an F1-score of 99.46%, accuracy of 99.13%, precision of 99.45%, and recall of 99.25%. Additionally, AUC-ROC and PR curves validate the model's effectiveness. Although it performs well, the model consumes heavy computational resources and requires a longer training time. In summary, the results show that deep learning performs well in sentiment analysis and can be applied to social media monitoring, customer feedback evaluation, market sentiment analysis, and similar tasks.
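The evaluation metrics reported above (accuracy, precision, recall, F1) can be computed directly from raw predictions. The toy labels below are assumptions; the study computes these scores for an LSTM on a Twitter dataset.

```python
# Plain-Python computation of the four classification metrics from predictions.

def classification_metrics(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

# 1 = positive sentiment, 0 = negative sentiment (illustrative labels).
y_true = [1, 1, 1, 0, 0, 0, 1, 0]
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]
acc, prec, rec, f1 = classification_metrics(y_true, y_pred)
print(acc, prec, rec, f1)   # 0.75 0.75 0.75 0.75
```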
Review Article
Open Access December 27, 2021 11 pages 239 views 27 downloads

An Analysis of Crime Prediction and Classification Using Data Mining Techniques

Current Research in Public Health 2021, 1(1), 1334. DOI: 10.31586/jaibd.2021.1334
Abstract
Crime is a serious and widespread problem in society, and preventing it is an essential task. A significant number of crimes are committed every day. One tool for modeling and dealing with crime is data mining. Crimes are costly to society in many ways, and they are also a major source of frustration for its members. Crime detection is a major area of machine learning research. This paper analyzes crime prediction and classification using data mining techniques on a crime dataset spanning 2006 to 2016. The approach begins with cleaning and extracting features from raw data during data preparation. Then, machine learning and deep learning models, including RNN-LSTM, ARIMA, and linear regression, are applied. The performance of these models is evaluated using metrics such as Root Mean Squared Error (RMSE) and Mean Absolute Percentage Error (MAPE). The RNN-LSTM model achieved the lowest RMSE of 18.42, demonstrating superior predictive accuracy among the evaluated models. Data visualization techniques further unveiled crime patterns, offering actionable insights for crime prevention.
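The two error metrics used above are straightforward to state in code. The sample counts below are illustrative values, not figures from the study's crime dataset.

```python
import math

# RMSE and MAPE, the forecast-error metrics used to compare the models.

def rmse(actual, predicted):
    """Root Mean Squared Error."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def mape(actual, predicted):
    """Mean Absolute Percentage Error; assumes no zero actual values."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# Hypothetical monthly crime counts vs. a model's forecasts.
actual = [100.0, 120.0, 90.0, 110.0]
predicted = [105.0, 115.0, 95.0, 100.0]
print(round(rmse(actual, predicted), 2), round(mape(actual, predicted), 2))   # 6.61 5.95
```

A lower RMSE, as reported for the RNN-LSTM model, indicates forecasts that deviate less from the observed counts on average.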
Article
Open Access December 29, 2020 24 pages 80 views 12 downloads

Enhancing Government Fiscal Impact Analysis with Integrated Big Data and Cloud-Based Analytics Platforms

Current Research in Public Health 2021, 1(1), 1339. DOI: 10.31586/jaibd.2020.1339
Abstract
While several application domains are exploiting the added value of analytics over various datasets to obtain actionable insights and drive decision making, the public policy management domain has not yet taken advantage of the full potential of such analytics and data models. To this end, in this paper the authors present an overall architecture of a cloud-based environment that facilitates data retrieval and analytics, as well as policy modelling, creation, and optimization. The environment enables data collection from heterogeneous sources, along with linking and aggregation, complemented by data cleaning and interoperability techniques. An innovative approach for analytics as a service is introduced and linked with a policy development toolkit, an integrated web-based environment that fulfils the requirements of public policy ecosystem stakeholders [1]. Large information databases on various public issues exist, but their use for public policy formulation and impact analysis has been limited so far, as no cloud-based service ecosystem exists to facilitate their efficient exploitation. With the increasing availability and importance of both big and traditional public data, the need to extract, link, and utilize such information efficiently has arisen. Current data-driven web technologies and models are not aligned with the needs of this domain, and therefore potential candidates for big data, cloud-based, and service-oriented public policy analysis solutions should be investigated, piloted, and demonstrated [2]. This paper presents the conceptual architecture of such an ecosystem based on the capabilities of state-of-the-art cloud and web technologies, as well as the requirements of its users.
Review Article
Open Access December 21, 2021 19 pages 29 views 19 downloads

Optimizing Data Warehousing for Large Scale Policy Management Using Advanced ETL Frameworks

Current Research in Public Health 2021, 1(1), 1350. DOI: 10.31586/jaibd.2021.1350
Abstract
Data warehousing is a technique for collecting, managing, and presenting data to help people analyze and use that data effectively. It involves a large database designed to support management-level staff by providing all the relevant historical data for analysis. This article begins with a definition of data warehousing, followed by an overview of large-scale policy management to highlight the need for data warehousing. Next, an overview of an ETL framework is presented, along with a discussion of advanced ETL techniques. The article concludes with an outline of performance optimization techniques for data warehousing. Data warehousing is considered a key enabler of efficient reporting and analysis, with implementation choices ranging from cost-effective desktop systems to large-scale, mission-critical data marts and warehouses containing petabytes of data. Extract, transform, and load (ETL) systems remain one of the largest cost and effort areas within data warehouse development projects, requiring significant planning and resources to build, manage, and monitor the flow of data from source systems into the data warehouse. The technology and techniques used for ETL can greatly influence the success or failure of a data warehouse. Complex business requirements for data cleansing, loading, transformation, and integration have intensified, while operational plans for real-time and near-real-time reporting add further challenges. Parallel loading mechanisms, incremental data loading, and runtime update-and-insert strategies not only improve ETL performance but also optimize data warehousing performance, particularly for large-scale policy management.
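The incremental-load strategy mentioned above can be sketched in a few lines: extract only rows changed since the last high-water mark, then upsert (update-or-insert) them into the warehouse table. The table, field names, and timestamps below are hypothetical illustrations, not a schema from this review.

```python
# Minimal incremental ETL sketch: high-water-mark extraction plus upsert.
# "policy_id", "premium", and "updated_at" are hypothetical fields.

source_rows = [
    {"policy_id": 1, "premium": 500, "updated_at": 10},
    {"policy_id": 2, "premium": 750, "updated_at": 25},
    {"policy_id": 3, "premium": 300, "updated_at": 30},
]
warehouse = {1: {"policy_id": 1, "premium": 480}}   # existing fact row
high_water_mark = 20                                # time of last successful load

# Extract: only rows modified after the high-water mark.
delta = [r for r in source_rows if r["updated_at"] > high_water_mark]

# Transform + Load: upsert each delta row into the warehouse table.
for row in delta:
    clean = {"policy_id": row["policy_id"], "premium": row["premium"]}
    warehouse[clean["policy_id"]] = clean           # update or insert

# Advance the high-water mark so the next run skips these rows.
high_water_mark = max(r["updated_at"] for r in delta)
print(sorted(warehouse), high_water_mark)   # [1, 2, 3] 30
```

Processing only the delta, rather than re-extracting the full source, is what keeps load windows short as policy tables grow.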
Article
Open Access December 22, 2020 17 pages 64 views 42 downloads

Cloud Migration Strategies for High-Volume Financial Messaging Systems

Current Research in Public Health 2021, 1(1), 1353. DOI: 10.31586/jaibd.2020.1353
Abstract
Key business objectives for digital infrastructure cloud adoption are often framed in terms of reducing cost, improving fault tolerance and resilience, simplifying scale, and enabling innovation. Given the critical nature of the financial sector, however, where timeliness and price can significantly determine an outcome, cloud migration in delivery environments demands greater throughput on the critical path and, in many enterprise-scale settings, forgoes hybrid complexity and multi-cloud risk. Nevertheless, slack does exist in system designs; financial institutions enable market functionality, such as trading, clearing, and best execution, despite potentially being able to meet such requirements with lower service levels than other verticals. A cloud multi-account structure for sensitive data, for example, naturally limits exposure when combined with observed risk. Fulfilling predictions of elasticity during periods of high demand usually requires support from a dedicated environment (or environments) located nearer to the operations. Components can consequently be allocated on a per-account basis or maintained as shared sink systems to which the dedicated streams write. Automation code can similarly be targeted at dedicated accounts, avoiding the resource constraints that beset such operations during industry events like emergency triage/contact desking.
Review Article
Open Access July 20, 2021 13 pages 9 views 2 downloads

Quality of Experience (QoE) and Network Performance Modelling for Multimedia Traffic

Current Research in Public Health 2021, 1(1), 1358. DOI: 10.31586/jaibd.2021.1358
Abstract
This research explores the complex relationship between user-perceived Quality of Experience (QoE) and underlying network performance for multimedia traffic. As video streaming, online gaming, and interactive media dominate modern networks, ensuring consistent QoE has become a key challenge. The study develops a network performance model that integrates objective Quality of Service (QoS) parameters—such as delay, jitter, packet loss, and throughput—with subjective QoE metrics like Mean Opinion Score (MOS) and perceptual quality indices. Using simulation-based and analytical approaches, the paper evaluates how network conditions affect multimedia traffic behavior and user satisfaction. The results highlight critical thresholds for QoE degradation, enabling predictive modeling for adaptive multimedia delivery and real-time optimization. This work contributes to designing intelligent, user-centered network management systems capable of balancing resource efficiency and end-user satisfaction.
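One common way to relate an objective QoS parameter to subjective MOS, in the spirit of the model above, is an exponential mapping (the IQX hypothesis). The coefficients below are illustrative assumptions for demonstration, not values fitted in this study.

```python
import math

# Illustrative QoS-to-QoE mapping: MOS decays exponentially with packet loss.
# alpha, beta, gamma are assumed coefficients, not fitted parameters.

def mos_from_packet_loss(loss_pct, alpha=3.0, beta=0.4, gamma=1.5):
    """MOS on the 1-5 scale as a function of packet loss (%)."""
    return min(5.0, alpha * math.exp(-beta * loss_pct) + gamma)

def degradation_threshold(target_mos, alpha=3.0, beta=0.4, gamma=1.5):
    """Packet-loss level (%) at which MOS falls to the target value."""
    return math.log(alpha / (target_mos - gamma)) / beta

# MOS falls monotonically as loss grows; with these assumed coefficients it
# crosses the "fair" level (MOS 3) near 1.7% packet loss.
print(round(mos_from_packet_loss(0.0), 2), round(degradation_threshold(3.0), 2))
```

Inverting the mapping, as `degradation_threshold` does, is what yields the "critical thresholds for QoE degradation" that an adaptive delivery system can act on.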
Review Article
Open Access December 27, 2021 14 pages 54 views 5 downloads

Best Practices of CI/CD Adoption in Java Cloud Environments: A Review

Current Research in Public Health 2021, 1(1), 1356. DOI: 10.31586/jaibd.2021.1356
Abstract
Continuous integration (CI) and continuous delivery/deployment (CD) are key methods in modern software development, assisting in the rapid, reliable, and high-quality delivery of software. These DevOps methods automate and streamline the code development, testing, and deployment processes, which reduces integration risk, enhances productivity, and minimizes manual labor. To implement CI/CD, Java cloud applications can utilize cloud-native services such as AWS CodePipeline, Azure DevOps, and Google Cloud Build, as well as tools like Jenkins, GitLab CI/CD, GitHub Actions, CircleCI, Travis CI, and Bamboo. Basic concepts of CI/CD include automation, frequent integration, rigorous testing, constant feedback, and continuous process improvement. Major pipeline phases include source code management, build automation, testing, artefact management, deployment, and monitoring. Despite clear benefits, challenges remain, including infrastructure complexity, dependency management, test reliability, and cultural barriers, particularly in large-scale or enterprise Java projects. This work provides a thorough analysis of CI/CD procedures and resources, including frameworks, best practices, and challenges for Java cloud applications. It highlights strategies to optimize adoption, improve software quality, and accelerate delivery cycles.
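As an illustration of the pipeline phases listed above, a minimal GitHub Actions workflow for a Java/Maven project might look like the following. The workflow name, branch, action versions, and Java version are illustrative assumptions, not a configuration taken from this review.

```yaml
# Hypothetical CI workflow for a Java/Maven cloud application.
name: ci
on:
  push:
    branches: [main]
  pull_request:

jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4          # source code management
      - uses: actions/setup-java@v4        # build environment
        with:
          distribution: temurin
          java-version: '17'
          cache: maven
      - name: Build and run tests          # build automation + testing
        run: mvn --batch-mode verify
```

Later phases (artefact management, deployment, monitoring) would typically be added as further jobs publishing to a registry and rolling out to the target cloud environment.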
Review Article
Open Access June 28, 2016 10 pages 221 views 28 downloads

Scalable Task Scheduling in Cloud Computing Environments Using Swarm Intelligence-Based Optimization Algorithms

Current Research in Public Health 2021, 1(1), 1291. DOI: 10.31586/jaibd.2016.1291
Abstract
Effective task scheduling in cloud computing is crucial for optimizing system performance and resource utilization. Traditional scheduling methods often struggle to adapt to the dynamic and complex nature of cloud environments, where workloads, resource availability, and task requirements constantly change. Swarm intelligence-based optimization algorithms, such as Particle Swarm Optimization (PSO), Ant Colony Optimization (ACO), and Artificial Bee Colony (ABC), offer a promising solution by mimicking natural processes to explore large search spaces efficiently. These algorithms are effective in balancing multiple objectives, including minimizing execution time, reducing energy consumption, and ensuring fairness in resource allocation. They also enhance system scalability, which is vital for modern cloud infrastructures. However, challenges remain, including slow convergence speeds, complex parameter tuning, and integration with existing cloud frameworks. Addressing these issues will be essential for the practical implementation of swarm intelligence in cloud task scheduling, helping to improve resource management and overall system performance.
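To make the PSO idea concrete, the sketch below assigns a handful of tasks to two machines to minimize makespan (the finish time of the most loaded machine). The task lengths, swarm size, and coefficient values are illustrative assumptions, not a production scheduler.

```python
import random

# Minimal PSO sketch for task-to-machine assignment. Positions are real-valued
# and decoded into machine indices; all parameters are illustrative.

random.seed(7)
task_lengths = [4, 8, 2, 6, 5, 3]   # execution time of each task (assumed)
n_machines = 2

def makespan(position):
    """Decode a real-valued position into machine indices and score it."""
    loads = [0.0] * n_machines
    for length, x in zip(task_lengths, position):
        loads[int(x) % n_machines] += length
    return max(loads)

n_particles, dim, iters = 10, len(task_lengths), 60
swarm = [[random.uniform(0, n_machines) for _ in range(dim)] for _ in range(n_particles)]
velocity = [[0.0] * dim for _ in range(n_particles)]
personal_best = [p[:] for p in swarm]
global_best = min(personal_best, key=makespan)

for _ in range(iters):
    for i, p in enumerate(swarm):
        for d in range(dim):
            r1, r2 = random.random(), random.random()
            # Inertia 0.7, cognitive/social coefficients 1.4 (assumed values).
            velocity[i][d] = (0.7 * velocity[i][d]
                              + 1.4 * r1 * (personal_best[i][d] - p[d])
                              + 1.4 * r2 * (global_best[d] - p[d]))
            p[d] = (p[d] + velocity[i][d]) % n_machines   # keep in range
        if makespan(p) < makespan(personal_best[i]):
            personal_best[i] = p[:]
    global_best = min(personal_best, key=makespan)

# Total work is 28 over 2 machines, so 14.0 is the best possible makespan here.
print(makespan(global_best))
```

Real schedulers add the other objectives the abstract mentions (energy, fairness) as weighted terms in the fitness function, which is where the parameter-tuning difficulty noted above arises.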
Review Article
ISSN: 2831-5162
DOI prefix: 10.31586/crph

Journal metrics
Publication years: 2021-2026
Journal home page visits: 46561
Published articles: 29
Article views: 39787
Article downloads: 7494
Downloads per article: 258
APC: 99.00