Current Research in Public Health
Volume 1, Issue 1, 2021
Open Access | July 10, 2022 | 14 pages

Spray Coated Cellulose Nanofiber (CNF) Film as an Eco-Friendly Substrate for Flexible and Printed Electronics

Current Research in Public Health 2022, 1(1), 352. DOI: 10.31586/ojes.2022.352
Abstract
Cellulose nanofiber (CNF) is an eco-friendly nanomaterial used for fabricating various functional materials, and an alternative to synthetic plastics and other petroleum-derived materials. Due to the growing demand for CNF films, a rapid fabrication method is required. A new spray-coating method for preparing smooth CNF films was developed. In this method, a CNF suspension is sprayed onto a smooth, polished metal surface, and the spray-coated wet film is then allowed to dry in air under standard laboratory conditions. Spraying has notable advantages, such as contour coating and contactless coating with respect to the base substrate. The basis weight and thickness of the CNF film are tailorable by adjusting the CNF suspension in the spraying process. A CNF film prepared via spray coating has a unique two-sided surface roughness, with the surface in contact with the base substrate (metal side) much smoother than the air-contact side. Surface roughness is one of the controlling parameters in the application of CNF film as a substrate for flexible and printed electronics. The RMS roughness of the two surfaces, investigated by optical profilometry (OP), was found to be 2087 nm on the rough side and 389 nm on the spray-coated side. The spray-coated CNF film thus has ultra-high smoothness on the side exposed to the polished stainless steel surface. The factors that control the roughness of the film, including the size of the cellulose fibrils and the smoothness of the base surface, are currently being investigated and will be discussed in this chapter, together with the surface smoothness requirements for substrate applications in flexible and printed electronics.
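The RMS roughness figures quoted above are root-mean-square deviations of surface heights from their mean plane. As a minimal illustration of the metric itself (not the paper's optical profilometry pipeline), a Python sketch with made-up height samples:

```python
import math

def rms_roughness(heights_nm):
    """RMS (Rq) roughness: root-mean-square deviation of surface
    heights from their mean plane."""
    mean = sum(heights_nm) / len(heights_nm)
    return math.sqrt(sum((h - mean) ** 2 for h in heights_nm) / len(heights_nm))

# Illustrative height samples (nm) for the two film sides;
# values are made up, not measured data from the paper.
air_side = [0.0, 3000.0, -1500.0, 2500.0, -2000.0]
metal_side = [0.0, 500.0, -300.0, 400.0, -350.0]

print(rms_roughness(air_side) > rms_roughness(metal_side))  # True: air side rougher
```

A real profilometer reports heights on a 2-D grid, but the per-sample computation is the same.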
Concept Paper
Open Access | June 09, 2022 | 23 pages

Theoretical and Experimental Analysis of Miniaturization of Conventional Oscillatory Flow Technology

Current Research in Public Health 2022, 1(1), 296. DOI: 10.31586/ojes.2022.296
Abstract
The requirement for any configuration of a chemical or biochemical reactor is efficient mixing to enhance heat and mass transfer as needed for the application of interest. An Oscillatory Flow (OF) reactor combines flow oscillation with a baffled tube configuration, which has the potential to ensure efficient mixing, heat transfer, and mass transfer. Efficient mixing in an OF reactor can thus tackle transport resistance in chemical processes ranging from polymer synthesis to enzyme production. It has been observed that an OF reactor improves both the conversion and the selectivity of the relevant reaction through efficient mixing and transport properties. However, this technology has not yet been extended to mini-fluidic configurations via process intensification methods, and a novel approach for enhanced mixing at reduced scales has so far not been explored. This work explores the application of OF technology in mini-fluidics. The feasibility of oscillatory flow technology in mini channels has been investigated using theoretical correlations from conventional oscillatory flow process equipment. As a preliminary step in the process intensification of OF technology in mini channels, Nusselt number (Nu) and pressure drop values were predicted from the literature, and it was observed that the transfer operations are also improved when oscillatory flow is applied in mini channels compared with commercial mini contactors such as the Corning heart-shaped reactor. A plot of energy dissipation versus mixing, evaluated from theoretical calculations, was drawn and compared with mini-fluidic mixers reported in the literature; the Corning heart-shaped reactor, the most common mini-fluidic mixer, was used for comparison with the proposed minichannel. Based on this analysis, novel mixing geometries are expected to be developed for various chemical processing applications.
The OFT experimental setup was developed to create oscillatory flow via either forward or backward rotation of a valve. The pressure-time and flow-time profiles for the given OF mini-fluidic arrangement are initially investigated and described. Preliminary experimental results are provided for an OF generator intended for use in subsequent experiments exploring mini-fluidic mixers with OF technology.
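Mixing in baffled oscillatory-flow equipment is conventionally characterized by the oscillatory Reynolds number and the Strouhal number. A minimal sketch of these standard definitions; the specific correlations and operating values used in the paper are not reproduced here, and the example numbers below (a water-filled 5 mm minichannel) are purely illustrative:

```python
import math

def oscillatory_reynolds(freq_hz, amplitude_m, diameter_m,
                         density=1000.0, viscosity=1e-3):
    """Re_o = 2*pi*f*x0*rho*D/mu: standard oscillatory Reynolds number
    for baffled oscillatory-flow reactors (water properties by default)."""
    return 2 * math.pi * freq_hz * amplitude_m * density * diameter_m / viscosity

def strouhal(amplitude_m, diameter_m):
    """St = D/(4*pi*x0): ratio of tube diameter to oscillation amplitude,
    a measure of eddy propagation in the baffled tube."""
    return diameter_m / (4 * math.pi * amplitude_m)

# Example: 2 Hz oscillation, 5 mm amplitude, 5 mm channel diameter.
re_o = oscillatory_reynolds(2.0, 0.005, 0.005)
st = strouhal(0.005, 0.005)
print(f"Re_o = {re_o:.0f}, St = {st:.3f}")
```

Values of Re_o above roughly 100 are usually taken to indicate intense oscillatory mixing in the conventional-scale literature; whether the same thresholds carry over to mini channels is exactly the kind of question the feasibility analysis addresses.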
Article
Open Access | February 24, 2022 | 16 pages

Computational Fluid Dynamics Modeling of Thermally Integrated Microchannel Reforming Reactors for Hydrogen Production

Current Research in Public Health 2022, 1(1), 228. DOI: 10.31586/ojes.2022.228
Abstract
Many attempts have been made to improve heat transfer in thermally integrated microchannel reforming reactors. However, the mechanisms by which design factors affect heat transfer characteristics are still not fully understood. This study concerns a thermochemical process for producing hydrogen by the catalytic endothermic reaction of methanol with steam in a thermally integrated microchannel reforming reactor. Computational fluid dynamics simulations are conducted to better understand the consumption, generation, and exchange of thermal energy between the endothermic and exothermic processes in the reactor. The effects of wall heat conduction properties and channel dimensions on heat transfer characteristics and reactor performance are investigated. A thermodynamic analysis based on specific enthalpy is performed to better understand the evolution of thermal energy in the reactor. The results indicate that the thermal conductivity of the channel walls is fundamentally important: materials with high thermal conductivity are preferred, and thermally conductive ceramics and metals are well suited, whereas wall materials with poor heat conduction properties degrade reactor performance. Reaction heat flux profiles are considerably affected by channel dimensions; the peak reaction heat flux increases with the channel dimensions when the flow rates are maintained. The change in specific enthalpy is positive for the exothermic reaction and negative for the endothermic reaction, while the change in specific sensible enthalpy is always positive. Design recommendations are made to improve the thermal performance of the reactor.
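The specific-enthalpy bookkeeping described above can be illustrated for the sensible part with the constant-cp approximation, Δh = cp·(T_out − T_in). A sketch with illustrative values, not the paper's simulation data:

```python
def sensible_enthalpy_change(cp_j_per_kg_k, t_in_k, t_out_k):
    """Delta h_sensible = cp * (T_out - T_in), constant-cp approximation.
    Positive when the stream heats up, consistent with the abstract's
    observation that the specific sensible enthalpy change is positive."""
    return cp_j_per_kg_k * (t_out_k - t_in_k)

# Illustrative only (not from the paper): a stream heated from
# 400 K to 500 K with cp ~ 2000 J/(kg K).
dh = sensible_enthalpy_change(2000.0, 400.0, 500.0)
print(dh)  # 200000.0 J/kg
```

A CFD solver evaluates the same quantity with temperature-dependent cp integrated cell by cell; the constant-cp form just makes the sign convention explicit.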
Article
Open Access | December 15, 2021 | 10 pages

Dissemination and Exploitation of Regional Meteo-Hydrological Datasets through Web-based Interactive Applications: The SOL System Case Study

Current Research in Public Health 2022, 1(1), 180. DOI: 10.31586/ojes.2021.180
Abstract
The effects of climate change are already being felt in several parts of the world. Variability in rainfall intensity, drought, and weather patterns contributes to the vulnerability of many human activities, such as agriculture. In the near future, responses to climate change will depend on appropriate strategies, such as strengthening implementation agencies so that they work in a coordinated manner and with a data-driven approach, in order to ensure monitoring, reporting, and data verification. In this context, national and regional meteorological services are facing high demand for timely, quality information, services, and products. This paper describes a web-based interactive application for disseminating meteo-hydrological information at the regional scale. The web application is built on a relational database, and client-side programming has been used to implement the user interface and control the web page behavior. The combination of PHP (Hypertext Preprocessor, a general-purpose scripting language especially suited to server-side web development) and JavaScript (a high-level, object-oriented scripting language, nowadays the dominant client-side scripting language of the Web) was chosen because such software is free for everyone to use. The SOL system, developed on behalf of the Marche region, Italy, was chosen as a case study because of its multi-source data framework and its processing and public dissemination of several ad hoc data elaborations.
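The server-side pattern behind such an application (query a relational database by station and date range, return JSON for the client-side charts) can be sketched as follows. The real SOL system uses PHP with JavaScript and its schema is not described in this listing, so the table and column names below are hypothetical, and Python's built-in sqlite3 stands in for the production database:

```python
import json
import sqlite3

# Hypothetical schema; the real SOL database structure is not public here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rainfall (station TEXT, day TEXT, mm REAL)")
conn.executemany("INSERT INTO rainfall VALUES (?, ?, ?)", [
    ("Ancona", "2021-10-01", 12.4),
    ("Ancona", "2021-10-02", 0.0),
    ("Pesaro", "2021-10-01", 8.1),
])

def rainfall_json(station, start, end):
    """Endpoint pattern: filter by station and date range,
    serialize rows to JSON for the client-side chart."""
    rows = conn.execute(
        "SELECT day, mm FROM rainfall"
        " WHERE station = ? AND day BETWEEN ? AND ? ORDER BY day",
        (station, start, end)).fetchall()
    return json.dumps([{"day": d, "mm": mm} for d, mm in rows])

print(rainfall_json("Ancona", "2021-10-01", "2021-10-31"))
```

Parameterized queries, as above, are the standard defense against SQL injection in any server-side language, PHP included.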
Case Study
Open Access | August 21, 2021 | 18 pages

Virologic Microparticle Fluid Mechanics Simulation: COVID-19 Transmission in the Protected and Unprotected Conversations

Current Research in Public Health 2021, 1(1), 94. DOI: 10.31586/ojes.2021.010101
Abstract
COVID-19 is a serious respiratory infection caused by a devastating coronavirus (2019-nCoV) that has become the first global pandemic of the last one hundred years. It is a highly transmissible virus spread by inhalation of, or contact with, the droplet nuclei produced by infected people when they sneeze, cough, and speak. Airborne SARS-CoV-2 transmission is possible even in a confined space near the infected person. This study aimed to evaluate the effectiveness of using a shield or mask as a barrier on a patient's face against the spread of virus particles. For the present simulation, the discrete phase model (DPM) is used, because this model allows the particle mass to be studied discretely within a fluid space occupied by the continuous phase. Under this model, the virus particles expelled from the patient's mouth are treated as the discrete phase, and the open airflow in the computational domain as the continuous phase. The present study uses Fluent 2019 R3 software to numerically model the transient flows and simulate virus transmission. The analysis found that masks or shields can be an effective means of protecting the participants in a conversation in the presence of an infected person.
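In a discrete phase model, each droplet is tracked through the continuous air phase by integrating a force balance along its path. The following is a heavily simplified, one-way-coupled sketch with Stokes drag and gravity; it omits turbulence, evaporation, and everything specific to the paper's Fluent setup, and all parameter values are illustrative:

```python
def track_droplet(d_m, rho_p, steps, dt, air_velocity=(1.0, 0.0)):
    """One-way-coupled Lagrangian tracking of a single droplet with
    Stokes drag and gravity: the basic idea behind a discrete phase
    model (DPM), greatly simplified."""
    mu_air = 1.8e-5                          # air dynamic viscosity, Pa s
    g = -9.81                                # gravity, m/s^2
    tau = rho_p * d_m ** 2 / (18 * mu_air)   # particle relaxation time, s
    x, y = 0.0, 1.5                          # released at mouth height (m)
    vx, vy = 5.0, 0.0                        # expelled horizontally at 5 m/s
    for _ in range(steps):                   # explicit Euler integration
        ax = (air_velocity[0] - vx) / tau    # Stokes drag acceleration
        ay = (air_velocity[1] - vy) / tau + g
        vx += ax * dt
        vy += ay * dt
        x += vx * dt
        y += vy * dt
    return x, y

# 10-micron droplet of ~water density, tracked for 0.1 s.
print(track_droplet(10e-6, 1000.0, steps=1000, dt=1e-4))
```

A small droplet's relaxation time is sub-millisecond, so it quickly adopts the local air velocity; that is why room airflow, and any barrier that deflects the exhaled jet, dominates where the droplet ends up.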
Review Article
Open Access | December 26, 2020 | 15 pages

Green Cloud Computing: Strategies for Building Sustainable Data Center Ecosystems

Current Research in Public Health 2020, 1(1), 1229. DOI: 10.31586/ojes.2020.1229
Abstract
Green cloud computing is part of the endeavor to develop sustainable data center ecosystems and, more importantly, nurtures a mindful alignment between environmental considerations and cloud computing practices. This view is reinforced by the requirements of resource and energy minimization, as well as clean computing. This paper surveys the current practices, strategies, and significant aspects involved in moving towards green cloud computing and energy-efficient data centers. The energy efficiency criteria call for unified strategies across power-proportional components, big data storage, server systems, and power supply units to achieve holistic energy savings. In addition, service providers and data center operators face significant challenges in moving towards green cloud computing. We address various energy-conscious resource management technologies and discuss the importance of developing innovative, effective green management solutions. Data centers are ubiquitous, which makes the urgency of making them environmentally sustainable all the more conspicuous. With this in mind, this paper encapsulates the multidimensional issues and increased complexities of bringing green solutions into cloud computing practices, and provides guidance and potential strategies. We advocate adopting strategies in practice not only from the technical aspect but also by strengthening partnerships and investigating ways to further dissect challenges, converge on solutions, and consider impact in even more areas of study.
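Power-proportional components are central to the efficiency strategies surveyed. Two staple quantities in this area are the linear server power model and the Power Usage Effectiveness (PUE) metric; a sketch with illustrative numbers, not figures from the paper:

```python
def server_power(utilization, p_idle=100.0, p_peak=250.0):
    """Common linear server power model: P(u) = P_idle + (P_peak - P_idle) * u.
    A perfectly power-proportional server would have P_idle = 0;
    real servers burn a large fraction of peak power while idle."""
    return p_idle + (p_peak - p_idle) * utilization

def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power / IT equipment power.
    1.0 is the ideal; the overhead above 1.0 is cooling, power
    conversion, and other non-IT loads."""
    return total_facility_kw / it_equipment_kw

print(server_power(0.0))    # 100.0 W consumed at idle
print(pue(1200.0, 1000.0))  # 1.2
```

The idle-power term is why consolidation (packing work onto fewer busy servers and powering the rest down) saves energy even when total utilization is unchanged.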
Review Article
Open Access | December 26, 2018 | 17 pages

Understanding Consumer Behavior in Integrated Digital Ecosystems: A Data-Driven Approach

Current Research in Public Health 2020, 1(1), 1337. DOI: 10.31586/ojes.2018.1337
Abstract
This study aims to achieve a new understanding of how, why, and when consumer behavior is shaped, enacted, and experienced inside and across integrated digital ecosystems related to large-scale trackable goods, in service of helping marketers optimize their business performance in the new economy. The investigation begins by exploring what motivates a homogeneous group of consumers to organize their consumption of national-brand and store-brand varieties of consumer packaged goods in a certain manner. Thereafter, the essay explores how, if at all, consumers' other digital activities across various product-related digital spaces and platforms build expertise and interest in these products such that they affect purchase choices. The essay then asks how online information seeking, in various product-related digital spaces, on various platforms, from various sources, and at various points in the purchase journey, affects online-offline dynamics in purchasing these products. Thereafter, the research examines how paid digital communication in various product-related digital spheres and forms, enabled by consumer advertising engagement on various platforms, boosts the offline sales of these products. Finally, by employing a new methodology that combines consumer scanner data, self-reported online activity data, and transaction data collected from an ad-tech partner, the research presents a fresh set of marketing action levers and performance outcomes for the chosen products. Along the way, progress is made on four under-investigated topics in the advertising literature: the role of consumer actors and their expertise in online-offline purchasing dynamics for ads, advertising engagement, consumer digital spaces, and consumer digital activity investment.
Review Article
Open Access | November 24, 2022 | 12 pages

Bridging Traditional ETL Pipelines with AI Enhanced Data Workflows: Foundations of Intelligent Automation in Data Engineering

Current Research in Public Health 2020, 1(1), 1345. DOI: 10.31586/ojes.2022.1345
Abstract
Machine Learning (ML) and Artificial Intelligence (AI) are having an increasingly transformative impact on all industries and are already used in many mission-critical production use cases, bringing considerable value. Data engineering, which combines ETL pipelines with other workflows managing data and machine learning operations, is also significantly affected. The Intelligent Data Engineering and Automation framework offers the groundwork for intelligent automation processes. However, ML/AI is not the only disruptive force; new Big Data technologies inspired by Web 2.0 companies are also reshaping the Internet. Companies with the largest Big Data footprints not only provide applications with a Big Data operational model but also derive their competitive advantage from data in the form of AI services, and consequently affect the cost/performance equilibrium of ETL pipelines. All of these technologies and factors help explain why traditional ETL pipeline design should adapt to current and emerging technologies and may be enhanced through artificial intelligence.
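A traditional extract-transform-load pipeline augmented with a data-quality check can be sketched as below. The z-score outlier test is a simple statistical stand-in for the ML-based validation an AI-enhanced pipeline would add, and every function and field name here is hypothetical:

```python
import statistics

def extract(raw_rows):
    """Extract: parse raw CSV-like records into (key, value) tuples."""
    return [(r.split(",")[0], float(r.split(",")[1])) for r in raw_rows]

def transform(rows, z_threshold=1.5):
    """Transform: flag outliers with a z-score check, standing in for
    the ML data-quality step. The threshold is loose because with only
    a handful of rows the maximum attainable z-score is small."""
    values = [v for _, v in rows]
    mu, sigma = statistics.mean(values), statistics.pstdev(values)
    return [(k, v, sigma > 0 and abs(v - mu) / sigma > z_threshold)
            for k, v in rows]

def load(rows, sink):
    """Load: keep clean rows, route flagged rows to a quarantine sink
    for human (or model-assisted) review."""
    for k, v, is_outlier in rows:
        (sink["quarantine"] if is_outlier else sink["table"]).append((k, v))

sink = {"table": [], "quarantine": []}
load(transform(extract(["a,1.0", "b,1.1", "c,0.9", "d,100.0"])), sink)
print(len(sink["table"]), len(sink["quarantine"]))  # 3 1
```

Swapping the z-score for a learned anomaly model changes only the `transform` stage, which is the modularity argument for enhancing, rather than replacing, the classic ETL shape.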
Article
Open Access | December 26, 2020 | 12 pages

Automated Vulnerability Detection and Remediation Framework for Enterprise Databases

Current Research in Public Health 2020, 1(1), 1354. DOI: 10.31586/ojes.2020.1354
Abstract
Enterprise databases are the heart of applications and contain organizations' most sensitive and critical information. While there have been significant advances in database security, vulnerabilities still exist due to mistakes made by application developers, database administrators, and users. Manual detection and patching of such vulnerabilities typically take months; an automated detection and remediation framework is therefore proposed to fill the gap and eliminate a significant number of these vulnerabilities in near-real time. The framework comprises two key components: a detection engine that leverages static analysis to identify potential patches, coupled with dynamic query testing and fuzzing to identify exploitable misconfigurations; and an orchestration engine that applies detected patches to the database, validates the accuracy of the fix, and rolls back changes if the problem is not resolved. A prototype of this framework has been implemented and validated on a real-time database deployed in an enterprise environment. Because of the complexity of the problem landscape, the research focuses on static vulnerability detection and automated corrective actions. These two capabilities can greatly reduce the manual workload associated with vulnerability detection and significantly strengthen the assurance that granted privileges conform to the least-privilege principle. The proposed architecture aims to enable a detection-and-remediation solution that minimizes human effort, reduces the enterprise's at-risk window, and maximizes the volume of detected vulnerabilities.
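The least-privilege validation mentioned above can be illustrated by diffing granted privileges against a per-role allowlist. This is a sketch of the idea only, not the paper's detection engine, and the role and privilege names are hypothetical:

```python
def excess_privileges(granted, allowlist):
    """Flag grants that exceed a role's least-privilege allowlist.
    `granted` and `allowlist` map role -> set of privileges; anything
    granted but not allowlisted is a finding for the remediation engine."""
    findings = {}
    for role, privs in granted.items():
        extra = privs - allowlist.get(role, set())
        if extra:
            findings[role] = sorted(extra)
    return findings

granted = {
    "app_reader": {"SELECT", "DELETE"},           # DELETE is excessive
    "app_writer": {"SELECT", "INSERT", "UPDATE"},
}
allowlist = {
    "app_reader": {"SELECT"},
    "app_writer": {"SELECT", "INSERT", "UPDATE"},
}
print(excess_privileges(granted, allowlist))  # {'app_reader': ['DELETE']}
```

In a real deployment the `granted` map would be populated from the database's own privilege catalog (for example, the grant tables or information-schema views of the specific engine), and each finding would feed a REVOKE action in the orchestration engine.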
Review Article
Open Access | December 26, 2021 | 14 pages

Rule-Based Automation for IT Service Management Workflows

Current Research in Public Health 2020, 1(1), 1360. DOI: 10.31586/ojes.2020.1360
Abstract
The automation of IT Service Management (ITSM) workflows using explicit rules and data has been established for years. Domain-specific rule engines interpret rules written in declarative rule modelling languages and generate forwarding logic to process event streams and support decision making. Such automation is augmented by rule-driven quality assurance for correctness, safety, and risk management. The service desk is the onshore base of an ITSM supply chain. An end-to-end incident response service resolves incidents using only onshore resources and employs back-office teams to help with unresolvable incidents. The forward factories of rule-based automation for the ticket-processing service are identified. Several rule-based workflows in incident and change management have been published. Further glimpses of the future across all ITSM workflows are provided, based on training in an online ITSM service with automated operations. Rule engines are specialised components that direct the processing of data flows according to pre-defined rules. Decision factories complement the more common event-driven rule engines: while event processing occurs below the polling frequency of the source, rules in decision factories are triggered by the arrival of data. These factories are applied in ITSM for risk and safety evaluation and quality assurance. Rule-enriched architectures incorporate domain-specific modelling languages to ensure correctness with respect to qualitative quality attributes. Dedicated factories provide resilience, detect slack or over-utilisation, and offer point-in-time assurance and testing.
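The declarative condition-action pattern behind such rule engines can be sketched over ITSM ticket events as follows. The rules, field names, and thresholds are hypothetical, not those of any specific product or of the published workflows:

```python
# Minimal condition-action rule engine over ITSM ticket events:
# each rule pairs a predicate ("when") with an action ("then").
RULES = [
    {"name": "escalate-P1",
     "when": lambda t: t["priority"] == "P1" and t["age_min"] > 15,
     "then": lambda t: t.update(queue="major-incident")},
    {"name": "auto-close-resolved",
     "when": lambda t: t["status"] == "resolved" and t["age_min"] > 4320,
     "then": lambda t: t.update(status="closed")},
]

def apply_rules(ticket, rules=RULES):
    """Fire every rule whose condition matches the ticket;
    return the names of the rules that fired."""
    fired = []
    for rule in rules:
        if rule["when"](ticket):
            rule["then"](ticket)
            fired.append(rule["name"])
    return fired

ticket = {"priority": "P1", "status": "open", "age_min": 30, "queue": "l1"}
print(apply_rules(ticket), ticket["queue"])  # ['escalate-P1'] major-incident
```

Production engines externalize the rules into a modelling language and add conflict resolution and audit trails, but the match-then-act loop is the same; that separation of rules from code is what makes rule-driven quality assurance on the rules themselves possible.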
ISSN: 2831-5162
DOI prefix: 10.31586/crph
Journal metrics
Publication years: 2021-2026
Journal (home page) visits: 46561
Published articles: 29
Article views: 39787
Article downloads: 7494
Downloads per article: 258
APC: 99.00