Open Access December 16, 2022

A Framework for the Application of Optimization Techniques in the Achievement of Global Emission Targets in the Housing Sector

Abstract
The building construction industry plays a crucial role in reducing greenhouse gas (GHG) emissions globally. Global emission targets may not be achieved without a defined strategic plan for meeting the targets set for the various sectors of the economy. Recognizing the enormous potential that the building industry holds in contributing to global GHG emission reduction, this study describes a framework for how optimization techniques can guide emission reduction targets in the housing sector, using illustrations from the onsite and offsite building construction industry. Given that some GHGs are also sources of air pollution, the study includes a discussion of how efforts to address air pollution can be used to build consensus toward addressing GHG emissions. It presents simplified procedures that municipalities around the globe can use to estimate and report emissions from the building construction industry, and it demonstrates how programming methods can be applied to GHG emissions management. The approach used in this study is transferable to other industries. The study recommends a unifying strategy for the management and control of emissions in the building construction industry, along with a coordinated effort to share best practices for emission control and management across all jurisdictions globally. To meet global emission targets, extending studies like this one to all sectors of the global economy is recommended, followed by concrete efforts to implement sustainable emission reduction targets worldwide.
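As an illustration of the programming methods the abstract refers to, the sketch below solves a toy emission-constrained cost minimization for an onsite/offsite construction split. All cost, demand, and emission figures are hypothetical and not drawn from the study.

```python
# Minimal sketch: choose an onsite/offsite construction split that meets an
# emission cap at lowest cost. All figures are hypothetical illustrations.
cost = (120.0, 140.0)   # build cost per m^2: (onsite, offsite)
emis = (95.0, 60.0)     # kg CO2e per m^2:    (onsite, offsite)
demand = 1000.0         # total floor area required (m^2)
cap = 80000.0           # emission cap for the project (kg CO2e)

def total(onsite):
    """Cost and emissions when `onsite` m^2 are built onsite, rest offsite."""
    offsite = demand - onsite
    c = cost[0] * onsite + cost[1] * offsite
    e = emis[0] * onsite + emis[1] * offsite
    return c, e

# With one free variable, the optimum of this linear program lies at a
# boundary: all-onsite, all-offsite, or the point where the cap binds.
candidates = [0.0, demand]
if emis[0] != emis[1]:
    binding = (cap - emis[1] * demand) / (emis[0] - emis[1])
    if 0.0 <= binding <= demand:
        candidates.append(binding)

feasible = [x for x in candidates if total(x)[1] <= cap + 1e-9]
best = min(feasible, key=lambda x: total(x)[0])
print(f"onsite={best:.1f} m^2, offsite={demand - best:.1f} m^2, "
      f"cost={total(best)[0]:.0f}, emissions={total(best)[1]:.0f}")
```

In a realistic setting with many material and method choices, the same formulation scales to a standard linear-programming solver; the vertex-enumeration shortcut here works only because a single variable is free.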
Article
Open Access October 09, 2025

Simulation-Based Learning in Nursing Education: Perspectives of Student Nurses in the Philippines

Abstract
Simulation-based learning (SBL) is widely recognized as an effective educational approach that bridges theory and practice in nursing education. Despite its global adoption, limited research has examined the experiences of Filipino nursing students with SBL, particularly in resource-constrained settings. This study explored the perspectives of Bachelor of Science in Nursing students from a university in Metro Manila, Philippines, on the impact of SBL on their skills, emotional responses, and challenges encountered. A descriptive qualitative design was employed using purposive sampling of ten students who had participated in at least one SBL activity. Data were collected through semi-structured interviews and short written reflections and analyzed thematically following Braun and Clarke’s framework to capture nuanced experiences. Three major themes emerged from the analysis. First, students reported initial anxiety, nervousness, and stress during their early SBL experiences, which gradually transformed into confidence, adaptability, and resilience as they gained familiarity and competence. Second, SBL enhanced technical and cognitive skills such as clinical judgment, decision-making, teamwork, and patient-centered care, supporting students’ readiness for real-world practice. Third, students identified resource limitations, insufficient equipment, and time constraints as significant barriers to optimal learning, though these challenges also fostered creativity and perseverance. The findings demonstrate that SBL fosters technical competence, critical thinking, and professional growth but requires institutional support to address resource constraints and faculty development needs. This study underscores the importance of expanding SBL in Philippine nursing curricula to align with international best practices and to contribute to Sustainable Development Goals 3 (good health and well-being), 4 (quality education), and 5 (gender equality).
Article
Open Access September 14, 2025

Lifecycle Management as a Roadmap to the Tobacco Endgame

Abstract
Background: The tobacco endgame is defined as the elimination of commercial tobacco sales. The U.S. tobacco control landscape is a complex, adaptive system shaped by diverse stakeholders, evolving products and regulations, shifting social norms, and the strategic countermeasures of a powerful industry. Managing such complexity requires more than isolated interventions; it demands a coordinated, enterprise-wide approach that accounts for dynamic interactions, feedback loops, and emergent risks. Objective: Drawing on complex systems thinking, the Zachman enterprise architecture model, and public health best practices, we conceptualize tobacco control as an evolving enterprise progressing through six interconnected phases: (1) Conception & Initiation, (2) Policy & System Design, (3) Implementation & Operation, (4) Evaluation & Adaptation, (5) Consolidation & Endgame Transition, and (6) Sustainment or Sunset. Each phase incorporates governance structures, performance benchmarks, and transition criteria designed to manage interdependence and reduce systemic vulnerabilities. Results: The lifecycle framing emphasizes how tobacco control in the U.S. can evolve as a complex, adaptive enterprise that integrates public health objectives with legal, operational, and cultural change processes. This model supports strategic sequencing, cross-sector alignment, and risk mitigation against emergent industry tactics, enabling a resilient and measurable pathway to the endgame. Conclusions: Seeing tobacco control as a complex enterprise that operates under a lifecycle model may offer a roadmap for achieving and sustaining the tobacco endgame. Using this approach may enhance policy coherence, resource efficiency, and adaptability, helping to ensure that the tobacco endgame is achieved.
Article
Open Access April 29, 2024

Digital Forensic Investigation Standards in Cloud Computing

Abstract
Digital forensics in cloud computing environments presents significant challenges due to the distributed nature of data storage, the diverse security practices employed by service providers, and jurisdictional complexities. This study aims to develop a comprehensive framework and improved methodologies tailored for conducting digital forensic investigations in cloud settings. A pragmatic research philosophy integrating positivist and interpretivist paradigms guides an exploratory sequential mixed-methods design. Qualitative methods, including case studies, expert interviews, and document analysis, were used to explore key variables and themes. The findings informed hypothesis formulation and survey instrument development for the subsequent quantitative phase, which involved structured surveys of digital forensics professionals, cloud providers, and law enforcement agencies across the globe. The multi-method approach employs purposive and stratified random sampling techniques, targeting 100-150 participants for the qualitative components and 300-500 for the quantitative surveys. Qualitative data underwent thematic and content analysis, while quantitative data were analysed using descriptive and inferential statistical methods facilitated by software such as SPSS and R. An integrated mixed-methods analysis synthesizes and triangulates findings, enhancing validity, reliability, and comprehensiveness. Strict ethical protocols safeguard participant confidentiality and data privacy throughout the research process. This robust methodology contributed to the development of improved frameworks, guidelines, and best practices for digital forensic investigations in cloud computing, addressing legal and jurisdictional complexities in this rapidly evolving domain.
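The stratified random sampling step described above can be sketched as follows. The stakeholder group names and sizes are hypothetical illustrations, not the study's actual sampling frame.

```python
# Minimal sketch of stratified random sampling with proportional allocation:
# each stratum contributes to the sample in proportion to its population size.
# Group names and sizes are hypothetical, not the study's actual frame.
import random

population = (
    [("forensics_pro", i) for i in range(600)] +
    [("cloud_provider", i) for i in range(250)] +
    [("law_enforcement", i) for i in range(150)]
)
sample_size = 100

# Partition the population frame into strata by role.
strata = {}
for role, pid in population:
    strata.setdefault(role, []).append((role, pid))

rng = random.Random(42)  # fixed seed so the draw is reproducible
sample = []
for role, members in strata.items():
    # Proportional allocation: stratum share of the sample = stratum share
    # of the population.
    k = round(sample_size * len(members) / len(population))
    sample.extend(rng.sample(members, k))

counts = {role: sum(1 for r, _ in sample if r == role) for role in strata}
print(counts)  # → {'forensics_pro': 60, 'cloud_provider': 25, 'law_enforcement': 15}
```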
Article
Open Access October 15, 2022

Big Data and AI/ML in Threat Detection: A New Era of Cybersecurity

Abstract
The unrelenting proliferation of data, entwined with the prevalence of mobile devices, has given birth to an unprecedented growth of information obscured by noise. With the Internet of Things and myriad endpoint devices generating vast volumes of sensitive and critical data, organizations are tasked with extracting actionable intelligence from this deluge. Governments and enterprises alike, often under pressure from regulatory bodies, have strived to harness the power of data and leverage it to enhance safety and security, maximize performance, and mitigate risks. However, adversaries themselves have capitalized on the unequal battle of big data and artificial intelligence to inflict widespread chaos. Therefore, the demand for big data analytics and AI/ML for high-fidelity intelligence, surveillance, and reconnaissance is at its highest. Today, in the cybersecurity realm, the detection of adverse incidents poses substantial challenges due to the sheer variety, volume, and velocity of deep packet inspection data. State-of-the-art detection techniques have fallen short of detecting the latest attacks in the aftermath of major data breach incidents. On the other hand, computational intelligence techniques such as machine learning have reignited the search for solutions to diverse monitoring problems. Recent advancements in AI/ML frameworks have the potential to analyze IoT/edge-generated big data in near real time and to assist risk assessment and mitigation through automated threat detection and modeling. Industry best practices and case studies are examined to showcase how big data coupled with AI/ML unlocks new dimensions and capabilities in improved vigilance and monitoring, prediction of adverse incidents, intelligent modeling, and future uncertainty quantification through data resampling correction.
All of these avenues lead to enhanced robustness, security, safety, and performance of industrial processes, computing, and infrastructures. A view of the future is also presented, discussing potential threats arising from the misuse of new technologies, from high-bandwidth networking and IoT/edge to blockchain, AI, quantum computing, and autonomous systems. Cybersecurity is again playing out at a pace set by adversaries who face low entry barriers and wield debilitating tools. The need for innovative solutions to defend against the emerging threat landscape, harnessing the power of new technologies and collaboration, is emphasized.
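The automated threat detection the abstract describes can be sketched minimally. The traffic figures below are synthetic, and the z-score rule is a deliberately simple statistical stand-in for the AI/ML frameworks the paper surveys.

```python
# Toy sketch of automated threat detection on traffic volumes: flag packet
# rates that deviate sharply from a learned baseline. A simple z-score rule
# stands in for the AI/ML frameworks discussed; all data is synthetic.
import statistics

# Baseline packets/sec observed during normal operation (synthetic).
baseline = [980, 1010, 995, 1020, 990, 1005, 1000, 985, 1015, 1002]
mean = statistics.fmean(baseline)
stdev = statistics.stdev(baseline)

def is_anomalous(observed, threshold=3.0):
    """Flag an observation more than `threshold` std. devs. from the baseline."""
    return abs(observed - mean) / stdev > threshold

# Live readings; 5400 models a sudden traffic spike (e.g., a possible DDoS).
live = [1003, 997, 5400, 1012]
alerts = [x for x in live if is_anomalous(x)]
print(alerts)  # → [5400]
```

A production detector would replace the single-feature z-score with a model trained on many traffic features, but the shape of the pipeline (learn a baseline, score deviations, raise alerts) is the same.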
Article
Open Access December 27, 2022

Survey of Automated Testing Frameworks and Tools for Software Quality Assurance: Challenges and Best Practices

Abstract
Automated testing and software quality assurance (SQA) practices are essential for ensuring the reliability, scalability, and maintainability of modern software systems. This paper presents a review of widely used automated testing frameworks, including Keyword-Driven, Data-Driven, Behavior-Driven Development (BDD), and Record/Playback approaches, outlining their methodologies, benefits, and limitations in different development contexts. In parallel, it examines established SQA techniques such as Test-Driven Development, static analysis, and white-box testing, which provide systematic methods for defect detection and quality improvement. The study also reviews the role of practical tools, such as Selenium, TestNG, and JUnit, in supporting test automation and validation activities. In addition to highlighting technical capabilities, the paper identifies common challenges faced in automation, including incomplete requirements, integration complexities, and the maintenance of evolving test suites. Recommended best practices are provided to address these issues, offering guidance for organizations seeking to strengthen their software testing processes through structured frameworks, adaptive techniques, and reliable automation tools.
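The Data-Driven approach surveyed above can be sketched with Python's standard `unittest` module: one test body driven by a table of cases, so adding coverage means adding data rows rather than test code. The function under test and its cases are hypothetical illustrations.

```python
# Minimal sketch of data-driven testing: a single test method iterates over
# a table of (input, expected) rows. The unit under test is hypothetical.
import io
import unittest

def normalize_username(raw):
    """Hypothetical unit under test: trim whitespace and lowercase a username."""
    return raw.strip().lower()

CASES = [            # the data rows that drive the single test method
    ("Alice", "alice"),
    ("  BOB  ", "bob"),
    ("carol", "carol"),
]

class TestNormalizeUsername(unittest.TestCase):
    def test_cases(self):
        for raw, expected in CASES:
            with self.subTest(raw=raw):  # each row reports pass/fail independently
                self.assertEqual(normalize_username(raw), expected)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestNormalizeUsername)
result = unittest.TextTestRunner(stream=io.StringIO(), verbosity=0).run(suite)
print("all cases passed:", result.wasSuccessful())  # → all cases passed: True
```

The same table-of-cases pattern is what TestNG's `@DataProvider` and JUnit 5's `@ParameterizedTest` provide in the Java tools the paper reviews.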
Article
Open Access December 27, 2021

Best Practices of CI/CD Adoption in Java Cloud Environments: A Review

Abstract
The continuous integration (CI) and continuous delivery/deployment (CD) methods are key tools in modern software development, assisting in the rapid, reliable, and high-quality delivery of software. These DevOps methods automate and streamline code development, testing, and deployment, which reduces integration risk, enhances productivity, and minimizes manual effort. To implement CI/CD, Java cloud applications can utilize cloud-native services such as AWS CodePipeline, Azure DevOps, and Google Cloud Build, as well as tools like Jenkins, GitLab CI/CD, GitHub Actions, CircleCI, Travis CI, and Bamboo. Basic concepts of CI/CD include automation, frequent integration, intensive testing, constant feedback, and continuous process improvement. The major pipeline phases include source code management, build automation, testing, artefact management, deployment, and monitoring. Despite clear benefits, challenges remain, including infrastructure complexity, dependency management, test reliability, and cultural barriers, particularly in large-scale or enterprise Java projects. This work provides a thorough analysis of CI/CD procedures and resources, including frameworks, best practices, and challenges for Java cloud applications. It highlights strategies to optimize adoption, improve software quality, and accelerate delivery cycles.
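The pipeline phases listed above can be sketched as a minimal stage runner: stages execute in order, and the first failure halts the pipeline so feedback arrives immediately. The stage functions are hypothetical placeholders for real build, test, and deploy commands.

```python
# Toy sketch of the CI/CD stage sequence: run stages in order, stop on the
# first failure. Stage bodies are hypothetical placeholders for real commands.
def checkout():   return True  # source code management: fetch the revision
def build():      return True  # build automation: compile and package
def unit_tests(): return True  # testing: run the automated test suite
def archive():    return True  # artefact management: store the build output
def deploy():     return True  # deployment: roll out to the environment
def monitor():    return True  # monitoring: verify the rollout is healthy

PIPELINE = [checkout, build, unit_tests, archive, deploy, monitor]

def run_pipeline(stages):
    """Run stages in order; halt at the first failure (fast feedback)."""
    for stage in stages:
        ok = stage()
        print(f"{stage.__name__}: {'ok' if ok else 'FAILED'}")
        if not ok:
            return False
    return True

success = run_pipeline(PIPELINE)
```

Real CI/CD tools express the same ordering declaratively (e.g., a Jenkinsfile or a workflow YAML) and add triggers, parallelism, and retries on top of this fail-fast core.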

Query parameters

Keyword:  Best Practices
