Open Access February 06, 2026

Predictive Modeling of Public Sentiment Using Social Media Data and Natural Language Processing Techniques

Abstract
Social media platforms like X (formerly Twitter) generate vast volumes of user-generated content that provide real-time insights into public sentiment. Despite the widespread use of traditional machine learning methods, their limitations in capturing contextual nuances in noisy social media text remain a challenge. This study leverages the Sentiment140 dataset, comprising 1.6 million labeled tweets, and develops predictive models for binary sentiment classification using Naive Bayes, Logistic Regression, and the transformer-based BERT model. Experiments were conducted on a balanced subset of 12,000 tweets after comprehensive NLP preprocessing. Evaluation using accuracy, F1-score, and confusion matrices revealed that BERT significantly outperforms traditional models, achieving an accuracy of 89.5% and an F1-score of 0.89 by effectively modeling contextual and semantic nuances. In contrast, Naive Bayes and Logistic Regression demonstrated reasonable but consistently lower performance. To support practical deployment, we introduce SentiFeel, an interactive tool enabling real-time sentiment analysis. While resource constraints limited the dataset size and training epochs, future work will explore full corpus utilization and the inclusion of neutral sentiment classes. These findings underscore the potential of transformer models for enhanced public opinion monitoring, marketing analytics, and policy forecasting.
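As a rough illustration of the traditional baselines the abstract compares BERT against (a sketch with invented toy tweets, not the authors' code or the Sentiment140 data), a multinomial Naive Bayes sentiment classifier can be built in a few lines:

```python
from collections import Counter
import math

def train_nb(docs, labels):
    """Multinomial Naive Bayes with Laplace (add-one) smoothing."""
    counts = {0: Counter(), 1: Counter()}   # per-class word counts
    priors = Counter(labels)                # per-class document counts
    for doc, y in zip(docs, labels):
        counts[y].update(doc.lower().split())
    vocab = set(counts[0]) | set(counts[1])
    return counts, priors, vocab

def predict_nb(model, doc):
    counts, priors, vocab = model
    best, best_lp = None, -math.inf
    for y in (0, 1):
        lp = math.log(priors[y] / sum(priors.values()))
        denom = sum(counts[y].values()) + len(vocab)
        for w in doc.lower().split():
            lp += math.log((counts[y][w] + 1) / denom)  # smoothed word likelihood
        if lp > best_lp:
            best, best_lp = y, lp
    return best

# Invented toy tweets (0 = negative, 1 = positive); Sentiment140 itself is far larger.
docs = ["i love this", "great day so happy", "i hate this", "awful terrible day"]
labels = [1, 1, 0, 0]
model = train_nb(docs, labels)
print(predict_nb(model, "love this day"))    # 1 (positive)
print(predict_nb(model, "hate this awful"))  # 0 (negative)
```

A word-count model like this has no notion of word order or context, which is exactly the gap the abstract reports BERT closing.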
Article
Open Access January 10, 2025

Artificial Immune Systems: A Bio-Inspired Paradigm for Computational Intelligence

Abstract
Artificial Immune Systems (AIS) are bio-inspired computational frameworks that emulate the adaptive mechanisms of the human immune system, such as self/non-self discrimination, clonal selection, and immune memory. These systems have demonstrated significant potential in addressing complex challenges across optimization, anomaly detection, and adaptive system control. This paper provides a comprehensive exploration of AIS applications in domains such as cybersecurity, resource allocation, and autonomous systems, highlighting the growing importance of hybrid AIS models. Recent advancements, including integrations with machine learning, quantum computing, and bioinformatics, are discussed as solutions to scalability, high-dimensional data processing, and efficiency challenges. Core algorithms, such as the Negative Selection Algorithm (NSA) and Clonal Selection Algorithm (CSA), are examined, along with limitations in interpretability and compatibility with emerging AI paradigms. The paper concludes by proposing future research directions, emphasizing scalable hybrid frameworks, quantum-inspired approaches, and real-time adaptive systems, underscoring AIS's transformative potential across diverse computational fields.
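The Negative Selection Algorithm (NSA) named above can be sketched in a few lines: generate candidate detectors, discard any that match "self" samples, and flag inputs matched by a surviving detector as anomalous. The 2-D points and radius below are invented for the demo, and a fixed grid replaces the random candidate generation of real NSA to keep the sketch deterministic:

```python
def euclid(a, b):
    """Euclidean distance between two points."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def train_detectors(self_set, candidates, r):
    """Negative selection: keep only candidates that match no 'self' sample.
    Real NSA draws candidates at random; a grid keeps this demo deterministic."""
    return [d for d in candidates if all(euclid(d, s) > r for s in self_set)]

def is_anomaly(x, detectors, r):
    """Flag x as non-self if any trained detector matches it."""
    return any(euclid(d, x) <= r for d in detectors)

# 'Self' region: normal samples near one corner of the unit square (invented data).
self_set = [[0.1, 0.1], [0.15, 0.2], [0.2, 0.1]]
grid = [[i / 10, j / 10] for i in range(11) for j in range(11)]
dets = train_detectors(self_set, grid, r=0.25)

print(is_anomaly([0.9, 0.9], dets, r=0.25))    # True: far from the self region
print(is_anomaly([0.12, 0.15], dets, r=0.25))  # False: inside the self region
```

The clonal selection side (CSA) would instead clone and mutate the best-matching detectors, which is the adaptive-memory mechanism the abstract mentions.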
Article
Open Access September 13, 2023

A Comparative Study of Attention-Based Transformer Networks and Traditional Machine Learning Methods for Toxic Comments Classification

Abstract
With the rapid growth of online communication platforms, the identification and management of toxic comments have become crucial in maintaining a healthy online environment. Various machine learning approaches have been employed to tackle this problem, ranging from traditional models to more recent attention-based transformer networks. This paper aims to compare the performance of attention-based transformer networks with several traditional machine learning methods for toxic comments classification. We present an in-depth analysis and evaluation of these methods using a common benchmark dataset. The experimental results demonstrate the strengths and limitations of each approach, shedding light on the suitability and efficacy of attention-based transformers in this domain.
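A minimal sketch of the traditional end of that comparison (not the paper's models or benchmark data; the toy comments and labels are invented) is a bag-of-words logistic regression trained by stochastic gradient descent:

```python
import math

def featurize(text, vocab):
    """Bag-of-words count vector."""
    v = [0.0] * len(vocab)
    for w in text.lower().split():
        if w in vocab:
            v[vocab[w]] += 1.0
    return v

def train_logreg(X, y, epochs=500, lr=0.5):
    """Logistic regression fitted with plain stochastic gradient descent."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = 1 / (1 + math.exp(-(b + sum(wj * xj for wj, xj in zip(w, xi)))))
            g = p - yi  # gradient of the log loss w.r.t. the logit
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

# Invented toy comments (1 = toxic, 0 = benign), standing in for the benchmark set.
texts = ["you are an idiot", "idiot go away", "have a nice day", "thanks for the help"]
labels = [1, 1, 0, 0]
vocab = {w: i for i, w in enumerate(sorted({t for s in texts for t in s.lower().split()}))}
X = [featurize(t, vocab) for t in texts]
w, b = train_logreg(X, labels)

def predict(text):
    return 1 if b + sum(wj * xj for wj, xj in zip(w, featurize(text, vocab))) > 0 else 0

print(predict("you are an idiot"), predict("have a nice day"))
```

Attention-based transformers replace the fixed count vector with contextual token representations, which is the axis of comparison the paper evaluates.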
Article
Open Access August 30, 2023

Spin Structures and non-Relativistic Spin Operators

Abstract
In quantum physics, the spin and angular momentum operators are magnitudes introduced by means of a vector transformation law. However, interpreting the eigenvalues of their Z "components" as projections onto that axis leads to certain contradictions, supposedly avoided by a mandatory (though presented as freely selected) orientation of Z. It is shown that an oriented physical space almost forces us to project the angular momentum and spin eigenvalues onto its orientation 3-form, which sidesteps these inconsistencies. The final conclusion is that this "rare" magnitude called spin arises quite naturally thanks to the orientation of our three-dimensional space.
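For reference, the standard non-relativistic spin-1/2 operators whose Z eigenvalues the abstract discusses are conventionally written in terms of the Pauli matrices:

```latex
S_k = \frac{\hbar}{2}\,\sigma_k, \qquad
\sigma_x = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}, \quad
\sigma_y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}, \quad
\sigma_z = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix},
\qquad [S_j, S_k] = i\hbar\,\epsilon_{jkl}\,S_l,
```

so that $S_z$ has eigenvalues $\pm\hbar/2$, the values whose usual interpretation as projections onto the Z axis the paper questions.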
Communication
Open Access November 08, 2022

The c-equivalence principle and its implications for physics

Abstract
The c-equivalence principle, commonly accepted as true by most physicists, is the unstated assumption that the constant c appearing in the laws of electromagnetism equals the kinematic speed of light. Should someone prove the principle false, it would render the composition of two Lorentz transformations meaningless. The second hypothesis of the Special Theory of Relativity in its strong form would also be invalidated. This paper examines some of the consequences for physics should this principle be proven false, and outlines some experiments to determine light speed that could falsify the principle and provide evidence for the ether.
Review Article
Open Access May 20, 2021

Bioconcentration Factor of Polychlorinated Biphenyls and Its Correlation with UV- and IR-Spectroscopic data: A DFT based Study

Abstract
Polychlorinated biphenyls (PCBs) are an important class of persistent organic pollutants that were used as components of paints (especially in printing), as plasticizers in plastics, as insulating materials in transformers and capacitors, as heat-transfer fluids, and as additives in the hydraulic fluids of vacuum and turbine pumps. There is always a need to establish reliable procedures for predicting the bioconcentration potential of chemicals from knowledge of their molecular structure, or from readily measurable properties of the substance. Hence, correlation and prediction of bioconcentration factors (BCFs) based on λmax and the vibrational frequencies of various bonds, viz. υ(C-H) and υ(C=C), of biphenyl and fifty-seven of its derivatives have been made. For the study, molecular modeling and geometry optimization of the PCBs were performed in the Workspace program of Fujitsu's CAChe Pro 5.04 software using the DFT method. UV-visible spectra for each compound were generated from electronic transitions between molecular orbitals, as electromagnetic radiation in the visible and ultraviolet (UV-visible) region is absorbed by the molecule; the energies of the excited electronic states were computed quantum mechanically. IR spectra for each compound were generated from the coordinated motions of the atoms as electromagnetic radiation in the infrared region is absorbed by the molecule: the force necessary to distort the molecule from its equilibrium geometry was computed quantum mechanically, and the frequencies of the vibrational transitions were thus predicted. The Project Leader program associated with CAChe was used for multiple linear regression (MLR) analysis, with the above spectroscopic data as independent variables and the BCFs of the PCBs as dependent variables. The reliability of the correlations and the predictive ability of the MLR equations (models) were judged by R², R²adj, se, q²LOO, and F values.
This study clearly shows that UV and IR spectroscopic data can be used to predict the BCFs of a large number of related compounds within a limited time and without difficulty.
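The MLR step can be illustrated with an ordinary-least-squares fit via the normal equations. The λmax values, stretch frequencies, and log-BCF responses below are invented for the demo (deliberately made exactly linear in λmax); they are not the paper's data:

```python
def mlr_fit(X, y):
    """Ordinary least squares via the normal equations (intercept prepended)."""
    A = [[1.0] + list(row) for row in X]
    n, p = len(A), len(A[0])
    # Normal equations: (A^T A) beta = A^T y
    M = [[sum(A[k][i] * A[k][j] for k in range(n)) for j in range(p)] for i in range(p)]
    v = [sum(A[k][i] * y[k] for k in range(n)) for i in range(p)]
    # Gaussian elimination with partial pivoting
    for col in range(p):
        piv = max(range(col, p), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, p):
            f = M[r][col] / M[col][col]
            for c in range(col, p):
                M[r][c] -= f * M[col][c]
            v[r] -= f * v[col]
    beta = [0.0] * p
    for i in reversed(range(p)):
        beta[i] = (v[i] - sum(M[i][j] * beta[j] for j in range(i + 1, p))) / M[i][i]
    return beta

# Hypothetical predictors: [lambda_max (nm), C-H stretch (cm^-1)]; response: log BCF.
X = [[250, 3030], [260, 3055], [270, 3040], [280, 3065]]
y = [3.1, 3.5, 3.9, 4.3]  # exactly linear in lambda_max for this toy example
beta = mlr_fit(X, y)
pred = beta[0] + beta[1] * 290 + beta[2] * 3070
print(round(pred, 2))  # 4.7 on this exactly linear toy data
```

Statistics such as R², adjusted R², and leave-one-out q² would then be computed from the residuals of fits like this one to judge model reliability, as the abstract describes.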
Editorial Article
Open Access June 13, 2021

When we put spatial causalities first in production of scientific knowledge: notes on the geography of science

Abstract
Any history of science has its own geography as well. Geographers of science have tried to put science in its place. They study the socio-spatial settings in which scientific knowledge was generated, displayed, and legitimated. For them, science is socially constructed in spatialities and temporalities. The main question should be "how" spatialities construct scientific knowledge via their "causalities". Geography of science is not just about the special places, locations, and regions in which scientific knowledge is unequally produced, consumed, and circulated, or how the use of scientific knowledge can lead to the production and reproduction of unique places and spaces. Geography of science is also about the varied set of spatial causalities through which scientific knowledge can be formed and transformed. This also means that the development of innovative knowledge and ideas takes place not only in spatial contexts but also because of spatial causalities that arise from the myriad interlinkages and interdependencies among places. These imperatives of spatial significance operate across many spatial scales, from the body to the global. Hence, in our increasingly glocalized world, we must seek knowledge in spatial encounters and in the betweenness of places, not merely within spaces and places.
Short Note
Open Access January 13, 2026

Principles and Practices of Transformative Online Doctoral Mentoring—A Mentor’s Perspective

Abstract
An effective mentor is critical to the success of an online doctoral student. Researchers have found that online doctoral students prefer frequent interactions with their mentor, while faculty prefer mentees to be autonomous. Transformative online doctoral mentoring (ODM) requires the development of a strong collaborative working relationship between the mentee and mentor, who serves as the link between the student and academia, as well as their guide and working partner throughout the dissertation process. In this paper, I argue that the ultimate objective of ODM, the establishment of such a relationship between mentor and mentee, increases the likelihood of student success. I support this contention with a set of principles and practices grounded in relevant models and methods of human development, participative leadership, and collaborative change management that provide insights into the what, why, and how of transformative ODM.
Article
Open Access October 20, 2025

From Subordination to Empowerment: The Journey of Yi Women in Daliangshan

Abstract
This paper examines the transformation of Yi women’s social status in Daliangshan, Sichuan Province. It analyzes historical practices—including child marriage (wawaqin) and the tradition of high bridal gifts—along with the role of education, economic modernization, and cultural advocacy initiatives. The study situates these developments within the framework of the United Nations Sustainable Development Goals (SDGs), focusing on gender equality, poverty alleviation, and equitable development. Field interviews, observations, and community-based projects inform this analysis, which highlights both progress and persisting challenges for Yi women.
Article
Open Access October 09, 2025

Simulation-Based Learning in Nursing Education: Perspectives of Student Nurses in the Philippines

Abstract
Simulation-based learning (SBL) is widely recognized as an effective educational approach that bridges theory and practice in nursing education. Despite its global adoption, limited research has examined the experiences of Filipino nursing students with SBL, particularly in resource-constrained settings. This study explored the perspectives of Bachelor of Science in Nursing students from a university in Metro Manila, Philippines, on the impact of SBL on their skills, emotional responses, and challenges encountered. A descriptive qualitative design was employed using purposive sampling of ten students who had participated in at least one SBL activity. Data were collected through semi-structured interviews and short written reflections and analyzed thematically following Braun and Clarke’s framework to capture nuanced experiences. Three major themes emerged from the analysis. First, students reported initial anxiety, nervousness, and stress during their early SBL experiences, which gradually transformed into confidence, adaptability, and resilience as they gained familiarity and competence. Second, SBL enhanced technical and cognitive skills such as clinical judgment, decision-making, teamwork, and patient-centered care, supporting students’ readiness for real-world practice. Third, students identified resource limitations, insufficient equipment, and time constraints as significant barriers to optimal learning, though these challenges also fostered creativity and perseverance. The findings demonstrate that SBL fosters technical competence, critical thinking, and professional growth but requires institutional support to address resource constraints and faculty development needs. This study underscores the importance of expanding SBL in Philippine nursing curricula to align with international best practices and to contribute to Sustainable Development Goals 3 (good health and well-being), 4 (quality education), and 5 (gender equality).
Article
Open Access June 03, 2025

Complexity Leadership Theory Integration into Nursing Leadership and Development in Addressing COVID-19 and Future Pandemics

Abstract
Complexity Leadership Theory (CLT) is a new and revolutionary concept in addressing healthcare crises worldwide. Its relevance and applications were tested during the COVID-19 pandemic. However, no definitive and comprehensive research has applied it to nursing leadership. Thus, this study examines the integration of CLT into nursing leadership to address the challenges posed by the pandemic. Through a systematic review of literature from PubMed, Scopus, and Web of Science, relevant studies were analyzed to determine how complexity leadership theory has been defined, conceptualized, and operationalized within the nursing leadership context. The findings reveal that traditional hierarchical leadership models are insufficient in a dynamic crisis environment like the pandemic. Instead, CLT’s framework, which encompasses adaptive, administrative, and enabling leadership, facilitates innovation, resilience, and effective interprofessional collaboration. Nurse leaders employing these strategies are better positioned to manage resource limitations, foster shared decision-making, and implement technological advancements in rapidly changing healthcare settings. Overall, this study underscores the potential of complexity leadership theory to transform nursing leadership practices by promoting continuous learning and empowerment, thereby enhancing crisis response and preparedness for future pandemics.
Systematic Review
Open Access April 10, 2025

Advancements in Pharmaceutical IT: Transforming the Industry with ERP Systems

Abstract
The pharmaceutical industry is undergoing a profound transformation driven by advancements in Information Technology (IT), with Enterprise Resource Planning (ERP) systems playing a pivotal role in reshaping operations. These systems offer integrated solutions that streamline key business processes, such as production, inventory management, supply chain optimization, regulatory compliance, and data integration, contributing significantly to operational efficiency and organizational agility. This paper explores the evolution and impact of ERP systems within the pharmaceutical sector, highlighting their contributions to overcoming the industry’s inherent challenges, including complex regulatory requirements, the need for accurate and real-time data, and the demand for supply chain resilience. The integration of cloud-based ERP solutions, the incorporation of emerging technologies like Artificial Intelligence (AI), Machine Learning (ML), and the Internet of Things (IoT), and enhanced data analytics capabilities have revolutionized pharmaceutical IT. These advancements not only reduce operational costs, improve forecasting accuracy, and enhance collaboration but also ensure compliance with stringent global regulations, such as Good Manufacturing Practices (GMP) and FDA guidelines. Moreover, ERP systems have been instrumental in managing the pharmaceutical supply chain, ensuring product traceability, and improving inventory control and order fulfillment processes. This manuscript examines how ERP systems enable pharmaceutical companies to maintain high standards of product quality, improve decision-making, and ensure the safety and efficacy of drugs through robust tracking and auditing mechanisms. A case study of a pharmaceutical company that implemented an ERP system demonstrates the tangible benefits, including increased operational efficiency, improved compliance rates, and enhanced customer satisfaction. 
However, despite the clear advantages, challenges such as customization complexities, data integration issues, and resistance to change remain. As the pharmaceutical industry continues to evolve, ERP systems will remain a cornerstone of digital transformation, facilitating smarter decision-making, better resource management, and enhanced collaboration across global operations. This paper also identifies future trends, including the potential of AI and blockchain technologies in further strengthening ERP systems and transforming the pharmaceutical landscape.
Review Article
Open Access January 22, 2025

Tech Transformations: Modern Solutions for Obstructive Sleep Apnea

Abstract
Recent advancements in the screening, diagnosis, and management of obstructive sleep apnea (OSA) have significantly improved patient outcomes. For screening, the use of home sleep apnea testing (HSAT) has become more prevalent, offering a convenient and cost-effective alternative to traditional in-lab polysomnography. HSAT devices have shown good specificity and sensitivity, particularly in patients with a high pre-test probability of OSA. In terms of diagnosis, advancements in wearable technology and mobile health applications have enabled continuous monitoring of sleep patterns and respiratory parameters. These tools provide valuable data that can be used to identify OSA more accurately and promptly. Additionally, machine learning algorithms are being integrated into diagnostic processes to enhance the accuracy of OSA detection by analyzing large datasets and identifying patterns indicative of the condition. Management of OSA has also seen significant progress. Continuous positive airway pressure (CPAP) therapy remains the gold standard, but new developments include auto-adjusting CPAP devices that optimize pressure settings based on real-time feedback. Mandibular advancement devices and hypoglossal nerve stimulation are emerging as effective alternatives for patients who are CPAP-intolerant. Furthermore, lifestyle interventions such as weight management, positional therapy, and exercise have been shown to complement medical treatments, leading to better overall outcomes. This review article highlights these advancements that collectively contribute to improved patient adherence, reduced symptoms, and enhanced quality of life for individuals with OSA.
Review Article
Open Access January 20, 2025

Deep Learning-Based Sentiment Analysis: Enhancing IMDb Review Classification with LSTM Models

Abstract
Sentiment analysis, a vital aspect of natural language processing, involves the application of machine learning models to discern the emotional tone conveyed in textual data. Typical use cases include businesses making informed decisions based on customer feedback, gauging employee sentiment to inform hiring or retention decisions, and classifying a text by topic (for example, whether it is about physics or chemistry), as is useful in search engines. The model leverages a sequential architecture, transforms words into dense vectors using an Embedding layer, and captures intricate sequential patterns with two Long Short-Term Memory (LSTM) layers. This model aims to effectively classify sentiments in text data using a 50-dimensional embedding and 20% dropout layers. The use of rectified linear unit (ReLU) activations enhances non-linearity, while the softmax activation in the output layer aligns with the multi-class nature of sentiment analysis. Both training and test accuracy were well over 80%.
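What each unit of those LSTM layers computes per time step can be sketched directly. The single-unit cell below uses hypothetical 1-D weights chosen for illustration, not the trained IMDb model:

```python
import math

def lstm_step(x, h, c, W):
    """One step of a single-unit LSTM cell: gates from input x, previous h and c."""
    sig = lambda z: 1 / (1 + math.exp(-z))
    def gate(name, act):
        z = (W[name + "_b"]
             + sum(wx * xi for wx, xi in zip(W[name + "_x"], x))
             + W[name + "_h"] * h)
        return act(z)
    i = gate("i", sig)        # input gate: how much new information to write
    f = gate("f", sig)        # forget gate: how much old cell state to keep
    o = gate("o", sig)        # output gate: how much cell state to expose
    g = gate("g", math.tanh)  # candidate cell update
    c_new = f * c + i * g           # cell state carries long-range memory
    h_new = o * math.tanh(c_new)    # hidden state fed to the next layer/step
    return h_new, c_new

# Hypothetical weights: unit input weights, no recurrence, zero biases.
W = {k + "_x": [1.0] for k in "ifog"}
W.update({k + "_h": 0.0 for k in "ifog"})
W.update({k + "_b": 0.0 for k in "ifog"})

h, c = 0.0, 0.0
for x in [[1.0], [1.0]]:  # feed a two-step sequence
    h, c = lstm_step(x, h, c, W)
print(h, c)
```

Stacking many such units, feeding them embedding vectors instead of raw scalars, and learning the gate weights by backpropagation yields the architecture the abstract describes.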
Article
Open Access January 09, 2025

Advances in the Synthesis and Optimization of Pharmaceutical APIs: Trends and Techniques

Abstract
The synthesis and optimization of Active Pharmaceutical Ingredients (APIs) is fundamental to pharmaceutical drug development, directly influencing drug efficacy, safety, and cost-effectiveness. Over recent years, significant advancements in synthetic methodologies and manufacturing technologies have transformed API production. This manuscript provides an overview of the latest innovations in API synthesis, focusing on key techniques such as green chemistry, continuous flow chemistry, biocatalysis, and automation. Green chemistry principles, including solvent substitution and catalytic reactions, have enhanced sustainability by reducing waste and energy consumption. Continuous flow chemistry offers improved reaction control, scalability, and safety, while biocatalysis provides an eco-friendly alternative for synthesizing complex and chiral APIs. Additionally, the integration of automation and advanced process control using machine learning and real-time monitoring has optimized production efficiency and consistency. The manuscript also discusses the challenges associated with regulatory compliance and quality assurance, highlighting the role of advanced analytical techniques such as HPLC, NMR, and mass spectrometry in ensuring API purity. Looking ahead, personalized medicine and smart manufacturing technologies, including blockchain for traceability, are expected to drive further innovation in API production. This review concludes by emphasizing the need for continued advancements in sustainability, efficiency, and scalability to meet the evolving demands of the pharmaceutical industry, ultimately enabling the development of safer, more effective, and environmentally responsible medicines.
Review Article
Open Access November 16, 2024

Digital Therapeutics: A New Dimension to Diabetes Mellitus Management

Abstract
Digital therapeutics (DTx) play a transformative role in diabetes management by leveraging technology to provide personalized, data-driven medical interventions. These tools enhance self-management by offering continuous monitoring and real-time feedback on glucose levels, diet, and physical activity. This personalized approach helps patients adhere to treatment plans and make informed lifestyle changes, leading to improved clinical outcomes such as reduced HbA1c levels and better overall diabetes control. The importance of DTx lies in their ability to make diabetes care more accessible and convenient. Mobile apps and telemedicine platforms enable patients to receive support and guidance from anywhere, reducing the need for frequent in-person visits. Additionally, DTx often include behavioral support features like reminders, educational content, and motivational tools, which are crucial for maintaining healthy habits and managing stress. Currently, the dynamics of DTx in diabetes are rapidly evolving, with increasing integration of artificial intelligence and machine learning to further personalize and optimize care. As the adoption of these technologies grows, they hold the potential to significantly improve patient outcomes and revolutionize diabetes management on a global scale. This article will focus on the benefits of novel digital therapeutics for prevention and management of type II diabetes that are currently available in the market.
Article
Open Access August 30, 2024

Exploring the Benefits of Forgiveness among Adolescents in Junior High Schools in Bimbilla in Ghana: A Comparative Study Based on Age

Abstract
This study investigates the benefits of forgiveness among adolescents in Junior High Schools (JHS) in Bimbilla, Ghana, focusing on the influence of age on the effectiveness of forgiveness interventions. The study adopted a mixed-method experimental design with a purposive selection of eight JHSs within the Nanumba North Municipality, from which 60 adolescents were randomly chosen to participate. The study employed the Enright Forgiveness Inventory, Depression Mood Scale, and Anger Self-Report items to assess participants' emotional states before and after the intervention. The interventions were structured around the REACH model of forgiveness, which included sessions aimed at helping participants identify sources of hurt, understand the concept of forgiveness, and recognise the emotional costs of holding onto grievances. Qualitative data were analysed into themes using an interpretative lens, and a two-way Analysis of Covariance (ANCOVA) was used to analyse the quantitative data. The findings revealed that exposure to forgiveness therapies significantly reshaped participants' negative emotions, leading to a marked decrease in feelings of anger and depression. Post-intervention assessments indicated that participants developed a more positive outlook towards their offenders, highlighting the transformative power of forgiveness in fostering emotional well-being. The study's results align with previous research, indicating that forgiveness interventions can effectively reduce negative emotional states and promote psychological resilience. The implications of these findings suggest that integrating forgiveness education into school curricula could be beneficial for enhancing the mental health of adolescents.
By fostering an environment that encourages forgiveness, educators and mental health professionals can help mitigate the adverse effects of unresolved emotional conflicts, ultimately contributing to healthier interpersonal relationships and improved overall well-being among young individuals.
Article
Open Access July 21, 2024

From Designed Object to Designed Context: Changes in Environmental Discourse in the First Twenty Years of the International Design Conference in Aspen

Abstract
Through an in-depth discussion of the International Design Conference in Aspen from 1951 to 1970, this paper explores how environmental discourse in the field of design shifted in its connotations over the course of the conference. Of particular importance in this process of discursive transformation was the 1970 conference, which erupted into conflict over the connotations of environmental discourse as environmental discourse from outside design impacted on and transformed the environmental discourse within it. This article examines the different concepts of the term 'environment' presented by speakers and participants at the International Design Conferences in Aspen from 1951 to 1970, focusing especially on the debates surrounding 'environment' at the 1970 conference. The article concludes by exploring the implications of this event and summarises the role of the 1970 International Design Conference in Aspen at this crucial turning point in environmental discourse. The aim is to explain and strengthen the significance of discourse at design conferences in the history of design, and to explore a new direction for design history research.
Article
Open Access May 05, 2024

Challenges facing the Church in dealing with Moral Issues in Ghana: the way forward

Abstract
The purpose of this study was to examine the challenges facing the Church in dealing with moral issues in Ghana, and the way forward. The study adopted a qualitative case study research design. The population of the study comprised leaders of Calvary Baptist Church – Adabraka and Shiashe. These included the Vice President of the Ghana Baptist Convention and departmental heads at the Ghana Baptist Convention headquarters. Others included the Senior Pastor of Calvary Baptist Church – Adabraka, with its satellite mission at Shiashe, as well as a cross-section of pastors of these churches, the church administrator, and the past and present directors of the Baptist Relief and Development Agency (BREDA). The purposive sampling technique was used to locate respondents for the study. The churches and participants were chosen because of their efforts in dealing with the causes of immorality confronting Ghanaian society. The main tool for data collection was a semi-structured interview guide. The data gathered were organised and analysed manually using emerging themes. The study revealed that the challenges the Baptist Church encounters in its effort to deal with moral issues are the politicisation of statements made by the clergy and a shortage of trained personnel who are willing and ready to champion the agenda of the church in that respect. Financial difficulties were also mentioned; specific reference was made to the effort made by the Ghana Baptist Convention to free the Trokosi girls, as it takes substantial financial resources to train and settle the freed girls. Regarding the way forward, it was suggested that the church ought to speak more and do what it is mandated by Christ to do to bring about transformation.
It is recommended that Churches seriously intensify education on what constitutes human rights and freedom, so that a clear understanding of the concept enables people to think through and adopt its good aspects to enhance their circumstances. Human rights defenders should exercise restraint when it comes to practices that are alien to Ghanaian values, laws and religious faith.
Review Article
Open Access February 19, 2024

The use of contemporary Enterprise Resource Planning (ERP) technologies for digital transformation

Abstract
Our lives are becoming more and more digital, and this affects how we work, study, communicate, and interact. Businesses are currently digitally transforming their information systems, procedures, culture, and strategy, and existing businesses and economies are severely disrupted by the digital revolution. The Internet of Things, microservices, and mobile services are examples of IT systems with numerous, dispersed, and very small structures that are made possible by digitization. The possibilities of cloud computing, mobile systems, big data and analytics, services computing, the Internet of Things, collaborative networks, and decision support have opened up numerous new business prospects over the years. Biological metaphors of living, dynamic ecosystems provide the logical basis for robust and self-optimizing run-time environments for intelligent business services and adaptable distributed information systems with service-oriented enterprise architectures. This has a significant effect on how digital services and products are designed from a value- and service-oriented perspective. The evolution of enterprise architectures and the shift from a closed-world modeling environment to a more flexible open-world composition establish the dynamic framework for highly distributed and adaptive systems, which are crucial for enabling the digital transformation. This study examines how enterprise architecture has changed over time, taking into account newly established, value-based relationships between digital business models, digital strategies, and enhanced enterprise architecture.
Review Article
Open Access December 03, 2023

Evolution of Enterprise Applications through Emerging Technologies

Abstract
The extensive globalization of services and rapid technological advancements driven by IT have heightened the competitiveness of organizations in introducing innovative products and services. Among the noteworthy innovations is enterprise resource planning (ERP). An integral field in computer science, known as artificial intelligence (AI), is undergoing a transformative integration into various industries. Grasping the concept of artificial intelligence and its application in diverse business applications is crucial, given its broad and intricate nature. The primary focus of this paper is to delve into the realm of artificial intelligence and its utilization within enterprise resource planning. The study not only explores artificial intelligence but also delves into related concepts such as machine learning, deep learning, and neural networks in greater detail. Drawing upon existing literature, this research examines various books and online resources discussing the intersection of artificial intelligence and ERP. The findings reveal that the impact of AI is evident as businesses attain heightened levels of analytical efficiency across different ERP domains, thanks to remarkable advancements in AI, machine learning, and deep learning. Artificial intelligence is extensively employed in numerous ERP areas, with a particular emphasis on customer support, predictive analysis, operational planning, and sales projections.
Review Article
Open Access November 01, 2023

Role of Enterprise Applications for Pharmaceutical Drug Traceability

Abstract
The role of enterprise applications in pharmaceutical industries is driving the digital transformation of various critical processes, and one process benefiting from this innovation is pharmaceutical drug traceability. The industry grapples with challenges such as a lack of transparency, difficulties in tracking products, a deficit of trust, and issues related to shipping expired products. To address these concerns, blockchain technology has been harnessed as an enterprise application. Notably, counterfeit drug prevention emerged as the most prevalent application category, aligning with the pharmaceutical industry's primary objective. Blockchain is an emerging innovation that is finding enterprise applications in various industries, including healthcare, where blockchain networks are being utilized to securely store and exchange patient data across hospitals, diagnostic laboratories, pharmacies, and medical practitioners. These enterprise applications can effectively identify and mitigate critical errors, including potentially hazardous ones, within the realm of healthcare. Consequently, this enterprise technology holds the promise of enhancing the efficiency, security, and transparency of medical data sharing within the healthcare system, and it offers valuable tools for medical institutions to gain insights and improve the analysis of medical records. The paper visually represents the diverse capabilities, enablers, and unified workflow process of blockchain technology in supporting healthcare on a global scale. Additionally, it presents a thorough discussion of fourteen significant applications of blockchain in healthcare, underscoring its pivotal role in addressing issues such as deception in clinical trials.
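The tamper-evidence that makes blockchain attractive for drug traceability, as described above, comes from hash-chaining records. The minimal Python sketch below illustrates the idea; the batch names, event labels, and two-block chain are illustrative assumptions, not details from the paper:

```python
import hashlib
import json

def make_block(record, prev_hash):
    """Create an append-only block whose hash covers both the record and
    the previous block's hash, chaining the two together."""
    body = {"record": record, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify_chain(chain):
    """Recompute every hash and check each block points at its predecessor."""
    prev = "0" * 64
    for blk in chain:
        body = {"record": blk["record"], "prev": blk["prev"]}
        ok_hash = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest() == blk["hash"]
        if blk["prev"] != prev or not ok_hash:
            return False
        prev = blk["hash"]
    return True

# Hypothetical traceability events for one drug batch.
genesis = make_block({"batch": "LOT-001", "event": "manufactured"}, "0" * 64)
shipped = make_block({"batch": "LOT-001", "event": "shipped"}, genesis["hash"])
chain = [genesis, shipped]
```

Altering any earlier record after the fact changes its recomputed hash, so `verify_chain` fails for the whole chain; production traceability systems layer distribution and consensus on top of this basic structure.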
Review Article
Open Access December 23, 2022

Climate Change's Impact on Agriculture and Food Security: An Opportunity to Showcase African Animal Genetic Resources

Abstract
One of the pressing issues facing humanity is ensuring sustained global food security in the face of the devastating effects of climate change; this challenge is particularly acute on the African continent. Here, I present an opinion piece identifying local animal genetic resources as an "African leverage point" that provides the best chance to cushion rural folk against climate change and to enhance environmental sustainability and food security in Africa. When it comes to boosting food production, coping with climate change, or bolstering the delivery of a wide range of ecosystem services, I believe that African animal genetic resources are essential alternatives for the sustainable growth of the livestock industry and its contribution to food security. Africa needs to support and develop indigenous animal genetic resources in order to meet the basic food needs of more than 1 billion people, address numerous environmental issues with continental implications, and build more effective and resilient food systems with the greatest impact on food security. The diversity of indigenous animal resources, and actions to support this unique group, could provide the boost in protein that is lacking from healthy diets in Africa. This leverage point in the African food system can influence the priorities of nonprofit organizations, foundations, governments, citizens' groups, and companies. Owing to continuous food insecurity, which appears to be worsening with climate change and makes it even harder to accomplish the SDGs on the continent, Africa has paid a hefty price for being misled about the worth of its own animal genetic resources. On the contrary, it is highly improbable that a strategy to improve food security and rural livelihoods that undermines the utilization of indigenous animal genetic resources will be viable in the long term.
If Africa commits itself, and its resources, fully to putting indigenous animal genetic resources at the forefront of combating food insecurity and accelerating the achievement of the SDGs, it can achieve more under the adverse environmental conditions induced by prevailing climate change. In our opinion, we would not have the ongoing food problems, even in the face of climate change, had Africa over the years implemented the necessary mechanisms to develop and promote local animal genetic resources. What lies ahead in terms of the effect of climate change on food security in Africa is anyone's guess; whatever it is, a portfolio of continentally adapted indigenous animal genetic resources is ready to handle it. The development and promotion of African animal genetic resources should be part of a continental strategy to transform smallholder animal production by 2050, in line with the goals of achieving the SDGs, improving rural household food security, and bringing the rural economy prosperity, resilience, sustainability, and all other desired animal-related food outcomes for healthy rural diets. African animal genetic resources are the most important yet underutilized resource for addressing ongoing food insecurity. The responsible use of local animal genetic resources through climate-smart animal husbandry practices also contributes to food security, rural development and increased employment opportunities. African genetic improvement programs involving indigenous animal genetic resources must be assessed with regard to local agriculture and livestock development aspirations, appropriateness to local realities and livelihood security, as well as environmental friendliness. Animal agriculture can fill the enormous gaps in the continent's food supply if this animal group receives adequate attention and is integrated properly into the crop and livestock systems that characterize the smallholder farming sector in Africa.
Because they have evolved over time to accommodate the various climatic conditions and environmental pressures on the continent, Africa's native animal genetic resources are particularly resilient. Indirectly, the impact of climate change offers a chance to use native animal genetics from Africa. The use of local animal genetic diversity has the potential to substantially improve Africa's food security landscape and should therefore be given special consideration across sociocultural, environmental, and economic dimensions, with regard for factors of specific interest to smallholder farmers. African animal genetic resources have contributed significantly to the food and nutrition security of the millions of people in their communities of origin and custody in Africa. The purpose of this perspective piece is to educate the reader about the fundamental mechanisms that control the use of continental animal genetic resources and how these mechanisms can be shaped in the future to improve food security in Africa. The discussion provides in-depth insight into the pertinent literature for understanding the significance of local animal genetic resources in terms of their contribution to food security in Africa.
Perspective
Open Access July 23, 2022

Peer-To-Peer Lending in US and China: A Guide for Emerging Market Countries

Abstract
In the mid-2000s, a new fintech era commenced, known as “crowd lending” or “FinTech credit”, whereby credit activities are realized online through internet platforms that match borrowers with lenders (investors). These lending activities are termed Peer-to-Peer (P2P) lending. The purpose of this study is to elaborate the functioning and regulatory framework of P2P lending in the US and China. These two countries can be considered two conspicuous examples of the application of P2P lending, especially in terms of regulation: China transformed its P2P market in 2015 after a long period of loose regulation, while the US applied strict regulation to the market from the very beginning. In this way, a set of regulatory terms is proposed, especially for emerging market countries. P2P lending can contribute to the economic development of emerging market countries if it is applied properly. The contribution of this study to the newly developing literature is to provide a comparison, as well as a set of regulatory terms to be applied in emerging market countries.
Article
Open Access July 04, 2022

An appraisal of Social Studies Teachers’ Perceptions of Teachers’ Pedagogical Content Knowledge

Abstract
The objective of the study was to assess the perceptions of Junior High School (JHS) Social Studies teachers in the Yilo Krobo Municipality of the Eastern Region of Ghana on teachers’ Pedagogical Content Knowledge (PCK). The study adopted Shulman's theory of Pedagogical Content Knowledge as its theoretical framework. The philosophical approach upon which the study hinges is pragmatism, combining the ideologies of interpretivism and positivism. The study used a mixed-methods approach with a descriptive survey design, and a random sampling technique. The study participants were JHS Social Studies teachers in the Yilo Krobo Municipality, Ghana: eighty (80) of the one hundred and two (102) JHS Social Studies teachers (78.43%) were selected from the fifty-four JHSs in the Municipality. Both a questionnaire and an interview guide were used for data collection. The survey data were analyzed using descriptive statistics and the interview data using content analysis. The study indicated that at the heart of the PCK concept is the idea that 'deep knowledge' of content is essential for effective teaching and cannot be taken for granted; that it has a significant bearing on teaching and student learning; and that it is used as a frame to define professional teaching knowledge. PCK also provides the knowledge uniquely necessary for the transformation of the different types of knowledge required for Social Studies teaching, and it evolves over time through progressive awareness of students' needs, while a wealth of content knowledge is imperative for the development of comprehensive pedagogical content knowledge. The paper recommends that the Ghana Education Service (GES) conduct regular in-service training for teachers to enhance their PCK, enabling them to select appropriate TLMs and pedagogical approaches that foster meaningful learning for students.
Article
Open Access May 15, 2022

Kinetic, Equilibrium and Thermodynamic Study of the Adsorption of Pb (II) and Cd (II) Ions from Aqueous Solution by the Leaves Biomass of Guava and Cashew Plants

Abstract
The plant leaves used as adsorbents in this study were guava plant leaves (GPL) and cashew plant leaves (CPL). The samples were collected within Gombe State. The batch adsorption method was used to study the adsorption process, and Fourier Transform Infrared spectroscopy (FT-IR), Scanning Electron Microscopy (SEM) and X-Ray Diffraction (XRD) were used for characterization. The results were promising and in agreement with most of the literature; various percentage removals were obtained for Pb2+ and Cd2+ (GPL and CPL) at optimum conditions. The equilibrium data fitted well with both the Langmuir and Freundlich isotherm models. The Langmuir model fitted well for Pb2+ (CPL), with an R2 value of 0.9855, and for Cd2+ (GPL and CPL), with R2 values of 0.9945 and 0.9948, while Pb2+ (GPL), with a correlation coefficient of 0.9116, was best described by the Freundlich isotherm model. Pseudo-first-order and pseudo-second-order models were used to test the kinetics, with the pseudo-second-order model fitting better than the first-order kinetics. The thermodynamic study shows that ΔG is negative in most cases, except for Cd2+ (GPL), where ΔG is positive, while ΔH and ΔS are positive in some cases, indicating endothermic and spontaneous adsorption processes respectively, and negative in others. Based on this study, GPL and CPL could be used as natural adsorbents to remove Pb2+ and Cd2+ heavy metals from wastewater and the environment due to their high removal efficiencies.
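Isotherm fits of the kind reported above are commonly obtained from the linearised forms of the Langmuir and Freundlich models. The Python sketch below illustrates that procedure on synthetic equilibrium data; the Ce/qe values and resulting parameters are illustrative assumptions, not the paper's measurements:

```python
import math

def linfit(xs, ys):
    """Ordinary least-squares line y = a*x + b; returns (a, b, r_squared)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    a = sxy / sxx
    b = my - a * mx
    ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return a, b, 1.0 - ss_res / ss_tot

# Synthetic equilibrium data: Ce (mg/L) and qe (mg/g). Illustrative only.
Ce = [2.0, 5.0, 10.0, 20.0, 40.0]
qe = [1.8, 3.6, 5.5, 7.4, 8.8]

# Langmuir, linearised: Ce/qe = (1/qmax)*Ce + 1/(KL*qmax)
slope, intercept, r2_L = linfit(Ce, [c / q for c, q in zip(Ce, qe)])
qmax = 1.0 / slope           # monolayer capacity (mg/g)
KL = slope / intercept       # Langmuir constant (L/mg)

# Freundlich, linearised: log10(qe) = log10(KF) + (1/n)*log10(Ce)
fslope, fintercept, r2_F = linfit(
    [math.log10(c) for c in Ce], [math.log10(q) for q in qe]
)
KF = 10.0 ** fintercept      # Freundlich capacity factor
n_inv = fslope               # heterogeneity exponent 1/n
```

Comparing `r2_L` and `r2_F` for each metal-adsorbent pair is the model-selection step behind R2 comparisons like those quoted in the abstract.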
Article
Open Access November 16, 2021

Determination of Deflection of the Vertical Components: Implications on Terrestrial Geodetic Measurement

Abstract
The deflection of the vertical is an important parameter that combines both physical (astronomic) and geometric (geodetic) quantities. It is critical in areas such as datum transformation, reduction of astronomic observations to the geodetic reference surface, geoid modelling and geophysical prospecting. Although the deflection of the vertical is a physical property of the Earth's gravitational field, and almost all terrestrial survey measurements made on the Earth's surface, with the exception of spatial distances, are referenced to the Earth's gravity vector because a spirit bubble is usually used to align survey instruments, it has been ignored in most geodetic computations and adjustments. This research therefore aims to compute the components of the deflection of the vertical for part of Rivers State using a geometric method. The method integrates Global Positioning System (GPS) observations, to obtain the geodetic coordinates of points, with precise levelling, to obtain the orthometric heights of those points within the study area. Using least squares implemented in a MATLAB program, the estimated deflection-of-the-vertical components for the test station SVG/GPS-002 were -0.0473” and 0.0393” for the north-south and east-west components respectively, with associated standard errors of ±0.0093” and ±0.0060”. The deflection of the vertical was also computed independently from gravimetric models of the Earth as ξ = 0.0204” ± 0.0008814”, η = -0.0345” ± 0.0014” (EGM 2008); ξ = 0.0157” ± 0.000755”, η = -0.0246” ± 0.0012” (EGM 1996); and ξ = -0.0546” ± 0.0006014”, η = -0.0208” ± 0.0006014” (EGM 1984). A two-tailed hypothesis test reveals that the estimated deflection components are statistically acceptable at the 95% confidence level. It was observed that the effect of the deflection of the vertical is directly proportional to the length of the geodetic baseline.
Therefore, including the derived components of the deflection of the vertical in the ellipsoidal model will yield higher observational accuracy, since an ellipsoidal model alone is not tenable for high-quality work owing to the observational error it introduces. It is therefore important to include the determined deflection-of-the-vertical components for Rivers State, Nigeria.
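One common geometric route to deflection components of the kind described above is to form geoid undulations N = h - H from GPS ellipsoidal heights and levelled orthometric heights, fit a plane to N by least squares, and take the components from the north and east gradients. The Python sketch below illustrates this on synthetic, exactly planar station data; the coordinates, undulation values, and simple plane model are illustrative assumptions, not the paper's survey data or MATLAB implementation:

```python
import math

# Synthetic stations: local north/east coordinates (m) and geoid undulation
# N = h_GPS - H_levelled (m). Values are illustrative, not survey data.
stations = [
    (0.0,    0.0,    21.0000),
    (1000.0, 0.0,    20.9995),
    (0.0,    1000.0, 21.0003),
    (1000.0, 1000.0, 20.9998),
    (500.0,  500.0,  20.9999),
]

def plane_fit(pts):
    """Least-squares fit of N = a + b*north + c*east via normal equations,
    solved with Gauss-Jordan elimination. Returns [a, b, c]."""
    ata = [[0.0] * 3 for _ in range(3)]
    aty = [0.0] * 3
    for n, e, N in pts:
        row = (1.0, n, e)
        for i in range(3):
            aty[i] += row[i] * N
            for j in range(3):
                ata[i][j] += row[i] * row[j]
    for i in range(3):
        pivot = ata[i][i]
        ata[i] = [v / pivot for v in ata[i]]
        aty[i] /= pivot
        for k in range(3):
            if k != i:
                f = ata[k][i]
                ata[k] = [vk - f * vi for vk, vi in zip(ata[k], ata[i])]
                aty[k] -= f * aty[i]
    return aty

a, b, c = plane_fit(stations)

# Deflection components taken as minus the geoid slope along north and east,
# converted from radians to arc seconds.
RAD2ARCSEC = 180.0 / math.pi * 3600.0
xi = -b * RAD2ARCSEC    # north-south component (arc seconds)
eta = -c * RAD2ARCSEC   # east-west component (arc seconds)
```

On this synthetic plane the fit recovers gradients of -5e-7 (north) and 3e-7 (east), i.e. xi of roughly 0.10” and eta of roughly -0.06”, comparable in magnitude to the sub-arcsecond values quoted in the abstract.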
Article
Open Access September 30, 2021

Synthesis, Characterization and Catalytic Application of Magnetic Iron Nanoparticles (Fe3O4) in Biodiesel Production from Mahogany (Khaya senegalensis) Seed Oil

Abstract
Magnetic iron nanoparticles (Fe3O4) were synthesized and characterized using Fourier Transform Infrared spectroscopy (FT-IR), UV-Visible spectrophotometry, Scanning Electron Microscopy (SEM) equipped with an Energy Dispersive X-ray spectrometer (EDX), and X-ray Diffraction (XRD). The synthesized nanocatalyst was used in the transesterification of mahogany seed oil with methanol. The optimized reaction conditions gave a yield of 88% at a catalyst concentration of 1.5 wt.%, a methanol-to-oil volume ratio of 5:1, a reaction temperature of 60 °C, and a reaction time of 120 minutes. The Fe3O4 nanoparticles were regenerated from the reaction mixture and reused for several cycles under the optimum conditions obtained in the present study. The results showed that the biodiesel yield decreased with an increasing number of cycles when the regenerated catalyst was used; however, good conversion (81.9%) was obtained up to the 5th cycle. The elemental analysis of the synthesized magnetic iron nanoparticles (Fe3O4) revealed the highest proportion of iron, with atomic and weight concentrations of 64.37% and 74.40% respectively, followed by oxygen, with atomic and weight concentrations of 34.27% and 24.50% respectively. It could be concluded that the synthesized nanocatalyst would serve as an excellent catalyst for the transesterification of vegetable oils.
Article
Open Access November 16, 2023

Innovations in Agricultural Machinery: Assessing the Impact of Advanced Technologies on Farm Efficiency

Abstract
Progress in the development and adoption of technological innovations is instrumental in enhancing the efficiency of production systems across the globe. Through the introduction of cost-efficient and high-performing technologies, countries can both reduce the resource use intensity of their economies and boost the global supply of essential products. The focus of this study is to analyze the application of advanced machinery and mechanisms within the agricultural sector, a primary industry that acts as a major contributor to the gross domestic product (GDP) of many nations. Specifically, this paper provides an in-depth review of the latest impact assessments based on analytical and modeling tools conducted on agricultural machinery and production technologies. Our findings highlight the positive role played by scientific progress and innovation in driving the competitiveness, growth and improved sustainability of the agricultural sector. Over the years, advanced technologies have accelerated the development and modernization of machinery, equipment, and processes in farming. Typically, modern machinery and equipment have enabled large-scale production on farms, enhancing the cost-efficient use of both land and labor, as well as the capacity and timeliness in performing essential agricultural operations. The rapid diffusion of technical advancements has further contributed to resource savings, productivity growth, and the overall transformation of agricultural value chains. Accordingly, the implementation of appropriate enabling conditions is of vital importance in encouraging the widespread integration of technologies in agriculture, not only boosting productivity along the agri-food chain but also yielding widespread social, economic, and environmental benefits.
Review Article
Open Access November 16, 2023

Zero Carbon Manufacturing in the Automotive Industry: Integrating Predictive Analytics to Achieve Sustainable Production

Abstract
This forward-looking paper suggests that transitioning the automotive industry towards a zero-carbon ecosystem, from materials to end-of-life, can be accomplished through disruptive zero-carbon manufacturing in the broad area of all-electric vehicle production technology. To accomplish zero-carbon-emission automotive manufacturing in the vehicle assembly domain, future paradigms must converge on decoupling carbon dioxide emissions from automobile manufacturing and use, through design, processing, and manufacturing conditions. The envisioned zero-carbon-emission vehicle manufacturing domain consists of two complementary components: (a) making more efficient use of energy and (b) reducing carbon in the energy used. This paper presents the status of key scientific and technological advancements needed to bring today's manufacturing model to a zero-carbon ecosystem for the entire automotive industry of tomorrow. It suggests the groundbreaking application of dynamic and distributed predictive scheduling algorithms and open sensing and visualization technology to meet zero-carbon-emission vehicle manufacturing goals. Power-aware high-performance computing clusters have recently become a viable solution for sustainable production. Advances in scalable and self-adaptive monitoring, predictive analytics, timeline-based machine learning, and digital replicas of cyber-physical systems are also seen co-evolving in the zero-carbon manufacturing future. These methods are inspired by initiatives to decouple gross domestic product growth from energy-related carbon dioxide emissions. Stakeholders could co-design and implement shared roadmaps to transition the automotive manufacturing sector, with relevant societal and environmental benefits. The automated mobility sector already offers an industry-leading example of transforming an automotive production facility to carbon-neutral status.
The conclusions from this paper challenge automotive manufacturers to engage in industry offsetting and carbon tax programs to drive continuous improvement and circular vehicle flows via a multi-directional zero-carbon smart grid.
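One way to read the "dynamic and distributed predictive scheduling" idea concretely is as carbon-aware job placement: flexible production jobs are assigned to the forecast hours with the lowest grid carbon intensity. The sketch below is a minimal illustration under that assumption; the job sizes, capacity limit, and intensity figures are invented for illustration and are not from the paper.

```python
# Sketch of a carbon-aware scheduler: flexible manufacturing jobs are
# greedily assigned to the forecast hours with the lowest grid carbon
# intensity, subject to a per-hour energy capacity.

def schedule_jobs(jobs_kwh, carbon_forecast, capacity_per_hour):
    """Place each job (energy in kWh) into the cleanest hour that still
    has spare capacity; returns {job_index: hour_index}."""
    hours = sorted(range(len(carbon_forecast)), key=lambda h: carbon_forecast[h])
    remaining = {h: capacity_per_hour for h in range(len(carbon_forecast))}
    assignment = {}
    # Place the largest jobs first so they get the cleanest slots.
    for job in sorted(range(len(jobs_kwh)), key=lambda j: -jobs_kwh[j]):
        for h in hours:
            if remaining[h] >= jobs_kwh[job]:
                assignment[job] = h
                remaining[h] -= jobs_kwh[job]
                break
    return assignment

# gCO2/kWh forecast for four hours; hour 2 is the cleanest.
forecast = [420, 300, 120, 250]
plan = schedule_jobs([50, 30, 20], forecast, capacity_per_hour=60)
```

A production scheduler would of course add job deadlines, machine constraints, and a rolling forecast; this greedy placement only shows the decoupling principle of shifting energy use toward low-carbon windows.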
Review Article
Open Access December 27, 2023

Understanding the Fundamentals of Digital Transformation in Financial Services: Drivers and Strategic Insights

The financial services sector is undergoing considerable changes in its operations due to developments in technology and the embrace of digital platforms. This evolution is changing established concepts of business models, consumers, and service delivery channels. Financial services firms are changing the way they work through digital transformation, driven by developments in technology, changes in customer needs, and an increased emphasis on sustainability. Understanding the opportunities, risks, and new trends in digital transformation is the focus of this paper. Opportunities include efficient real-time decision-making processes, increased transparency, and better process controls, which are balanced against the threats of change management, dubious organization-technology fit, and high implementation costs. The study also examines recent advancements, including the application of machine learning and artificial intelligence, developments in mobile and online banking, the integration of blockchain, and an increasing focus on security and personalised banking. A literature review yields findings from different studies on rural financial services, the evolution of blockchain, drivers of digital transformation, cloud-based learning approaches, and emerging sustainability practices. All of these results suggest that stronger strategic planning, analytics, and a greater focus on aligning transformations with organisational objectives should be pursued. Hence, these research findings add to the existing literature by determining how innovative and digital technologies are likely to transform the financial services sector and advance sustainability.
Review Article
Open Access December 27, 2019

The Role of Neural Networks in Advancing Wearable Healthcare Technology Analytics

Neural networks are bringing about a transformation in wearable healthcare technology analytics. These networks are able to analyze vast amounts of data to help in making decisions concerning patient care. Advancements in deep learning have brought neural networks to the forefront, making data analytics a straightforward process. This study will help unveil the use of ICT and AI in medical healthcare technology, surveying some industry giants. Wearable healthcare technologies are becoming more popular every day. These technologies facilitate collecting, monitoring, and sharing every vital aspect of the human body necessary for diagnosing and treating an ailment. With the advent of global digitization, health data storage and systematic analysis are taking shape to ensure better diagnostics and preventive and predictive healthcare. Healthcare analytics powered by neural networks can significantly improve health outcomes, maximizing individuals' potential and quality of life. The breadth and possibilities of connected devices are getting wider. From personal activity monitoring to quantifying every bit of health statistics, connected devices are making an impact in measurement, management, and manipulation. In healthcare, early diagnosis can be a lifesaver, and data analytics can help in a big way to make predictions that save lives. We are in another phase of the digitization era: "Neural Network and Wearable Healthcare Technology Analytics." A neural network can be conceived as an adaptive system made up of a large number of neurons connected in multiple layers, processing data in a way similar to the human brain. Using a collection of algorithms, most neural networks are composed of 'input' and 'output' layers, along with intermediate hidden layers of neurons.
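The layered structure described above (input layer, hidden layers, output layer) can be sketched as a single forward pass. This is a minimal illustration only: the weights are fixed by hand for reproducibility, and the choice of two wearable-style inputs and one output score is an assumption, not something specified in the abstract; real networks learn their weights from data.

```python
# Minimal forward pass through a tiny network: two inputs, one hidden
# layer of three neurons, one output neuron, all with sigmoid activation.
import math

def forward(x, w_hidden, w_out):
    """hidden = sigmoid(W_h x); output = sigmoid(W_o hidden)."""
    sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))
    hidden = [sigmoid(sum(w * xi for w, xi in zip(row, x))) for row in w_hidden]
    return [sigmoid(sum(w * hi for w, hi in zip(row, hidden))) for row in w_out]

# Two inputs (e.g. normalized heart rate and step count -- illustrative),
# three hidden neurons, one output (e.g. an anomaly score in (0, 1)).
w_hidden = [[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]]
w_out = [[0.6, -0.1, 0.9]]
score = forward([0.7, 0.2], w_hidden, w_out)
```

Training (adjusting `w_hidden` and `w_out` against labeled data) is what makes such a network an "adaptive system" in the abstract's sense; the forward pass above only shows how data flows through the layers.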
Review Article
Open Access December 27, 2019

Revolutionizing Patient Care and Digital Infrastructure: Integrating Cloud Computing and Advanced Data Engineering for Industry Innovation

This work details how the integration of cloud computing and advanced data engineering can innovate and reshape patient care and digital infrastructure. In the healthcare sector, cloud services offer the necessary support to generate digitally-oriented services and service kits. These services can offer high availability, low latency, and on-demand scaling capabilities, while following the strictest data protection laws and regulations. In turn, these services can be combined with data engineering techniques to construct an ecosystem that enhances and adds an optimized data layer on top of any cloud environment. This ecosystem includes technologies to acquire, process, and manage healthcare data while respecting all regulatory obligations and institutions, and it can be part of a comprehensive digitalization strategy. The objective is to augment the healthcare services that the industry offers by leveraging healthcare data and AI technologies. Designed services, processes, and technologies can be described either as industry-agnostic services or as healthcare-specific services that process and manage electronic healthcare records (EHR). Industry-agnostic services offer a set of tools and methodologies to conduct optimized data experiments. The goal is to exploit any variety, velocity, volume, and veracity of medical data. Healthcare-specific services offer a set of tools and methodologies to connect to any common EHR vendor in a privacy-preserving manner. Participating companies are thus able to hold, share, and make use of healthcare data in real time. The proposed architecture can be transformative for the healthcare industry, opening up and facilitating experimentation on new and scalable service models. The transition to a more digital health approach would help overcome the limits encountered in traditional settings.
Limitations in the availability of healthcare facilities and healthcare professionals have underpinned the increasing share of telemedicine in the care process. However, the record-keeping of patients who undergo care outside of traditional healthcare facilities is often missing and can severely affect the continuity of treatment. Identifying new methods to implement disease prevention and early intervention processes is crucial to avoid more extensive treatment and to support those on multiple lines of therapy. For chronic patients, having a service available that monitors the state of health and intervenes when parameters go outside the desired range is crucial. However, the same patients are the most dependent on the decisions of care providers; a second opinion might be given remotely, which the patient can access on-demand at any time. To address these different kinds of services, an ecosystem built around a rich data layer is outlined, able to live and operate seamlessly in any cloud environment. This future work's envisioned outcome is the rapid evolution and re-definition of the European healthcare landscape.
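The monitoring service described for chronic patients, one that raises an alert when a parameter leaves its desired range, can be sketched in a few lines. The parameter names and reference ranges below are illustrative assumptions; a real service would use per-patient ranges set by clinicians.

```python
# Sketch of a remote-monitoring check: each vital sign is compared
# against a reference range and out-of-range parameters are flagged.

REFERENCE_RANGES = {
    "heart_rate": (50, 100),   # beats per minute (illustrative)
    "spo2": (94, 100),         # oxygen saturation, %
    "systolic_bp": (90, 140),  # mmHg
}

def check_vitals(readings):
    """Return the list of parameters that fall outside their range."""
    alerts = []
    for name, value in readings.items():
        low, high = REFERENCE_RANGES[name]
        if not (low <= value <= high):
            alerts.append(name)
    return alerts

alerts = check_vitals({"heart_rate": 118, "spo2": 97, "systolic_bp": 135})
```

In the envisioned cloud ecosystem this check would run continuously against wearable data streams, with alerts routed to care providers rather than returned to the caller.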
Review Article
Open Access December 27, 2019

Data-Driven Innovation in Finance: Crafting Intelligent Solutions for Customer-Centric Service Delivery and Competitive Advantage

Innovations in computing and communication technologies are reshaping finance. The seismic changes are casting uncertainty about the future of financial services. On one hand, fintech evangelists project a rosy future, asserting that fast-moving algorithms can deliver low-cost financial services intuitively, customized to meet robust consumer expectations. On the other hand, many finance veterans fret that the traditional banking model could be disintermediated, bleeding banks via a ‘death by a thousand cuts’, reducing them to passive portfolio holders with no direct customer relationship, eclipsed by digital giants which use their enormous treasure troves of customer data to offer banking as an added service at nearly zero cost. Amidst the upbeat technological promises and apocalyptic forebodings, there are two constant, mostly agreed-upon, truths. The first is the vital importance of data. Advances in the internet, cloud computing, and record-keeping technologies are producing an ‘exponential growth in the volume and detail of data’. Some of this big data are personal information. Smartphones are deployed in almost all developed and emerging economies, serving as little spies; tracking and recording the location histories, social networks, and app usage of their unsuspecting owners, often with a great degree of precision. ‘People are walking data-factories’ in this ‘mobile digital society’. Data are the by-product of these global exchanges, electronic commerce and communication, and financial transactions. To take Facebook as an example, its users share some 30 million updates and posts a day, and it hosts personal information on 2.23 billion users. To the alarm of the uninformed public, much of this information is available for commercial harvest. The second constant is the rise of intelligent solutions.
Consumers today, whether it is disclosed to them or not, are fed tailored clothes, music, films, holiday packages, and almost anything else they might like, notably through dynamic pricing that varies in accordance with individual profiles and through personalized search results. The availability of powerful computers has enabled comparable applications intended to make the system more responsive to customer profiles and desires, or to capitalize on competitive business opportunities. Such changes will transform the financial industry and occupy a prominent position among the mechanisms of policy competition, reshaping the way in which financial services are delivered and consumed on the demand side.
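The profile-based dynamic pricing mentioned above can be illustrated with a deliberately simple sketch: a base price adjusted by a couple of demand signals drawn from an individual profile. The signal names (`recent_views`, `price_sensitive`) and the weights are invented for illustration; a real system would learn such adjustments from behavioral data.

```python
# Hedged sketch of profile-based dynamic pricing: a base price is
# scaled up or down by simple, hand-picked signals from a user profile.

def personalized_price(base_price, profile):
    """Scale a base price by illustrative demand signals."""
    multiplier = 1.0
    if profile.get("recent_views", 0) > 5:      # strong expressed interest
        multiplier += 0.10
    if profile.get("price_sensitive", False):   # discount-seeking segment
        multiplier -= 0.15
    return round(base_price * multiplier, 2)

quote = personalized_price(200.0, {"recent_views": 8, "price_sensitive": False})
```

Even this toy version shows why the practice raises the fairness concerns the passage alludes to: two customers can receive different prices for the same product based solely on what their data reveals about them.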
Review Article
Open Access December 27, 2021

Advancing Healthcare Innovation in 2021: Integrating AI, Digital Health Technologies, and Precision Medicine for Improved Patient Outcomes

Advances in wearables, sensors, smart devices, and electronic health records have generated patient-oriented longitudinal data sources that can be analyzed with advanced analytical tools, creating enormous opportunities to understand patient health conditions and needs and transforming healthcare significantly from conventional paradigms to more patient-specific and preventive approaches. Artificial intelligence (AI) with a machine learning methodology is prominently considered, as it is uniquely suitable for deriving predictions and recommendations from complex patient datasets. Recent studies have shown that precise data aggregation methods play an important role in the precision and reliability of clinical outcome distribution models. There is an essential need to develop an effective and powerful multifunctional machine learning platform to enable healthcare professionals to comprehend challenging, multifactorial biomedical datasets, to understand patient-specific scenarios, and to make better clinical decisions, potentially leading to optimal patient outcomes. There is a substantial drive to develop the networking and interoperability of clinical systems, the laboratory, and public health. These steps are delivered in concert with efforts to enable useful analytic tools and technologies for making sense of the eruption of patient information from various sources. However, the full potential of this technology can only be realized when the ethical, legal, and social challenges related to the privacy of healthcare information are successfully addressed. The public and the media need to be informed about the capabilities and limitations of these technologies, and a balanced public debate on healthcare data privacy is paramount. While this debate is ongoing, measures must progress from preventing patient data protection abuses towards realizing the full potential of AI technology for the health system, with benefits for all stakeholders.
Any protection program should be based on fairness, transparency, and a full commitment to data privacy. Ongoing, innovative systems that use AI to manage clinical data and analyses are proposed. These tools can be used by healthcare providers, especially in defining specific scenarios related to biomedical data management and analysis. These platforms ensure that the significant and potentially predictive parameters associated with the diagnosis, treatment, and progression of a disease are recognized. With the systematic use of these solutions, this work can contribute to noticeable improvements in the provision of real-time, personalized, and efficient medicine at a reduced cost [1].
Case Report
Open Access December 27, 2021

Revolutionizing Risk Assessment and Financial Ecosystems with Smart Automation, Secure Digital Solutions, and Advanced Analytical Frameworks

For years, risk assessment and financial calculations have been based on mathematical, statistical, and actuarial studies of existing and historical data. The manual process of building datasets, processing data, deriving trends, identifying periodicities, and analyzing diagnostics is extremely expensive and time-consuming. With the automation and evolution of data science technologies, organizations are now bringing in niche data, such as unstructured data, which contain more disruptive and precise signals for decision-making, thereby making predictions and derivative valuations more robust. This discussion highlights how investment decision-making and financial ecosystem activities are set to be transformed by the power of technical automation, data, and artificial intelligence. A noted trend in the financial investment sector is that financial valuations are highly predictive yet highly non-linear over long-term horizons. To understand these robust, evolving signals and execute profitable strategies upon them, the investment management process needs to be very dynamic, open, smart, and technically deep. However, with current manual processes, reaching high-end asset prediction still seems like a shot in the dark. In parallel, open and democratically developed financial ecosystems seek relatively low-risk premium opportunities in high-finance valuation and perception. The process of evolving financial ecosystems, or the use of automated tools and data to move to unique frontiers, could make high-yield profit opportunities substantially safer. Financial economic theories and realistic approximation models support this.
Review Article
Open Access December 27, 2021

Innovative Financial Technologies: Strengthening Compliance, Secure Transactions, and Intelligent Advisory Systems Through AI-Driven Automation and Scalable Data Architectures

Through a digitally connected ecosystem, the innovative realm of fintech significantly enhances human capabilities across various dimensions. AI-based fintech solutions are increasingly proving invaluable by providing effective enforcement of regulations that ensures compliance and protects the stakeholders involved. Numerous expert investigations conducted in the arena of high-technology litigation have reinforced both the pressing need for and the immense value of enforced compliance in today's fast-paced digital landscape. Open banking APIs have boldly pioneered this critical regulatory enforcement role, allowing broader access and improved services for consumers. Predictive AI, facilitated through sophisticated validation systems, represents a fundamental evolution of the rule-based legal formulations that govern many aspects of financial transactions. These advanced products have been deployed within global legislative codes, allowing for standardized practices; consequently, market sectors have quickly adopted them to remain competitive and compliant. It has become clear that awareness of these groundbreaking innovations must be converted into a steadfast commitment to continue launching natural language processing products that refine consumer interaction. The increasing dependency of the financial expert community on these capabilities underscores the paramount importance they now hold for clients and end users alike, shaping the future of finance in profound ways [1].
Review Article
Open Access December 27, 2019

Transforming the Retail Landscape: Srinivas’s Vision for Integrating Advanced Technologies in Supply Chain Efficiency and Customer Experience

Technological advances have had a transformative impact on the retail landscape. Challenges arise in guaranteeing that technological changes lead to, rather than detract from, increased efficiency and positive experiences. First, integrating technology into the supply chain in an aggressive way is costly. It requires vast changes to existing systems and the development of cross-industry communication protocols. Secondly, the public is often quick to reject technological changes or slow to adopt them. Finally, ensuring that technological advancements do not benefit only the top few retailers, and are accessible to retailers of any size, poses a challenge, as has been seen in the fate of only a handful of radical changes in retail technology. On the other hand, an integral aspect of technology, particularly that used for big data collection and processing, is that it can account for these and other variables. It can predict the success of ventures into modernizing or developing new systems and can identify more effective and efficient ways to do so. Of course, concerns about job loss or technological monopoly still loom. But, it would seem, the continued advancement of technology in the retail landscape is inevitable.
Review Article
Open Access December 27, 2020

Optimizing Unclaimed Property Management through Cloud-Enabled AI and Integrated IT Infrastructures

With unclaimed property assets reaching record levels, businesses have become, in some cases, overwhelmed and hamstrung by stagnant, unoptimized processes. That burden is compounded by ever-evolving regulatory changes, resulting in organizations struggling to hit compliance deadlines while delivering an optimal claimant experience. Often, early systems had periods of short-term success but are now on the verge of obsolescence, resulting in stressed workflows and cumbersome integrations. Deploying an integrated IT infrastructure, supported by cloud-enabled AI, represents the quickest path to modernizing unclaimed property management. A fully integrated IT infrastructure is crucial to optimize the management of unclaimed property [1]. When standalone solutions exist across an organization, companies miss out on the automation opportunities generated through the interconnectedness of systems and data. AI presents organizations with the opportunity to bridge these gaps, enabling a vast library of applications to improve the strained workflows of unclaimed property teams. Automated data extraction, document comparison, fraudulent claim detection, and workflow completion analysis are just a few popular applications well suited to the unclaimed property space. In addition to the lagging technology currently deployed by many organizations, the unclaimed property landscape itself is evolving. Compliance issuance, asset availability, rates, the ability to detect fraudulently posted claims, and the claimant experience have all become hot-button items that are now front of mind for regulatory agencies and businesses alike. Issuing due diligence letters in a compliant manner, accommodating claimant inquiries regarding held assets, and managing, processing, and understanding the operational impact of rate changes are vexing problems many organizations now find themselves playing catch-up to address.
The opportunity posed by cloud-enabled AI is furthered by economic, regulatory, and reporting-cycle pressures on unclaimed property teams to do more with the same or fewer resources. It is no longer simply a case of hitting the audit deadline and checking off a box, but an emerging priority for businesses on all sides of the market, from Fortune 500 to mid-market firms. In-house shared service teams are comfortable with monitoring and curating business data; however, unclaimed property is unknown territory with a learning curve, compliance gaps, and operational holes that, if ignored, stand to scale up exponentially. The combined fallout from regulatory changes and the recent pandemic has only made the situation riskier, with increased volatility in balancing time-sensitive tasks against stringent regulatory deadlines and growing claimant outreach.
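Of the AI applications listed above, fraudulent or duplicate claim detection lends itself to a compact sketch: claims whose claimant names are suspiciously similar can be flagged for human review using fuzzy string matching. The field choice (claimant name only) and the similarity threshold are illustrative assumptions; a real system would compare many fields and likely use a trained model rather than a simple ratio.

```python
# Sketch of duplicate-claim flagging: pairs of claims with highly
# similar claimant names are surfaced for review.
from difflib import SequenceMatcher

def find_duplicate_claims(claims, threshold=0.85):
    """Return index pairs whose claimant names exceed the similarity
    threshold (0..1, from difflib's SequenceMatcher)."""
    pairs = []
    for i in range(len(claims)):
        for j in range(i + 1, len(claims)):
            ratio = SequenceMatcher(None, claims[i].lower(),
                                    claims[j].lower()).ratio()
            if ratio >= threshold:
                pairs.append((i, j))
    return pairs

claims = ["Jane A. Smith", "JANE A SMITH", "Robert Chen"]
flags = find_duplicate_claims(claims)
```

The O(n²) pairwise comparison is fine for a review queue but would need blocking or indexing at the volumes described in the abstract.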
Review Article
Open Access December 29, 2020

Enhancing Government Fiscal Impact Analysis with Integrated Big Data and Cloud-Based Analytics Platforms

While several application domains are exploiting the added value of analytics over various datasets to obtain actionable insights and drive decision making, the public policy management domain has not yet taken advantage of the full potential of such analytics and data models. To this end, in this paper the authors present the overall architecture of a cloud-based environment that facilitates data retrieval and analytics, as well as policy modelling, creation, and optimization. The environment enables data collection from heterogeneous sources, linking and aggregation, complemented with data cleaning and interoperability techniques. An innovative approach for analytics as a service is introduced and linked with a policy development toolkit, an integrated web-based environment that fulfils the requirements of public policy ecosystem stakeholders [1]. Large information databases on various public issues exist, but their usage for public policy formulation and impact analysis has been limited so far, as no cloud-based service ecosystem exists to facilitate their efficient exploitation. With the increasing availability and importance of both public big data and traditional data, the need to extract, link, and utilize such information efficiently has arisen. Current data-driven web technologies and models are not aligned with the needs of this domain; therefore, potential candidates for big data, cloud-based, and service-oriented public policy analysis solutions should be investigated, piloted, and demonstrated [2]. This paper presents the conceptual architecture of such an ecosystem based on the capabilities of state-of-the-art cloud and web technologies, as well as the requirements of its users.
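The collection-linking-cleaning step described above can be sketched as a normalization layer that maps records from heterogeneous sources onto one schema before aggregation. The source names (`census`, `survey`) and field names are invented for illustration; the point is only the pattern of per-source mapping followed by shared aggregation.

```python
# Sketch of heterogeneous-source ingestion: source-specific records are
# normalized to a common schema, then aggregated per region.

def normalize(record, source):
    """Map source-specific field names onto a common schema."""
    if source == "census":
        return {"region": record["REGION"].strip().lower(),
                "population": int(record["POP"])}
    if source == "survey":
        return {"region": record["area"].strip().lower(),
                "population": int(record["residents"])}
    raise ValueError(f"unknown source: {source}")

def aggregate(records):
    """Sum population per region after cleaning."""
    totals = {}
    for rec in records:
        totals[rec["region"]] = totals.get(rec["region"], 0) + rec["population"]
    return totals

cleaned = [normalize({"REGION": " North ", "POP": "1200"}, "census"),
           normalize({"area": "north", "residents": "300"}, "survey")]
totals = aggregate(cleaned)
```

In the envisioned analytics-as-a-service environment, each source adapter would be registered once and reused across policy models, which is what makes the linked datasets exploitable at scale.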
Review Article
Open Access November 24, 2022

Bridging Traditional ETL Pipelines with AI Enhanced Data Workflows: Foundations of Intelligent Automation in Data Engineering

Machine Learning (ML) and Artificial Intelligence (AI) are having an increasingly transformative impact on all industries and are already used in many mission-critical production use cases, bringing considerable value. Data engineering, which combines ETL pipelines with other workflows managing data and machine learning operations, is also significantly impacted. The Intelligent Data Engineering and Automation framework offers the groundwork for intelligent automation processes. However, ML/AI are not the only disruptive forces; new Big Data technologies inspired by Web 2.0 companies are also reshaping the Internet. Companies with the largest Big Data footprints not only provide applications with a Big Data operational model but also source their competitive advantage from data in the form of AI services and, consequently, shift the cost/performance equilibrium of ETL pipelines. All of these technologies and forces help explain why the traditional ETL pipeline design should adapt to current and emerging technologies and may be enhanced through artificial intelligence.
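One concrete way an ETL stage can be "AI-enhanced" in the sense argued above is an anomaly gate between transform and load: rows that look out-of-distribution are quarantined for review instead of being loaded. The sketch below uses a plain z-score rule as a stand-in for a learned model, and the field name and cutoff are illustrative assumptions.

```python
# Sketch of an anomaly gate in an ETL transform stage: rows whose value
# is far from the batch mean (in standard deviations) are quarantined.
import statistics

def transform_with_anomaly_gate(rows, field, z_cutoff):
    """Split rows into (clean, quarantined) using a z-score on `field`.
    Note: with small batches the population z-score is bounded, so the
    cutoff must be chosen accordingly."""
    values = [row[field] for row in rows]
    mean, stdev = statistics.mean(values), statistics.pstdev(values)
    clean, quarantined = [], []
    for row in rows:
        z = 0.0 if stdev == 0 else abs(row[field] - mean) / stdev
        (quarantined if z > z_cutoff else clean).append(row)
    return clean, quarantined

rows = [{"amount": 100}, {"amount": 102}, {"amount": 98}, {"amount": 5000}]
clean, quarantined = transform_with_anomaly_gate(rows, "amount", z_cutoff=1.5)
```

Replacing the z-score with a trained outlier model (and the batch statistics with rolling ones) is where the ML/AI enhancement of the traditional pipeline actually enters.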
Article
Open Access December 24, 2022

Cloud Native ETL Pipelines for Real Time Claims Processing in Large Scale Insurers

Cloud native ETL pipelines support the extract and transform phases of real time claims processing in large scale insurers. The cloud native approach offers dramatic improvements in scalability, reliability, resiliency and agility as well as seamless integration with the diverse set of data sources, destinations and technologies characteristic of large scale insurers. The ETL process extracts data from source systems such as core transaction, fraud, customer and accounting processes, transforms the data to create a usable format for analytics and other applications, and loads the resulting tables into business intelligence or data lake systems for subsequent storage and analysis. By addressing these two phases of the overall ETL process, cloud native ETL pipelines can provide timely, reliable and consistent data to data scientists, actuaries, underwriters and other analysts. Real time processing represents a key priority within the overall claims process: faster, more accurate claim approvals reduce insurer costs, improve customer service and enhance premium pricing. As a result, a variety of claims related use cases are moving from batch to real time.
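The extract and transform phases described above can be sketched for a single claims event: extract pulls the needed fields out of a source event, and transform normalizes types and derives fields for the analytics store. The source field names (`id`, `amt`, `ts`) and the target schema are invented for illustration; a real insurer's core-system schema would differ.

```python
# Minimal sketch of the extract and transform phases for one claims
# record flowing from a source system toward a data lake.
from datetime import date

def extract(raw_event):
    """Pull the fields downstream analytics needs from a source event."""
    return {"claim_id": raw_event["id"],
            "amount_cents": raw_event["amt"],
            "filed": raw_event["ts"][:10]}          # keep the date part

def transform(record):
    """Normalize types and derive fields for the analytics store."""
    filed = date.fromisoformat(record["filed"])
    return {"claim_id": record["claim_id"],
            "amount_usd": record["amount_cents"] / 100,
            "filed_year": filed.year}

row = transform(extract({"id": "C-42", "amt": 125000,
                         "ts": "2022-11-03T14:21:09Z"}))
```

In a cloud native deployment these two functions would run as stateless stages behind a streaming broker, which is what lets the same logic move from batch to the real-time processing the abstract prioritizes.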
Review Article
Open Access December 27, 2021

Digital Transformation in Insurance: Migrating Enterprise Policy Systems to .NET Core

Migrating enterprise policy systems to .NET Core is a key objective of digital transformation in the Insurance IT ecosystem. This change directly addresses strategic drivers: enabling adoption of cloud-first development, responding to market pressure for more flexible and usable enterprise solutions, and preparing for changing demands from regulation and compliance. Phases of operational benefit aligned with risk mitigation form the basis of the migration roadmap, with a strong focus on engaging all relevant stakeholders. Market pressure for a seamless user experience across all applications is a fundamental driver for investment in digital transformation. Gaps remain in enterprise operations, where legislative and regulatory accountability demands rigid and complex solutions that Liberty has not yet been able to provide. New risk-based capital requirements, data-sovereignty controls, controls for sensitive data in the cloud, and new audit requirements create a long list of challenges for the ecosystem that can no longer be deferred. At the same time, cross-organisational integration is becoming more important, and integrating partners from the insurance supply chain requires a much more flexible approach to development and deployment. These factors combine to generate a credible case for accelerated digital investment with a focus on migration to cloud platforms, with related risk mitigation, quality improvements, and flexibility benefits that close industry gaps.
Review Article
Open Access December 21, 2021

Optimizing Data Warehousing for Large Scale Policy Management Using Advanced ETL Frameworks

Abstract
Data warehousing is a technique for collecting, managing, and presenting data to help people analyze and use that data effectively. It involves a large database designed to support management-level staff by providing all the relevant historical data for analysis. This chapter begins with a definition of data warehousing, followed by an overview of large-scale policy management to highlight the need for data warehousing. Next, an overview of an ETL framework is presented, along with a discussion of advanced ETL techniques. The chapter concludes with an outline of performance optimization techniques for data warehousing. Data warehousing is considered a key enabler for efficient reporting and analysis, with implementation choices ranging from cost-effective desktop systems to large-scale, mission-critical data marts and warehouses containing petabytes of data. Extract, transform, and load (ETL) systems remain one of the largest cost and effort areas within data warehouse development projects, requiring significant planning and resources to build, manage, and monitor the flow of data from source systems into the data warehouse. The technology and techniques used for ETL can greatly influence the success or failure of a data warehouse. Complex business requirements for data cleansing, loading, transformation, and integration have intensified, while operational plans for real-time and near-real-time reporting add additional challenges. Parallel loading mechanisms, incremental data loading, and runtime update and insert strategies not only improve ETL performance but also optimize data warehousing performance, particularly for large-scale policy management.
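The incremental-loading and upsert strategies mentioned above can be sketched in a few lines. The following is a minimal, illustrative example using SQLite: the table names (warehouse_policies, etl_watermark) and the timestamp-based watermark column are assumptions made for the sketch, not details from the article, and a production pipeline would use a dedicated ETL engine rather than ad hoc SQL.

```python
import sqlite3

# Minimal sketch of incremental loading with an upsert strategy.
# Table names and the watermark column are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE warehouse_policies (policy_id INTEGER PRIMARY KEY, status TEXT, updated_at TEXT)")
conn.execute("CREATE TABLE etl_watermark (last_loaded TEXT)")
conn.execute("INSERT INTO etl_watermark VALUES ('2021-01-01')")

def incremental_load(source_rows):
    """Upsert only source rows newer than the stored watermark."""
    (watermark,) = conn.execute("SELECT last_loaded FROM etl_watermark").fetchone()
    new_rows = [r for r in source_rows if r[2] > watermark]
    conn.executemany(
        "INSERT INTO warehouse_policies VALUES (?, ?, ?) "
        "ON CONFLICT(policy_id) DO UPDATE SET status=excluded.status, updated_at=excluded.updated_at",
        new_rows,
    )
    if new_rows:
        # Advance the watermark so the next run skips already-loaded rows.
        conn.execute("UPDATE etl_watermark SET last_loaded = ?",
                     (max(r[2] for r in new_rows),))
    conn.commit()
    return len(new_rows)

# First batch inserts two policies; the second updates one and skips one unchanged row.
incremental_load([(1, "active", "2021-06-01"), (2, "lapsed", "2021-06-02")])
incremental_load([(1, "lapsed", "2021-07-01"), (2, "lapsed", "2021-06-02")])
```

The same watermark-plus-upsert pattern underlies parallel loaders as well: each worker can process a disjoint key range against the same conflict-resolution rule.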
Article
Open Access December 18, 2021

A Comparative Study of Traditional Reporting Systems versus Real-Time Analytics Dashboards in Enterprise Operations

Abstract
Seamless integration of information in organizations improves not only operational efficiency but also the quality of decisions made by managers. Real-time decision support systems enable organizations to evaluate organizational changes immediately and, ideally, to signal problems before they surface. Such real-time systems are now regarded as front-line solutions for managing organizations effectively. Yet the technological possibilities have not conquered management practice. In most companies, data is still handled with traditional solutions: it is collected and turned into reports that evaluate past occurrences and reveal only what has already happened in the organization. The problem with these non-real-time systems is that they reflect the organization's condition far too late; they are rear-view-mirror descriptions of what has already been. Managers receive information from their organizations too late, and often too little of it, to make optimal decisions. Is it not possible to manage operations in real time? Is real-time decision support really needed? And if so, why do most organizations still rely on traditional reporting systems?
Review Article
Open Access December 20, 2024

AI for Time Series and Anomaly Detection

Abstract
Time series data are increasingly prevalent across domains such as finance, healthcare, manufacturing, and IoT, making accurate forecasting and anomaly detection critical for decision-making and system reliability. Traditional statistical methods (e.g., ARIMA, Holt-Winters) often fail to capture the complex temporal dependencies and high-dimensional interactions inherent in modern time series. Recent advances in artificial intelligence, particularly deep learning architectures such as recurrent neural networks (RNNs), convolutional neural networks (CNNs), temporal convolutional networks (TCNs), graph neural networks (GNNs), and Transformers, have demonstrated marked improvements in modeling both univariate and multivariate series, as well as in detecting anomalies that deviate from learned norms (Darban, Webb, Pan, Aggarwal, & Salehi, 2022; Chiranjeevi, Ramya, Balaji, Shashank, & Reddy, 2024) [1,2]. Moreover, ensemble techniques and hybrid signal-processing + deep-learning pipelines show enhanced sensitivity and adaptability in real-world anomaly detection scenarios (Iqbal, Amin, Alsubaei, & Alzahrani, 2024) [3]. In this work, we provide a unified survey and comparative analysis of AI-driven time series forecasting and anomaly detection methods, highlight key industrial application domains, evaluate performance trade-offs (e.g., accuracy vs. latency, supervised vs. unsupervised learning), and discuss emerging challenges including interpretability, data drift, real-time deployment on edge devices, and integration of causal reasoning. Our findings suggest that while AI approaches significantly outperform classical techniques in many settings, careful consideration of data characteristics, evaluation metrics, and deployment environment remains essential for effective adoption.
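The core idea of "detecting anomalies that deviate from learned norms" can be sketched in a model-agnostic way: forecast each point from its recent history, then flag points whose residual is large relative to local variability. In the minimal sketch below, a rolling-mean forecaster stands in for a learned model (an RNN, TCN, or Transformer would be substituted in practice); the window size and z-score threshold are illustrative assumptions, not values from the survey.

```python
import statistics

def detect_anomalies(series, window=5, z_thresh=3.0):
    """Flag indices whose residual from a rolling-window forecast
    exceeds z_thresh standard deviations of the recent history.
    The rolling mean is a stand-in for a learned forecaster."""
    anomalies = []
    for t in range(window, len(series)):
        history = series[t - window:t]
        forecast = statistics.fmean(history)          # learned model goes here
        sigma = statistics.pstdev(history) or 1e-9    # guard against zero variance
        if abs(series[t] - forecast) / sigma > z_thresh:
            anomalies.append(t)
    return anomalies

# A flat signal with one injected spike at index 6.
signal = [10.0, 10.2, 9.9, 10.1, 10.0, 10.1, 25.0, 10.0, 9.8]
print(detect_anomalies(signal))  # the spike at index 6 is flagged
```

Swapping the forecaster changes the quality of the residuals, not the detection logic, which is why the accuracy-versus-latency trade-off discussed above largely reduces to the cost of the forecasting model.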
Article
Open Access December 18, 2023

Leveraging AI, ML, and Generative Neural Models to Bridge Gaps in Genetic Therapy Access and Real-Time Resource Allocation

Abstract
This paper draws on gene and cell therapy research across disorders ranging from monogenic and infectious diseases to cancer and emerging breakthroughs, in which individual genes, or a synthetic gene sequence designed around a molecular pattern shared by infected cells, can be harnessed to better fight various disorders [1]. A pivotal task is to predict the performance of candidate gene therapies to guide clinical translational research, using methods such as retrospective bioinformatic analyses. Applying these methods to a large-scale gene therapy database shows that it is feasible to construct well-performing, interpretable, supervised learning models [2]. Preliminary evidence of the statistical significance of machine learning approaches helps clinicians and biomedical researchers, market participants, and regulatory and economic experts derive relevant, practical applications, thereby enhancing the deployment of gene therapy and genomics to achieve positive long-term outcomes for humanity while alleviating the ongoing worldwide economic burden of prolonged and recurring diseases. Deploying machine learning techniques to accelerate gene and cell therapy drug development and trials should also mitigate the existing obstacle of limited patient access to emerging, transformative medical innovations such as gene therapy, whose skyrocketing prices often make gene therapy products the world's most expensive medicines [3]. Moreover, a multidimensional access gap, spanning the availability, affordability, and quality or acceptability of these clinical treatments, commonly prevents patients from accessing effective, life-saving genetic medicines. This substantial gap has been repeatedly documented and stems mainly from differential institutional and socio-political choices around resource allocation at international and domestic levels [4].
It is also due in part to stringent licensure and regulatory approval processes underpinned by insufficient evidence on the novel safety and clinical efficacy profiles of genetic therapies across multiple micro-local diagnoses and subpopulations. We believe that gene therapy adoption becomes more likely when the clinical evidence path adequately represents the most diverse and relevant patient populations [5].