Open Access June 25, 2025

Performance and Validity of Knee Function Assessment Tools After Total Knee Arthroplasty: A Systematic Review

Objective: To identify and evaluate the main functional assessment tools applied in the postoperative monitoring of patients undergoing total knee arthroplasty (TKA), and to synthesize the functional outcomes reported through these instruments in the current scientific literature. Methodology: A structured review was conducted following PRISMA 2020 guidelines. Thirty-one peer-reviewed studies were selected through a targeted manual search based on predefined eligibility criteria. Included studies evaluated functional recovery following TKA using validated outcome measures such as the WOMAC, KSS, KOOS, IKDC, SF-36, and SANE. Data extraction focused on the instruments used, patient population characteristics, and reported outcomes. A descriptive synthesis was compiled in Table 1. Additionally, 15 studies with quantitative data were analyzed using a forest plot to illustrate risk ratios (RR) and 95% confidence intervals (CI) for functional improvement. Risk of bias was assessed qualitatively based on methodological rigor, clarity of reporting, and validation of the outcome tools. Results: All included studies reported improvements in functional status following TKA. Most risk ratios ranged from 0.66 to 0.85, indicating a consistent reduction in the risk of postoperative functional limitation. High-quality studies demonstrated more precise effect estimates and greater internal validity. The SANE scale emerged as a valid and practical tool with high responsiveness, including in its culturally adapted Brazilian version. Despite heterogeneity in study design, the direction of effect remained consistent across all included studies. Conclusion: Validated functional assessment tools are essential for monitoring recovery after total knee arthroplasty. Instruments such as WOMAC and SANE demonstrate strong clinical utility and psychometric validity. 
Their systematic use enhances outcome comparability, supports individualized rehabilitation planning, and improves decision-making in orthopedic care.
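As a reader's aid, a risk ratio and its 95% confidence interval of the kind displayed in the review's forest plot can be computed from raw event counts. The sketch below uses hypothetical counts, not data from any included study.

```python
import math

def risk_ratio_ci(events_tx, n_tx, events_ctrl, n_ctrl, z=1.96):
    """Risk ratio with a Wald 95% CI built on the log scale (Katz method)."""
    rr = (events_tx / n_tx) / (events_ctrl / n_ctrl)
    # Standard error of log(RR) from the four cell counts
    se = math.sqrt(1 / events_tx - 1 / n_tx + 1 / events_ctrl - 1 / n_ctrl)
    return (rr,
            math.exp(math.log(rr) - z * se),
            math.exp(math.log(rr) + z * se))

# Hypothetical counts: 20/100 patients with functional limitation after TKA
# versus 30/100 in a comparison arm (illustrative only).
rr, lo, hi = risk_ratio_ci(20, 100, 30, 100)
```

An RR below 1 with a CI excluding 1 would indicate a statistically significant reduction in the risk of functional limitation, as in the 0.66 to 0.85 range reported above.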
Systematic Review
Open Access March 03, 2025

Effectiveness and Safety of Acupuncture Combined with Bloodletting Cupping Therapy in the Treatment of Scapulohumeral Periarthritis: A Systematic Review and Meta-Analysis

Background: Scapulohumeral periarthritis commonly afflicts individuals in middle age. Its etiology is multifaceted, and treatment presents a challenge with a high risk of recurrence. The main symptoms, shoulder pain and limited joint mobility, seriously affect patients' quality of life. Recent research indicates that acupuncture combined with bloodletting cupping can significantly improve shoulder joint mobility and pain in individuals with scapulohumeral periarthritis. However, these studies have typically been limited in scope, so additional research is needed to substantiate the efficacy and safety of this intervention. Methods: To evaluate the efficacy of acupuncture combined with bloodletting cupping for treating patients with scapulohumeral periarthritis, we conducted an online search of databases in both Chinese and English, including PubMed, the Cochrane Library, Embase, Web of Science, CNKI, Wanfang Data, the China Science and Technology Journal Database (VIP), and the Chinese BioMedical Literature Database (CBM), to collect randomized controlled trials (RCTs) on the use of acupuncture combined with bloodletting cupping in scapulohumeral periarthritis patients. We also examined the references within the identified literature. The search used subject headings and free-text terms in both languages, without restriction, for records up to April 3, 2024. Two researchers independently screened the literature, extracted data, and evaluated study quality. RevMan 5.3 software was used for meta-analysis of the included studies. The protocol of this review was registered with PROSPERO (registration number CRD42023454614). Results: This review incorporated 22 RCTs involving a total of 1,774 patients.
The meta-analysis showed that the clinical effective rate of treating scapulohumeral periarthritis with acupuncture combined with bloodletting cupping was higher in the experimental group than in the control group (RR = 1.25, 95% CI [1.20, 1.30], P < 0.00001). The Visual Analogue Scale (VAS) score (MD = -1.70, 95% CI [-2.17, -1.22], P < 0.00001), Melle score (SMD = -2.45, 95% CI [-2.55, -2.34], P = 0.007), and recurrence rate (RR = 0.23, 95% CI [0.07, 0.77], P = 0.02) were all lower in the experimental group than in the control group, with statistical significance (P < 0.05). Conclusion: Acupuncture combined with bloodletting cupping for the treatment of scapulohumeral periarthritis demonstrates definite efficacy and safety, with superior clinical effectiveness, pain relief, improvement in shoulder joint mobility, and reduction in recurrence compared with acupuncture alone, and therefore merits wider clinical application.
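For illustration, pooled estimates like the clinical effective rate RR above are typically obtained by inverse-variance weighting of per-study log risk ratios. The sketch below is a minimal fixed-effect version with hypothetical per-study estimates, not the review's actual data (the review itself used RevMan 5.3).

```python
import math

def pooled_rr(studies, z=1.96):
    """Fixed-effect inverse-variance pooling of risk ratios.
    Each study is (rr, ci_low, ci_high); the SE is back-derived from the CI."""
    num = den = 0.0
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * z)  # CI half-width on log scale
        w = 1 / se ** 2                                # weight = inverse variance
        num += w * math.log(rr)
        den += w
    log_rr = num / den
    se_pooled = math.sqrt(1 / den)
    return (math.exp(log_rr),
            math.exp(log_rr - z * se_pooled),
            math.exp(log_rr + z * se_pooled))

# Hypothetical per-study (RR, 95% CI) triples, for illustration only
pooled = pooled_rr([(1.30, 1.10, 1.54), (1.22, 1.05, 1.42), (1.25, 1.12, 1.39)])
```

The pooled estimate always lands between the smallest and largest study estimates, with a narrower confidence interval than any single study.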
Meta-Analysis
Open Access January 11, 2025

Exploring LiDAR Applications for Urban Feature Detection: Leveraging AI for Enhanced Feature Extraction from LiDAR Data

The integration of LiDAR and Artificial Intelligence (AI) has revolutionized feature detection in urban environments. LiDAR systems, which utilize pulsed laser emissions and reflection measurements, produce detailed 3D maps of urban landscapes. When combined with AI, this data enables accurate identification of urban features such as buildings, green spaces, and infrastructure. This synergy is crucial for enhancing urban development, environmental monitoring, and advancing smart city governance. LiDAR, known for its high-resolution 3D data capture capabilities, paired with AI, particularly deep learning algorithms, facilitates advanced analysis and interpretation of urban areas. This combination supports precise mapping, real-time monitoring, and predictive modeling of urban growth and infrastructure. For instance, AI can process LiDAR data to identify patterns and anomalies, aiding in traffic management, environmental oversight, and infrastructure maintenance. These advancements not only improve urban living conditions but also contribute to sustainable development by optimizing resource use and reducing environmental impacts. Furthermore, AI-enhanced LiDAR is pivotal in advancing autonomous navigation and sophisticated spatial analysis, marking a significant step forward in urban management and evaluation. The reviewed paper highlights the geometric properties of LiDAR data, derived from spatial point positioning, and underscores the effectiveness of machine learning algorithms in object extraction from point clouds. The study also covers concepts related to LiDAR imaging, feature selection methods, and the identification of outliers in LiDAR point clouds. Findings demonstrate that AI algorithms, especially deep learning models, excel in analyzing high-resolution 3D LiDAR data for accurate urban feature identification and classification. 
These models leverage extensive datasets to detect patterns and anomalies, improving the detection of buildings, roads, vegetation, and other elements. Automating feature extraction with AI minimizes the need for manual analysis, thereby enhancing urban planning and management efficiency. Additionally, AI methods continually improve with more data, leading to increasingly precise feature detection. The results indicate that the pulse emitted by continuous wave LiDAR sensors changes when encountering obstacles, causing discrepancies in measured physical parameters.
Article
Open Access March 05, 2024

Phenolic compounds and antioxidant properties of roasted maize-peanut product (Zowey) and its potential to alleviate oxidative stress

Background: The study of phenolic compounds and their potential contribution to health is a major research interest. This work aimed to determine the phenolic compound contents and antioxidant properties of a roasted maize-peanut snack product with and without spices. Methods: HPLC was used to determine the phenolic composition of the maize flours, peanut flour, and their composite snacks with and without spices. Total phenolic content (TPC), total flavonoid content (TFC), tannin content (TC), and radical scavenging activity (measured by the 2,2-diphenyl-1-picrylhydrazyl (DPPH), 2,2'-azino-bis(3-ethylbenzothiazoline-6-sulphonic acid) (ABTS), and hydrogen peroxide radical scavenging assays) were also determined. Results: The TPC of the extracts of roasted maize flour, roasted peanut flour, and composite roasted maize-peanut flour ranged from 48.93 to 178.31 mg GAE/100 g, while the TFC was 3.18–25.94 mg CE/100 g and the TC was 0.22–0.73 mg CE/g. The dominant phenolic acid was protocatechuic acid, which ranged from 13.73 to 1643.54 µg/g. Among the flavonoids, quercetin and catechin were dominant. The extracts of the free soluble fraction exhibited 23.88–81.52%, 49.59–85.17%, and 0.58–5.13 µmol AAE/g of DPPH, hydrogen peroxide, and ABTS radical scavenging abilities, respectively. Conclusion: The maize–peanut product showed potential to help alleviate radical-induced oxidative stress.
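The DPPH and hydrogen peroxide scavenging percentages reported above follow the standard percent-inhibition calculation, which compares sample absorbance against a radical-only control. A minimal sketch, with hypothetical absorbance readings rather than values from the study:

```python
def percent_scavenging(abs_control, abs_sample):
    """Radical scavenging activity as percent inhibition:
    100 * (A_control - A_sample) / A_control."""
    return 100.0 * (abs_control - abs_sample) / abs_control

# Hypothetical DPPH absorbance readings at 517 nm (illustrative only):
# a lower sample absorbance means more radical was quenched by the extract.
inhibition = percent_scavenging(0.80, 0.20)   # roughly 75% scavenging
```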
Article
Open Access November 10, 2023

Bioremediation of Heavy Metals in Crude Oil-Contaminated Ultisol, Using a Nutrient Formulation Produced from Jatropha tanjorensis Leaf Extract

This work evaluated the bioremediation potential of Jatropha tanjorensis leaf extract at different masses (250 g, 500 g, and 750 g) over a 40-day period. To achieve this, crude oil contamination of sandy loam soil was simulated in twelve plastic reactors, each containing a fixed mass (4 kg) of topsoil homogenized with 500 g of Bonny light crude oil. The Jatropha tanjorensis leaves were cultivated, rinsed with distilled water, blended, and purified by filtration. The leaf extract was applied at the stated masses, alongside a control reactor (without leaf extract). The plastic reactors were kept in the open air, shielded from rainfall. The physicochemical characteristics determined (particle size distribution (PSD), pH, electrical conductivity (EC), organic matter (OM), organic carbon (OC), and selected heavy metals (Cr, Cd, Zn, Pb)) and the sample management all followed standard procedures. After 40 days of treatment, the reactor with 750 g of leaf extract produced the highest cadmium reduction, 97%, and there was a significant difference among treatments (P < 0.05). The sequence of reduction among treatments was 750 g > 500 g > 250 g of leaf extract. Chromium, lead, and zinc followed a similar trend. Thus, Jatropha tanjorensis leaf extract has the potential to ameliorate crude oil-contaminated soil.
Article
Open Access November 03, 2023

Mathematical Modeling of the Price Volatility of Maize and Sorghum between 1960 and 2022

The prices of grains such as maize and sorghum are subject to significant fluctuations, which can have a substantial impact on a country's economy and food security. The aim of this study is to model sorghum and maize price volatility in Nigeria. The data were extracted from the World Bank Commodity Price Data (WBCPD), 2022, and consist of monthly prices in nominal US dollars for maize and sorghum from January 1960 to August 2022. The Autoregressive Conditional Heteroskedasticity (ARCH) and Generalized Autoregressive Conditional Heteroskedasticity (GARCH) models were used to capture the price volatility of the two grains. Two types of conditional heteroscedastic models exist: the first group uses exact functions to govern the evolution of the conditional variance σ_t^2, while the second group describes σ_t^2 with stochastic equations. The results indicate that inherent uncertainties and fluctuations exist in the prices of maize and sorghum in Nigeria; the price volatility is positive and statistically significant, suggesting that historical information and past shocks play a crucial role in determining the volatility observed in these grains. It is recommended that the ARCH, GARCH, EGARCH, TGARCH, PARCH, CGARCH, and IGARCH models be employed for modeling and managing the volatility of maize and sorghum prices in Nigeria. These models have shown effectiveness in capturing different aspects of volatility, including the impact of past shocks, conditional volatility, asymmetry, and other relevant factors.
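A GARCH(1,1) model is of the first kind described above: the conditional variance is an exact function of the previous shock and the previous variance. A minimal sketch with illustrative parameters (not values fitted to the maize or sorghum series):

```python
import math
import random

def garch11_variance(returns, omega, alpha, beta):
    """Conditional variance path of a GARCH(1,1) model:
    sigma2[t] = omega + alpha * r[t-1]**2 + beta * sigma2[t-1]."""
    sigma2 = [omega / (1 - alpha - beta)]   # start at the unconditional variance
    for r in returns[:-1]:
        sigma2.append(omega + alpha * r * r + beta * sigma2[-1])
    return sigma2

# Simulate 500 returns from the model itself (hypothetical parameters).
random.seed(0)
omega, alpha, beta = 0.1, 0.1, 0.8
rets, s2 = [], omega / (1 - alpha - beta)
for _ in range(500):
    rets.append(random.gauss(0, math.sqrt(s2)))
    s2 = omega + alpha * rets[-1] ** 2 + beta * s2
path = garch11_variance(rets, omega, alpha, beta)
```

The alpha term is the ARCH effect (past shocks) and the beta term is the GARCH effect (persistence of conditional volatility); alpha + beta < 1 keeps the unconditional variance finite.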
Article
Open Access November 01, 2023

Individual Wave Component Signal Modeling, Parameters Extraction, and Analysis

The accurate estimation of Individual Wave Components (IWC) is crucial for automated diagnosis of the human digestive system in a clinical setting. However, this process can be challenging due to signal contamination by other signal sources in the body, such as the lungs and heart, as well as environmental noise. To address this issue, various denoising techniques are commonly employed in bowel sound signal processing. While denoising is important, it increases computational complexity, making it challenging for portable devices. Therefore, signal processing algorithms often require a trade-off between fidelity and computational complexity. This study aims to evaluate a previously developed IWC parameter extraction algorithm and to reconstruct the IWC without denoising, using synthetic and clinical data. To that end, a reliable model for creating synthetic data is paramount, since rigorous testing of the algorithm is limited by the quality and quantity of available recorded data. To overcome this challenge, a mathematical model is proposed to generate synthetic bowel sound data that can be used to test new algorithms. The proposed algorithm's robust performance is evaluated using both synthetic and clinically recorded data. We perform time-frequency analysis of original and reconstructed bowel sound signals in various digestive system states and characterize the performance using Monte Carlo simulation when denoising is not applied. Overall, our study presents a promising algorithm for accurate IWC estimation that can be useful for predicting anomalies in the digestive system.
Article
Open Access October 07, 2023

A Systematic Review of Observational Studies Focusing on Impact of Telehealth Consultation in Osteoporosis Management during the Pandemic

Background: The COVID-19 pandemic disrupted routine osteoporosis care due to clinic closures and limited in-person consultations. Telehealth emerged as an alternative model enabling remote care delivery and monitoring. However, previous reviews on telehealth either did not include the pandemic period or were limited in scope. Evidence synthesized specifically for osteoporosis care during the pandemic is needed but lacking. Methods: We systematically searched PubMed, MEDLINE, EMBASE, PsycINFO, Web of Science, and CINAHL for studies on telehealth for osteoporosis published between January 2021 and March 2023. Five studies met the inclusion criteria of an osteoporosis population, a telehealth intervention, and the COVID-19 pandemic timeframe. Data were extracted on study characteristics, COVID-19 outcomes, osteoporosis status, telehealth purpose, patient satisfaction, and clinical outcomes. Results: The five studies showed telehealth was used for monitoring data, delivering test results, adjusting medications, and performing assessments. Osteoporosis prevalence among telehealth users ranged from 30% to 100%. High patient satisfaction was reported with telehealth versus in-person care. No major differences occurred in medication delays or fractures between telehealth and in-person groups. Conclusion: This review found that telehealth enabled effective osteoporosis care and monitoring during the pandemic, with high patient and provider satisfaction. However, more robust randomized controlled trials are needed to establish stronger evidence around telehealth's impact on clinical osteoporosis outcomes. Implications: Though promising, further high-quality studies will help clarify telehealth's role in improving osteoporosis care and outcomes. The findings inform guidelines on integrating telehealth into routine management, and evidence on user perspectives can optimize telehealth implementation policies.
Systematic Review
Open Access November 30, 2022

A Review of Application of LiDAR and Geospatial Modeling for Detection of Buildings Using Artificial Intelligence Approaches

Today, three-dimensional models of real-world features are important and widely used, and they have attracted the attention of researchers in various fields, including surveying and spatial information systems, as well as those interested in the three-dimensional reconstruction of buildings. Buildings are the key part of the information in a three-dimensional city model, so extracting and modeling buildings from remote sensing data is an important step in building a digital model of a city. LiDAR technology, with its ability to map in one-, two-, and three-dimensional modes, is a suitable solution for providing hyperspectral and comprehensive images of buildings in an urban environment. This review article comprehensively surveys the methods used to identify buildings, from past approaches to the present, and discusses appropriate solutions for the future.
Review Article
Open Access August 24, 2022

Epidemiological and Clinical Profile of Deaths due to COVID-19 among Hospitalized Patients in Sidama Region, Ethiopia

The novel coronavirus disease (COVID-19) pandemic, which started in China's Hubei province in 2019, has caused a significant loss of human lives globally. This study describes the epidemiologic and clinical profiles of COVID-19-related deaths among patients admitted to treatment centers in Sidama region, Ethiopia. A cross-sectional study of 186 in-hospital COVID-19-related deaths that occurred from July 2020 to December 2021 in Sidama region was conducted. Data were extracted from regional emergency operation center death reports, entered using EpiData v3.1, and analyzed using SPSS v20. Categorical data were summarized using frequencies and percentages, while continuous data were summarized using medians and interquartile ranges. Associations between variables were assessed using the chi-square test. More than two-thirds of the deceased patients were male (135; 72.6%), and the median age at death was 60 years. The majority of deaths (151; 81.1%) occurred in 2021, with April 2021 recording the highest number of deaths. Cough and shortness of breath were the main presenting symptoms, occurring in 89.2% and 85.5% of deceased patients, respectively. Most of the COVID-19-related deaths (64.5%) had associated comorbidities, with diabetes (50%) and hypertension (39.2%) the most prevalent. A significant proportion of patients (74.73%) presented on the severe end of the disease spectrum (critical/severe). Of the deceased patients, around two-thirds required intensive care unit (ICU) admission and 111 were put on a mechanical ventilator; the median ICU stay was 4 days. Around half of the deaths (48.4%) occurred in the first 5 days. The median survival time from symptom onset was 11.5 days, with most deaths (43.5%) occurring within the first 14 days of symptom onset. Age category was significantly associated with the number of days from onset to death (p=0.006). The case fatality rate was 1.87%, which is lower than national and global reports. Unlike previous studies, the prevalence of asthma among deceased patients was low, and there were no patients with documented COPD.
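The chi-square test of association used in the methods can be sketched for a 2x2 table in a few lines. The counts below are hypothetical, not the study's data; for one degree of freedom the p-value reduces to erfc(sqrt(x/2)).

```python
import math

def chi_square_2x2(table):
    """Pearson chi-square for a 2x2 table, with the p-value from the
    chi-square(1) survival function, which equals erfc(sqrt(x/2))."""
    (a, b), (c, d) = table
    n = a + b + c + d
    # Closed form of the Pearson statistic for a 2x2 table
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Hypothetical 2x2 table: age group (<60 vs >=60) by death within 14 days
# of symptom onset (yes vs no). Illustrative counts only.
stat, p = chi_square_2x2([[40, 50], [60, 36]])
```

A statistic above 3.84 (the chi-square(1) critical value at alpha = 0.05) corresponds to p < 0.05.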
Article
Open Access March 23, 2022

Green Synthesis and Characterization of Cobalt, Iron and Copper Nanoparticles Derived from the Stem-Bark Extract of Khaya senegalensis (Mahogany) and Its Antimicrobial Activity

During the past few decades, many synthetic chemicals have been used to produce nanoparticles and nanoclusters; although these chemicals primarily act as reducing and capping agents, they are very toxic and hazardous and make the nanoparticles biologically incompatible. Thus, there is a need for green chemistry, that is, a clean, non-toxic, and environmentally friendly method of nanoparticle synthesis. Cobalt, iron, and copper nanoparticles were synthesized using the stem-bark extract of Khaya senegalensis (mahogany), with cobalt chloride (CoCl2·6H2O), ferrous chloride (FeCl2), and copper sulphate (CuSO4·H2O) used as the metal precursors, respectively. A change in color from light brown to dark brown indicates the formation of cobalt nanoparticles, from light brown to dark green the formation of copper nanoparticles, and from light brown to a dark color the formation of iron nanoparticles. The nanoparticles were further characterized using UV-visible spectroscopy, FTIR, and SEM. The UV results showed the highest peak for CoNPs at 500 nm, while both FeNPs and CuNPs showed their highest peaks at 300 nm. The FTIR results for all the nanoparticles showed the presence of alkaloids and triterpenes. The SEM results showed spherical granular, partially dispersed, and monodispersed morphologies for CoNPs, FeNPs, and CuNPs, respectively. Moreover, the antibacterial activity of the synthesized NPs was evaluated against two Gram-positive and two Gram-negative bacteria, with good results. The antifungal activity, tested against two fungi, also showed very good results.
Article
Open Access September 04, 2025

Evidence-Based Protocols for the Prevention and Treatment of Prosthetic Joint Infection in Total Hip Arthroplasty: A Systematic Review

Objective: This systematic review aimed to identify, synthesize, and critically analyze the available evidence on clinical protocols used for the prevention and treatment of prosthetic joint infection (PJI) in total hip arthroplasty (THA), based on studies published between 2000 and 2025. Methods: The review was conducted according to PRISMA guidelines. Electronic searches were performed in PubMed (MEDLINE), Scopus, Web of Science, and Embase between January and April 2025. Eligible studies included clinical trials, cohort studies, case-control studies, systematic reviews, and meta-analyses published in English that addressed either preventive or therapeutic strategies for PJI in THA. Study selection, data extraction, and quality assessment were carried out independently by two reviewers. Due to the heterogeneity of the included studies, a qualitative synthesis was performed. Results: A total of 32 studies were included. Preventive measures identified in the literature comprised combined antibiotic prophylaxis (cefazolin and gentamicin), multimodal perioperative protocols such as ACERTO, nasal decolonization for Staphylococcus aureus, silver-impregnated dressings, and structured post-discharge surveillance. Treatment strategies included DAIR (Debridement, Antibiotics, and Implant Retention), the DAPRI technique, one-stage and two-stage revision surgeries, muscle flap reconstructions, and protocols without spacers. These interventions were associated with significantly reduced infection rates and improved clinical outcomes when applied appropriately and in accordance with patient-specific factors. Conclusion: Effective prevention and treatment of PJI in total hip arthroplasty require a systematic and evidence-based approach. Integrated protocols—spanning preoperative optimization, meticulous intraoperative techniques, and rigorous postoperative monitoring—have proven effective in reducing infection incidence. 
In cases of established infection, surgical management must be tailored to the timing of infection, microbial profile, and host conditions. Two-stage revision remains the gold standard for complex infections, while one-stage revision and emerging techniques like DAPRI offer promising results in selected cases. This review contributes to the standardization of clinical practice and supports improved patient outcomes.
Systematic Review
Open Access January 23, 2025

Brain-Wide Resting-State Functional Connectivity Partially Mediates Socioeconomic Disparities in Children's Cardiometabolic Health

Background: Although some neural mechanisms underlying socioeconomic status (SES) disparities are known, the role of brain-wide resting-state functional connectivity in these effects remains less understood. Aim: This study aims to identify brain-wide resting-state functional connectivity signatures that may mediate the effects of SES on body mass index (BMI) and blood pressure in children, using data from the Adolescent Brain Cognitive Development (ABCD) study. Methods: Data were drawn from the ABCD study, a large, diverse cohort of children aged 9-10. Pre-processed resting-state functional MRI data were used, and factor analysis was conducted to extract a whole-brain connectivity factor. The first factor, capturing the greatest variance in brain-wide resting-state connectivity, was selected for further analysis in a structural equation model (SEM). This connectivity factor was tested as a potential mediator of the relationship between SES (measured by parental education, family income, and neighborhood characteristics) and two indicators of cardiometabolic health: BMI and systolic blood pressure. Results: Factor analysis revealed a robust first factor that accounted for a significant proportion of variance in brain-wide resting-state functional connectivity. This factor was significantly associated with SES, indicating that children from lower SES backgrounds exhibited distinct connectivity patterns. Additionally, the factor was linked to both BMI and systolic blood pressure, suggesting its relevance to cardiometabolic health. Mediation analysis showed that this connectivity factor partially mediated the relationship between SES and both BMI and systolic blood pressure. Conclusions: Brain-wide functional connectivity may be a mediator of SES effects on BMI and blood pressure in children. The first connectivity factor provides a promising neural signature linking SES with cardiometabolic risk. 
Comprehensive brain-wide approaches to functional connectivity may offer valuable insights into how social determinants of health shape neural and physical development in childhood.
Figures
PreviousNext
Article
Open Access July 16, 2024

Management of Saltwater Intrusion in Coastal Aquifers: A Review and Case Studies from Egypt

Abstract Groundwater is undeniably crucial to people's lives, particularly in coastal regions. Therefore, it is imperative to address this vital water source strategically and implement a management plan to maintain its optimal state. The salinization of groundwater poses a significant challenge for coastal communities, stemming from factors like excessive groundwater extraction from coastal aquifers, [...] Read more.
Groundwater is undeniably crucial to people's lives, particularly in coastal regions. Therefore, it is imperative to address this vital water source strategically and implement a management plan to maintain its optimal state. The salinization of groundwater poses a significant challenge for coastal communities, stemming from factors like excessive groundwater extraction from coastal aquifers, reduced recharge, rising sea levels, climate change, and other causes. Saltwater intrusion (SWI) is a prevalent issue that needs attention, as it significantly threatens groundwater quantity and quality. SWI happens when saline water infiltrates coastal aquifers, contaminating freshwater supplies. This review article aims to define SWI, explore its causes and influencing factors, and discuss various monitoring techniques. Additionally, it examines different modeling methods and management tools, including remote sensing, field surveys, modeling approaches, and optimization techniques. To mitigate the adverse effects of SWI, several control measures are outlined, along with their pros and cons. The final section reviews previous SWI studies and case studies from the Nile Delta, Sinai Peninsula, and North-West coast in Egypt. These studies offer suggestions, adaptations, and mitigation measures for future research.
Figures
PreviousNext
Review Article
Open Access June 28, 2024

Nigeria Exchange Rate Volatility: A Comparative Study of Recurrent Neural Network LSTM and Exponential Generalized Autoregressive Conditional Heteroskedasticity Models

Abstract Business merchants and investors in Nigeria are interested in the foreign exchange volatility forecasting accuracy performance because they need information on how volatile the exchange rate will be in the future. In the paper, we compared Exponential Generalized Autoregressive Conditional Heteroskedasticity with order p=1 and q= 1, (EGARCH (1,1)) and Recurrent Neural Network (RNN) based on long [...] Read more.
Business merchants and investors in Nigeria are interested in the foreign exchange volatility forecasting accuracy performance because they need information on how volatile the exchange rate will be in the future. In the paper, we compared Exponential Generalized Autoregressive Conditional Heteroskedasticity with order p=1 and q= 1, (EGARCH (1,1)) and Recurrent Neural Network (RNN) based on long short term memory (LSTM) model with the combinations of p = 10 and q = 1 layers to model the volatility of Nigerian exchange rates. Our goal is to determine the preferred model for predicting Nigeria’s Naira exchange rate volatility with Euro, Pounds and US Dollars. The dataset of monthly exchange rates of the Nigerian Naira to US dollar, Euro and Pound Sterling for the period December 2001 – August 2023 was extracted from the Central Bank of Nigeria Statistical Bulletin. The model efficiency and performance was measured with the Mean Squared Error (MSE) criteria. The results indicated that the Nigeria exchange rate volatility is asymmetric, and leverage effects are evident in the results of the EGARCH (1, 1) model. It was observed also that there is a steady increase in the Nigeria Naira exchange rate with the euro, pounds sterling and US dollar from 2016 to its highest peak in 2023. Result of the comparative analysis indicated that, EGARCH (1,1) performed better than the LSTM model because it provided a smaller MSE values of 224.7, 231.3 and 138.5 for euros, pounds sterling and US Dollars respectively.
Figures
PreviousNext
Article
Open Access August 09, 2023

Anti-Cancer and Anti-Fungal Activities of Calotropis procera: a Narrative Review

Abstract Calotropis procera, a medicinally important plant found in Asia, was explored for its anticancer and antibacterial properties in this study. The leaves of C. procera were extracted using methanol and FTIR and UV-VIS spectrophotometry were used to characterize them. Using the MTT assay and the disc diffusion test, the extract was examined for anticancer activity against the MCF7 breast cancer cell [...] Read more.
Calotropis procera, a medicinally important plant found in Asia, was explored for its anticancer and antibacterial properties in this study. The leaves of C. procera were extracted using methanol and FTIR and UV-VIS spectrophotometry were used to characterize them. Using the MTT assay and the disc diffusion test, the extract was examined for anticancer activity against the MCF7 breast cancer cell line and antibacterial activity against methicillin-resistant Staphylococcus aureus (MRSA). The methanolic fraction of C. procera was found to be efficient against the MCF7 cell line and dramatically suppressed MRSA growth. The metabolic fraction of C. procera leaves is important in suppressing the growth of the MCF7 cell line, and it has the potential to be an effective antibacterial agent, according to our findings. The implications of Calotropis procera for all healthcare professionals including oncologists, physicians, pharmacists, nurses, and nutritional therapists are significant. With the increasing incidence of cancer and antibiotic-resistant bacterial infections, there is a growing need for new, effective, and safer herbal treatments.
Review Article
Open Access July 26, 2023

Compassion Fatigue in Oncology Nurses: An Integrative Review

Abstract Oncology nurses are more likely to get compassion fatigue (CF) than nurses in other fields because of the emotional stress and poor outlook of cancer patients. Because of this, the care might not be very good, the job might not be very satisfying, and there is a good chance that the patient's pain won't be noticed. Aim. To synthesize empirical evidence on compassion fatigue in order to [...] Read more.
Oncology nurses are more likely to get compassion fatigue (CF) than nurses in other fields because of the emotional stress and poor outlook of cancer patients. Because of this, the care might not be very good, the job might not be very satisfying, and there is a good chance that the patient's pain won't be noticed. Aim. To synthesize empirical evidence on compassion fatigue in order to extract the common, central, and fundamental elements that may improve nursing care. Design. An integrative review Results. Fifteen (15) studies met the eligibility criteria wherein five themes emerged. These are the level of compassion fatigue among oncology nurses, the oncology nurses' perspectives on compassion fatigue, precipitating factors leading to CF with 2 subthemes (work environment and a feeling of lack of support), the influence of compassion fatigue on the personal lives and general well-being of cancer nurses, and the consequences on the quality of oncology nurses' professional lives at work. Conclusion. CF is a significant problem for nurses who work in specialized areas such as cancer units, demonstrated as a basic incapacity to nurture others. The integration of studies provides evidence of clinical practice application which can provide better outcomes and improve nursing care. Implications for Practice. The findings provide understanding into healthcare practice on how to avoid compassion fatigue. Clinical management approaches that can mitigate compassion fatigue and its negative repercussions are presented, as well as the formation of peer support groups that have the ability to ameliorate CF.
Figures
PreviousNext
Review Article
Open Access May 14, 2023

An Assessment of Inclusive Education Experiences of Teacher-Trainees with Sensory Impairment in Colleges of Education in Ghana

Abstract Effective Inclusive education experiences can be built through structured interventions. The purpose of the study was to assess the impact of inclusive education experiences on teacher-trainees with sensory impairment in the Ghana Colleges of Educations of Ghana. The study was based on pragmatist philosophy. The study adopted convergent parallel mixed-methods approach. The population involved all [...] Read more.
Effective Inclusive education experiences can be built through structured interventions. The purpose of the study was to assess the impact of inclusive education experiences on teacher-trainees with sensory impairment in the Ghana Colleges of Educations of Ghana. The study was based on pragmatist philosophy. The study adopted convergent parallel mixed-methods approach. The population involved all 66 students with sensory (visual and hearing) impaired in the three (3) CoEs (PCE, Akropong Akwapim, WESCO, Kumasi and NJA, Wa) that practice inclusive education (IE) during the 2018/19 academic year. Purposive and census sampling techniques were used to select the three (3) colleges of education and sixty-six students for the study. The main instruments for data collection were questionnaire and focus group discussion. The quantitative data items were coded for input into the Statistical Product and Service Solutions (SPSS) version 23 software and analysed using means and standard deviations. The qualitative extracts collected into themes that were coded, analysed and interpreted. The study revealed that teacher-trainees had varied experiences on campus, while they felt welcomed into the inclusive institution; they also felt the Colleges were not well prepared to meet their needs. The physical environment was not conducive for the VI on campus. It is recommended that, College authorities should work with the MoE and agencies concerned with disability issues in the society to provide comfortable environment on College campuses for TTSI. It is also recommended that, providing a comfortable environment should include facilities and resources needed for the TTSI to learn effectively. It also involves physical arrangement of the campus environment. The TTSI, regardless of their disabilities, should be provided with an environment where their movement, their studies, their interactions with their peers and tutors are made easier to help them graduate successfully.
Figures
PreviousNext
Article
Open Access February 02, 2023

Quantifying 64 drugs, illicit substances, and D- and L- isomers in human oral fluid with liquid-liquid extraction

Abstract Although human oral fluid has become more routine for quantitative drug detection in pain management, detecting a large scope of medications and substances is costly and technically challenging for laboratories. This paper presents a quantitative assay for 64 pain medications, illicit substances, and drug metabolites in human oral fluid. The novelty of this assay is that it was developed on an [...] Read more.
Although human oral fluid has become more routine for quantitative drug detection in pain management, detecting a large scope of medications and substances is costly and technically challenging for laboratories. This paper presents a quantitative assay for 64 pain medications, illicit substances, and drug metabolites in human oral fluid. The novelty of this assay is that it was developed on an older model AB SCIEX 4000 instrument and renders obscure the need for more technical and expensive laboratory equipment. This method includes addition of internal standard and a 2-step liquid-liquid extraction and dry-down step to concentrate and clean the samples. The samples were suspended in 50% MeOH in water and separation and detection was accomplished using triple quadrupole mass spectrometry (LC-MS/MS). Separation was achieved using reverse-phase liquid chromatography with detection by LC-MS/MS. A second injection was done in negative mode to determine THC-COOH concentration as an indicator of THC. An aliquot of the (already) extracted samples was analyzed for D- and L- isomers of amphetamine and methamphetamine using a chiral column. The standard curve spanned from 5 to 2000 ng/mL for most of the analytes (1 to 2000 ng/mL for fentanyl and THC-COOH) and up to 1000 ng/mL for 13 analytes. Pregabalin and gabapentin ranged from 25 to 2000 ng/mL. The result is a low-cost method for the sensitive detection of a wide-ranging oral fluid menu for pain management. This assay has a high sensitivity, and good precision and accuracy for all analytes with an older model mass spectrometer.
Article
Open Access January 01, 2023

Analysis of D- and L- Isomers of (Meth)amphetamine in Human K2EDTA Plasma

Abstract Methamphetamine and its metabolite amphetamine are frequently abused drugs. Whether obtained legally or from clandestine laboratories it is of relevance to determine the chiral makeup of these drugs for investigative purpose. Although urine and oral fluid matrices are commonly offered, less available to independent laboratories are techniques to verify dextro (D-) or levo (L-) (meth)amphetamine [...] Read more.
Methamphetamine and its metabolite amphetamine are frequently abused drugs. Whether obtained legally or from clandestine laboratories it is of relevance to determine the chiral makeup of these drugs for investigative purpose. Although urine and oral fluid matrices are commonly offered, less available to independent laboratories are techniques to verify dextro (D-) or levo (L-) (meth)amphetamine from human K2EDTA plasma. This paper outlines the development and validation of a method that includes the addition of internal standard and a two-step liquid-liquid extraction to remove the analytes from human K2EDTA plasma by triple quadrupole mass spectrometry (LC-MS/MS). The assay was validated according to the United States Food and Drug Administration and College of American Pathologists guidelines, including assessment of the following parameters in plasma validation samples: linear range, limit of detection, lower limit of quantitation, matrix effects, inter- and intra-day assay precision and accuracy, carry over, linearity of dilution, matrix effects and stability. The outcome is a validated and reliable method for the determination of D- and L- isomer concentration of meth(amphetamine) human plasma samples that can be easily adopted by independent clinical laboratories.
Article
Open Access December 08, 2022

Antibacterial Activity of Phyllanthus Amarus (Schum and Thonn) Extract Against Salmonella Typhi Causative Agent of Typhoid Fever

Abstract The study was conducted to assess the antibacterial activity of Phyllanthus amarus (Schum and Thonn) extract against Salmonella typhi causative agent of typhoid fever at the laboratories of the Departments of Chemistry and Theoretical and Applied Biology of the College of Science, Kwame Nkrumah University of Science and Technology, Kumasi. The objectives were to determine the highest yield of crude extract of P. amarus using different proportions of water to ethanol and to determine the sensitivity of Salmonella typhi to these. Three different extraction procedures were carried out. In the first procedure, seven extraction setups each containing different proportions of the two extract (water and ethanol) were used with 10g of the plant sample. In the second procedure, eight setups were used for the two solvents. Ten grams of both fresh and dry plant sample were extracted in two different 200ml of water and in another two different 200ml of water; 20g of both fresh and dry plant sample were again extracted. The same procedure was repeated using ethanol as the solvent. In the third procedure, 10g each of fresh plant sample were boiled in 100ml and 200ml of water for 30 minutes. A sensitivity test to determine the zones of inhibition for the various plant extracts was done on Salmonella typhi isolated from human. Results from the crude yield of P. amarus using water only had the highest crude yield of 2.57g, followed by ethanol only which was 2.52g. The sensitivity studies conducted on the fresh P. amarus indicated that aqueous extract of P. amarus inhibited S. typhi to a zone of 5.00mm in 10g/200ml and 7.17mm in 20g/200ml. Ethanol extract also recorded an inhibition zone of 2.67mm and 5.33mm in 10g/200ml and 20g/200ml respectively. Again, sensitivity studies using dry P. amarus samples showed that the aqueous extracts recorded a zone of inhibition of 7.33mm in 10g/200ml and 13.50mm in 20g/200ml. 
Also ethanol extracts also recorded an inhibition zone of 6.83mm in 10g/200ml and 10.50mm in 20g/200ml. Significant differences were observed among the extracts and the control in both 10g/200ml and 20g/200ml concentrations (P<0.05). Aqueous and ethanol extracts of P. amarus proved inhibitory to S. typhi [...] Read more.
The study was conducted to assess the antibacterial activity of Phyllanthus amarus (Schum and Thonn) extract against Salmonella typhi causative agent of typhoid fever at the laboratories of the Departments of Chemistry and Theoretical and Applied Biology of the College of Science, Kwame Nkrumah University of Science and Technology, Kumasi. The objectives were to determine the highest yield of crude extract of P. amarus using different proportions of water to ethanol and to determine the sensitivity of Salmonella typhi to these. Three different extraction procedures were carried out. In the first procedure, seven extraction setups each containing different proportions of the two extract (water and ethanol) were used with 10g of the plant sample. In the second procedure, eight setups were used for the two solvents. Ten grams of both fresh and dry plant sample were extracted in two different 200ml of water and in another two different 200ml of water; 20g of both fresh and dry plant sample were again extracted. The same procedure was repeated using ethanol as the solvent. In the third procedure, 10g each of fresh plant sample were boiled in 100ml and 200ml of water for 30 minutes. A sensitivity test to determine the zones of inhibition for the various plant extracts was done on Salmonella typhi isolated from human. Results from the crude yield of P. amarus using water only had the highest crude yield of 2.57g, followed by ethanol only which was 2.52g. The sensitivity studies conducted on the fresh P. amarus indicated that aqueous extract of P. amarus inhibited S. typhi to a zone of 5.00mm in 10g/200ml and 7.17mm in 20g/200ml. Ethanol extract also recorded an inhibition zone of 2.67mm and 5.33mm in 10g/200ml and 20g/200ml respectively. Again, sensitivity studies using dry P. amarus samples showed that the aqueous extracts recorded a zone of inhibition of 7.33mm in 10g/200ml and 13.50mm in 20g/200ml. 
Also ethanol extracts also recorded an inhibition zone of 6.83mm in 10g/200ml and 10.50mm in 20g/200ml. Significant differences were observed among the extracts and the control in both 10g/200ml and 20g/200ml concentrations (P<0.05). Aqueous and ethanol extracts of P. amarus proved inhibitory to S. typhi.
Figures
PreviousNext
Article
Open Access December 08, 2022

Development of a Competitive Enzyme Immunoassay Technique for the Detection of Peanut Traces in Gluten-free Products

Abstract The aim of this work was to develop a competititve enzymeimmunoassay technique, to detect the presence of traces of peanut in gluten-free products. Specific rabbit polyclonal antiserum against peanut was used as primary antibody. The optimal antigen concentration to be immobilized on the plate and the concentration of primary antibody to be used in competition was determined. The calibration curve [...] Read more.
The aim of this work was to develop a competititve enzymeimmunoassay technique, to detect the presence of traces of peanut in gluten-free products. Specific rabbit polyclonal antiserum against peanut was used as primary antibody. The optimal antigen concentration to be immobilized on the plate and the concentration of primary antibody to be used in competition was determined. The calibration curve was fitted using increasing concentrations of an extract of peanut product. The peanut product was extracted with Tris-HCl buffer 0.0625M with 3% sodium dodecylsulfate (SDS) and 2% sulphite (S) 0,1 M. All validation parameters studied were appropriate. Commercial samples of gluten-free products were analysed with this enzyme immunoassays and a commercial ELISA kit. Significant differences were observed in the quantitative results obtained with both methods; nevertheless the developed enzyme immunoassay could be used as screening method.
Figures
PreviousNext
Article
Open Access November 13, 2022

Effects of Impacted Third Molars Extraction on Periodontal Status of Second Molar and Oral Health-Related Quality of Life

Abstract Objective: The study investigated the impacts of third molar (M3) extraction on periodontal status of adjacent second molar (M2) and oral health-related Quality of life (QoL). Methods: 272 cases with M3 were randomly divided into treatment group and control group, each of 136 cases. Questionnaire survey evaluated demographic features of the subjects. Gingival index (GI), [...] Read more.
Objective: The study investigated the impacts of third molar (M3) extraction on periodontal status of adjacent second molar (M2) and oral health-related Quality of life (QoL). Methods: 272 cases with M3 were randomly divided into treatment group and control group, each of 136 cases. Questionnaire survey evaluated demographic features of the subjects. Gingival index (GI), plaque index (PLI) and conscious symptoms of adjacent teeth of the M3 were detected in both groups at baseline (T0), 1 week (T1), 1 month (T2) and 6 months (T3) after treatment. The impacts of QoL were evaluated using OHIP-14 to measure total scores and various items of OHIP-14 at different observation point. Results: The percentage of swelling and toothache in the treatment group was significantly lower than that in the control group at T1, T2 and T3. Significant differences could be seen in GI and PLI between the treatment and control groups at T2 and T3. The total score of OHIP-14 in T2 and T3 was significantly lower than that in T0 and T1 in the treatment group. The 14 items scores and the percentage of positive reaction in the treatment group were significantly lower than those in the control group at T2 and T3. Conclusion: M3 extraction had significant impacts on the periodontal status of the second molar and quality of life. This study would provide an important basis for the prophylactic removal of M3 in clinical work.
Figures
PreviousNext
Article
Open Access November 10, 2022

Modeling and Forecasting Cryptocurrency Returns and Volatility: An Application of GARCH Models

Abstract The future of e-money is crypocurrencies, it is the decentralize digital and virtual currency that is secured by cryptography. It has become increasingly popular in recent years attracting the attention of the individual, investor, media, academia and governments worldwide. This study aims to model and forecast the volatilities and returns of three top cryptocurrencies, namely; Bitcoin, Ethereum [...] Read more.
The future of e-money is crypocurrencies, it is the decentralize digital and virtual currency that is secured by cryptography. It has become increasingly popular in recent years attracting the attention of the individual, investor, media, academia and governments worldwide. This study aims to model and forecast the volatilities and returns of three top cryptocurrencies, namely; Bitcoin, Ethereum and Binance Coin. The data utilized in the study was extracted from the higher market capitalization at 31st December, 2021 and the data for the period starting from 9th November, 2017 to 31st December 2021. The Generalised Autoregressive conditional heteroscedasticity (GARCH) type models with several distributions were fitted to the three cryptocurrencies dataset with their performances assessed using some model criterion tests. The result shows that the mean of all the returns are positive indicating the fact that the price of this three crptocurrencies increase throughout the period of study. The ARCH-LM test shows that there is no ARCH effect in volatility of Bitcoin and Ethereum but present in Binance Coin. The GARCH model was fitted on Binance Coin, the AIC and log L shows that the CGARCH is the best model for Binance Coin. Automatic forecasting was perform based on the selected ARIMA (2,0,1), ARIMA (0,1,2) and the random walk model which has the lowest AIC for ETH-USD, BNB-USD and BTC-USD respectively. This finding could aid investors in determining a cryptocurrency's unique risk-reward characteristics. The study contributes to a better deployment of investor’s resources and prediction of the future prices the three cryptocurrencies.
Figures
Figure 2 (c)
Figure 4 (b)
Figure 4 (c)
Figure 5 (b)
Figure 5 (c)
PreviousNext
PDF Html Xml
Article
Open Access August 27, 2022

Green Synthesis of Silver Nanoparticles from Various Medicinal Plants

Abstract Chemical solvents are commonly used to prevent microbial growth; dangerous to human health and have limited antibacterial properties. On the other hand, Nanoparticles made of metallic elements (such as copper, silver, and gold) have several uses in the field of biotechnology. Silver nanoparticles are more efficient in their antimicrobial, antibacterial, anti-inflammatory and anti-cancer properties. The current study aimed to determine the green synthesis of silver nanoparticles and their antibacterial activity from the aqueous extracts of leaves of Couroupita guianensis, Punica granatum, Vitex negundo, Cirtrus maxima. [...] Read more.
Chemical solvents are commonly used to prevent microbial growth; dangerous to human health and have limited antibacterial properties. On the other hand, Nanoparticles made of metallic elements (such as copper, silver, and gold) have several uses in the field of biotechnology. Silver nanoparticles are more efficient in their antimicrobial, antibacterial, anti-inflammatory and anti-cancer properties. The current study aimed to determine the green synthesis of silver nanoparticles and their antibacterial activity from the aqueous extracts of leaves of Couroupita guianensis, Punica granatum, Vitex negundo, Cirtrus maxima. AgNPs of plant extracts were prepared using silver nitrate with the respective plant extract. Then they were characterized by FTIR analysis. The respective functional groups in the synthesized silver nanoparticles were confirmed with FTIR Spectra. The antibacterial activities of the synthesized nanoparticle extract were observed by zone of inhibition. From the results, the nanoparticles synthesized from the plants extract could pave a way to formulate a drug to treat microbial infection.
Figures
PreviousNext
Article
Open Access August 22, 2022

Managing Challenges Women Face in Leadership Positions: Carl Rogers' Humanistic Approach

Abstract The purpose of the study was to examine Carl Rogers' humanistic approach to supporting women in leadership positions to make a formed decision on how to manage the challenges they face. A qualitative approach was adopted for the study. The population of the study included fifteen (15) headmistresses and housemistresses in the Senior High Schools in the New Juaben Municipality in the Eastern [...] Read more.
The purpose of the study was to examine Carl Rogers' humanistic approach to supporting women in leadership positions to make a formed decision on how to manage the challenges they face. A qualitative approach was adopted for the study. The population of the study included fifteen (15) headmistresses and housemistresses in the Senior High Schools in the New Juaben Municipality in the Eastern Region. The purposive sampling technique was used to select schools, headmistresses and housemistresses in the schools. The schools selected were Oyoko Methodist Senior High School (OMESS), SDA Senior High School (SEDASS), Ghana Senior High School (GHANASS), Koforidua Technical Institute (KOTECH), Nana Kwaku Boateng Senior High School (OBOSS) and New Juaben Senior High School (NJUASCO). The main instruments used for data collection were a semi-structured interview guide and Reflective dialogue. Data was analysed through the use of the thematic approach. Within-case and across-case analytical technique was used to analyse the qualitative data. This was done through the identification of themes, categories and sub-categories the analytical tool for the qualitative data through interviews and reflective dialogue (RD). Themes that were extracted from the interview corresponding to Carl Rogers' humanistic person-centred) the approach were; inherent potentialities, support, motivation, power relation, INSET, cultural dimension, and guidance and counselling. The study revealed that women face challenges using inadequate school facilities they do their best possible to manage their challenges with the few facilities available, the introduction of Carl Rogers' intervention, and women became more aware of their potential in managing the challenges they face at work in their leadership positions. It is recommended that guidance programmes should be conducted often to inform both teachers and students about the potential of women. 
It has also emerged that women leaders do not know who they are and therefore they should undergo counselling in order to be self-actualised.
Figures
PreviousNext
Article
Open Access April 25, 2022

Green Synthesis, Characterization and Antimicrobial Potency of Silver Nanoparticles from Psidium guajava Leaf Extract

Abstract In this Research work, Silver Nanoparticles were green synthesized from Psidium guajava leaves and different Characterization techniques including UV-Visible, FT-IR, SEM and XRD were all employed to ascertain the absorption peaks, functional group, surface morphology and crystalline size of the nanoparticles respectively. These nanoparticles green synthesized were applied against four [...] Read more.
In this research work, silver nanoparticles were green-synthesized from Psidium guajava (guava) leaves, and characterization techniques including UV-Visible spectroscopy, FT-IR, SEM, and XRD were employed to determine the absorption peaks, functional groups, surface morphology, and crystallite size of the nanoparticles, respectively. The green-synthesized nanoparticles were applied against four different pathogens, namely S. aureus (a Gram-positive bacterium), E. coli (a Gram-negative bacterium), C. albicans (a yeast), and Aspergillus niger (a mold), and the investigation showed that the synthesized silver nanoparticles were potent against all of them. UV-Vis spectral analysis showed the highest absorption peaks at 400 nm and 500 nm, reflecting the surface plasmon resonance characteristic of silver nanoparticles. In the FT-IR spectrum of the sample under study, peaks were observed at 3416.85 cm⁻¹, 2923.51 cm⁻¹, 1618.95 cm⁻¹, 1384.49 cm⁻¹, and 1033.63 cm⁻¹: the band at 3416.85 cm⁻¹ corresponds to N-H stretching, the band at 2923.51 cm⁻¹ is associated with C-H stretching of alkanes and O-H stretching, the peak at 1618.95 cm⁻¹ indicates C=C stretching, 1384.49 cm⁻¹ reveals C-H bending, and 1033.63 cm⁻¹ indicates C-O stretching. SEM analysis revealed that the nanoparticles are spherical in shape, while the XRD result showed that the average crystallite size of the green-synthesized Ag NPs was 45.5 nm, calculated using the Scherrer formula. Augmentin was used as a control at a concentration of 300 μg/L throughout the antimicrobial studies, and silver nanoparticle concentrations of 100, 200, 300, 400, and 500 μg/L were tested against each pathogen. In general, the zone of inhibition increased with increasing nanoparticle concentration for every pathogen. At the highest concentration of 500 μg/L, the zones of inhibition were 22.50 mm, 17.00 mm, 15.44 mm, and 13.23 mm for E. coli, S. aureus, C. albicans, and Aspergillus niger, respectively. At every concentration investigated, E. coli showed a larger zone of inhibition than the other pathogens.
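The 45.5 nm crystallite size reported above comes from the Scherrer formula, D = Kλ/(β·cos θ). A minimal Python sketch, assuming the common Cu Kα wavelength (0.15406 nm) and shape factor K = 0.9; the peak width and position are illustrative, not values from the study:

```python
import math

def scherrer_size(fwhm_deg, two_theta_deg, wavelength_nm=0.15406, k=0.9):
    """Crystallite size D = K*lambda / (beta * cos(theta)),
    where beta is the peak FWHM in radians and theta is half of 2-theta."""
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2)
    return k * wavelength_nm / (beta * math.cos(theta))

# Illustrative XRD peak: FWHM of 0.2 degrees at 2-theta = 38 degrees
size = scherrer_size(0.2, 38.0)  # roughly 42 nm for these inputs
```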
Article
Open Access December 18, 2021

An Application of Remote Sensing Imagery for Geological Lineaments Extraction over Kaybarkuh Region in East of Iran

Abstract
Kaybarkuh (Mount Kaybar) consists of intrusive igneous bodies of two age periods, located north of the termination of the Dasht-e-Bayaz left-lateral fault. The spatial and structural analysis of fracture and dike networks may allow accurate identification of mineralization zones in the area. This study aims to characterize the lineament network in the study area by three methods: automatic extraction from multispectral Landsat 8 Operational Land Imager (OLI) satellite images, visual extraction of lineaments from Landsat-8 and SENTINEL-2 images, and extraction of the drainage network as lineaments from digital elevation models (DEMs), with validation against the known fault network of the area. The results showed a significant agreement among the lineament trends obtained by the three methods, with an overall trend of about N330°. This can indicate a tensile regime oriented perpendicular to that trend, resulting from the activity of the Dasht-e-Bayaz fault. Finding more evidence requires further studies.
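A dominant trend such as N330° is the kind of summary statistic obtained by averaging lineament azimuths. Because lineaments are undirected (axial) data, a common approach is to double the angles, take a circular mean, and halve the result; the azimuth list below is illustrative, not from the study. Note that the result is reported modulo 180°, so N330° appears as 150°:

```python
import math

def mean_lineament_trend(azimuths_deg):
    """Mean trend of axial (undirected) line data: double the angles,
    average the unit vectors, then halve the resultant direction."""
    s = sum(math.sin(math.radians(2 * a)) for a in azimuths_deg)
    c = sum(math.cos(math.radians(2 * a)) for a in azimuths_deg)
    return (math.degrees(math.atan2(s, c)) / 2) % 180

# Illustrative azimuths clustered around N330 (equivalent to 150 mod 180)
trend = mean_lineament_trend([325, 330, 335, 328, 332])
```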
Article
Open Access December 18, 2021

Phytochemical Analysis and Evaluation of Bioactivities of Cola acuminata Extracts

Abstract
Background: After centuries of evolution, knowledge, and technological progress, mankind is rediscovering nature. Currently, the control of bacterial infections is becoming complex due to antibiotic resistance, which has become a significant global health problem. The aim was to determine and compare the phytochemical constituents and to evaluate in vitro the antimicrobial and antioxidant activities of aqueous, methanol, acetate, and dichloromethane extracts from Cola acuminata nuts grown in the Nord-Ubangi Province, DRC. Methods: The nuts of Cola acuminata were harvested in April 2016 in Yakoma city, Nord-Ubangi, DRC. The microscopic features of this species were examined in order to identify specific histological structures. Three bacterial strains, namely Staphylococcus aureus ATCC 25923, Escherichia coli ATCC 8739, and Pseudomonas aeruginosa ATCC 9027, were used for the assessment of antibacterial activity. Qualitative and quantitative phytochemical screening was used for compound identification across the different fractions, and the fractions that presented a good extraction yield were used for further analysis. Antioxidant activity was evaluated using ABTS and DPPH scavenging tests, while antibacterial activity was assessed using the diffusion method. Findings: The micrography of C. acuminata revealed the presence of the following histological elements: fibers, spiral vessels, trichomes, ovoid starch grains, sclerenchyma, and fragments of endosperm. Only the methanol and aqueous extracts presented a good extraction yield. The phytochemical screening showed the presence of flavonoids, anthocyanins, terpenes, iridoids, and tannins. All fractions showed IC50 values lower than 10 µg/mL in the ABTS test and lower than 100 µg/mL in the DPPH test. The antibacterial activity of this plant was low against the three strains used. Conclusion: Given the potency of C. acuminata and the different biological activities displayed, further analyses are required to identify and purify the active ingredients, to study toxicity on cell lines in vitro, to perform in vivo experiments, and to test for other activities such as anti-hypoglycemic and anti-inflammatory activity.
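IC50 values like those reported from the ABTS and DPPH assays are typically derived from a dose-response series: percent scavenging is computed from control and sample absorbances, and the concentration giving 50% inhibition is interpolated. A sketch with illustrative data (not the study's measurements):

```python
def scavenging_pct(abs_control, abs_sample):
    """Radical-scavenging activity as a percentage of the control absorbance."""
    return 100 * (abs_control - abs_sample) / abs_control

def ic50_interp(concs, inhibitions):
    """Concentration giving 50% inhibition, by linear interpolation
    between the two bracketing points (concs sorted ascending)."""
    points = list(zip(concs, inhibitions))
    for (c1, i1), (c2, i2) in zip(points, points[1:]):
        if i1 <= 50 <= i2:
            return c1 + (50 - i1) * (c2 - c1) / (i2 - i1)
    raise ValueError("50% inhibition not bracketed by the data")

# Illustrative dose-response points (concentrations in ug/mL)
ic50 = ic50_interp([1, 5, 10, 50], [20, 45, 60, 90])
```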
Article
Open Access August 09, 2021

Optimization and Prediction of Biodiesel Yield from Moringa Seed Oil and Characterization

Abstract
In this study, oil was extracted from Moringa seed using mechanical and solvent methods. To transesterify the oil into biodiesel, a 2⁴ factorial design of experiments was used to obtain the different combinations of factors at different levels of reaction temperature, catalyst amount, reaction time, and alcohol-to-oil ratio, giving rise to 48 experimental runs. The oil sample was transesterified in all 48 runs, and in each case the biodiesel yield was recorded as a percentage. The biodiesel was then characterized according to ASTM test protocols. A factorial design model was developed using Design Expert 7.0; the model achieved an R of 0.987 and a Mean Square Error (MSE) of 5.0453 and was used to predict and optimize biodiesel yield. An Artificial Neural Network (ANN) model was developed in MATLAB R2016a using 4 input variables and 30 runs; the remaining 18 runs were used to test the ANN model and compare predicted against experimental biodiesel yields, and the model achieved an R value of 0.99687 and an MSE of 3.50804. It was found that the solvent method yielded more oil than the mechanical method and that the biodiesel has good thermo-physical properties. An optimum biodiesel yield of 91.45% was obtained at a 5:1 alcohol/oil molar ratio, 18.89 wt% catalyst amount, 45 minutes reaction time, and a reaction temperature of 45 °C. Experimental validation yielded 88.33% biodiesel. The ANN model adequately predicted the remaining 18 runs with an R² value of 0.99649 and an MSE of 4.914243. Both models proved adequate for predicting biodiesel yield, but the ANN model proved more adequate.
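A 2⁴ full factorial design enumerates all 16 combinations of four two-level factors; with three replicates per combination that gives the 48 runs mentioned above. A sketch of how such a run table is generated; the factor names and levels shown are illustrative assumptions, not the study's values:

```python
from itertools import product

# Two-level factors (low, high); values are illustrative, not from the study
factors = {
    "temperature_C": (45, 65),
    "catalyst_wt_pct": (1.0, 2.0),
    "time_min": (45, 90),
    "alcohol_oil_ratio": (5, 7),
}

def full_factorial(factors, replicates=3):
    """All level combinations of the given factors, repeated per replicate."""
    names = list(factors)
    runs = []
    for levels in product(*(factors[n] for n in names)):
        for _ in range(replicates):
            runs.append(dict(zip(names, levels)))
    return runs

runs = full_factorial(factors)  # 2**4 = 16 combinations x 3 replicates = 48 runs
```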
Article
Open Access December 27, 2020

Exploring AI Algorithms for Cancer Classification and Prediction Using Electronic Health Records

Abstract
Uncontrolled cell division leads to cancer, a condition that is often incurable. Early diagnosis has the potential to lower death rates from breast cancer, the most frequent cancer in women worldwide, and imaging studies of the breast may help doctors find and diagnose the disease. This study explores the effectiveness of DL and ML models in the classification of mammography images for breast cancer detection, utilizing the publicly available CBIS-DDSM dataset, which comprises 5,000 images evenly divided between benign and malignant cases. To improve diagnostic accuracy, models such as Gaussian Naïve Bayes (GNB), CNNs, KNN, and MobileNetV2 were assessed using performance measures including F1-score, recall, accuracy, and precision. The methodology involved data preprocessing techniques, including transfer learning and feature extraction, followed by data splitting for robust model training and evaluation. Findings indicate that MobileNetV2 achieved the highest accuracy of 99.4%, significantly outperforming GNB (87.2%), CNN (96.7%), and KNN (91.2%). The investigation also made use of confusion matrices and ROC curves to evaluate model performance, demonstrating MobileNetV2's outstanding capacity to distinguish benign from malignant cases.
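The reported accuracy, precision, recall, and F1-score all derive from the binary confusion matrix. A small sketch with illustrative counts (not the study's actual matrix), chosen so that the accuracy comes out near the 99.4% figure:

```python
def metrics_from_confusion(tp, fp, fn, tn):
    """Accuracy, precision, recall, and F1 from a binary confusion matrix."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# Illustrative counts for a balanced 5,000-image evaluation
acc, prec, rec, f1 = metrics_from_confusion(tp=2480, fp=10, fn=20, tn=2490)
```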
Review Article
Open Access October 15, 2022

Big Data and AI/ML in Threat Detection: A New Era of Cybersecurity

Abstract
The unrelenting proliferation of data, entwined with the prevalence of mobile devices, has given birth to an unprecedented growth of information obscured by noise. With the Internet of Things and myriad endpoint devices generating vast volumes of sensitive and critical data, organizations are tasked with extracting actionable intelligence from this deluge. Governments and enterprises alike, even under pressure from regulatory boards, have strived to harness the power of data and leverage it to enhance safety and security, maximize performance, and mitigate risks. However, the adversaries themselves have capitalized on the unequal battle of big data and artificial intelligence to inflict widespread chaos. Therefore, the demand for big data analytics and AI/ML for high-fidelity intelligence, surveillance, and reconnaissance is at its highest. Today, in the cybersecurity realm, the detection of adverse incidents poses substantial challenges due to the sheer variety, volume, and velocity of deep packet inspection data. State-of-the-art detection techniques have fallen short of detecting the latest attacks after a big data breach incident. On the other hand, computational intelligence techniques such as machine learning have reignited the search for solutions for diverse monitoring problems. Recent advancements in AI/ML frameworks have the potential to analyze IoT/edge-generated big data in near real-time and assist risk assessment and mitigation through automated threat detection and modeling in the big data and AI/ML domain. Industry best practices and case studies are examined that endeavor to showcase how big data coupled with AI/ML unlocks new dimensions and capabilities in improved vigilance and monitoring, prediction of adverse incidents, intelligent modeling, and future uncertainty quantification by data resampling correction. 
All of these avenues lead to enhanced robustness, security, safety, and performance of industrial processes, computing, and infrastructures. A view of the future is also presented, discussing the potential threats arising from the misuse of new technologies, from bandwidth and IoT/edge to blockchain, AI, quantum computing, and autonomous systems. Cybersecurity is again playing out at a pace set by adversaries with low entry barriers and debilitating tools. The need for innovative solutions for defense against the emerging threat landscape, harnessing the power of new technologies and collaboration, is emphasized.
Article
Open Access December 27, 2021

Leveraging AI in Urban Traffic Management: Addressing Congestion and Traffic Flow with Intelligent Systems

Abstract
Traffic congestion across the globe is a multimodal problem, intertwining vehicular, pedestrian, and bicycle traffic. The relationship between the multimodal traffic flow is a key factor in understanding urban traffic dynamics. The impact of excessive congestion extends to the excessive cost spent on traffic maintenance, as well as the inherent transportation inefficiency and delayed travel times. From an urban transportation standpoint, an immediate consideration on one hand is monitoring traffic conditions and demand cycles, while on the other hand inducing flow modifications that benefit the traffic network and mitigate congestion. Embedded and centralized control systems that characterize modern traffic management systems extract traffic conditions specific to their regions but lack communication between networks. Moreover, innovative methods are required to provide more accurate up-to-date traffic forecasts that characterize real-world traffic dynamics and facilitate optimal traffic management decisions. In this chapter, we briefly outline the main difficulties and complexities in modeling, managing, and forecasting traffic dynamics. We also compare various conventional and modern Intelligent Transportation Strategies in terms of accuracy and applicability, their performance, and potential opportunities for optimization of multimodal traffic flow and congestion reduction. This chapter introduces various proposed data-driven models and tools employed for traffic flow prediction and management, investigating specific strategies' strengths, weaknesses, and benefits in addressing various real-world traffic management problems. We describe that the design phase of dependable Intelligent Transportation Systems bears unique requirements in terms of the robustness, safety, and response times of their components and the encompassing system model. 
Furthermore, this architectural blueprint shares similarities with distributed coordinated search and collective adaptive systems. Models independent of town size induce systemic performance improvements through reconfigurable embedded functionality. These AI techniques feature elaborate anytime planners that maintain near-optimal, unbiased performance as model complexity varies. Sustainable models minimize congestion during peaks, flooding, and emergencies while adhering to area-specific regulations. Security-aware and fail-safe traffic management systems provide reasonable assurance of persistent operation under varied environmental settings, including metropolises and complex traffic junctions. The chapter concludes by outlining challenges, research questions, and future research paths in the field of transportation management.
Review Article
Open Access December 21, 2016

Advanced Natural Language Processing (NLP) Techniques for Text-Data Based Sentiment Analysis on Social Media

Abstract
The field of sentiment analysis is a crucial aspect of natural language processing (NLP) and is essential for discovering the emotional undertones within text data and, hence, capturing public sentiment on a variety of issues. In this regard, this study suggests a deep learning technique for sentiment categorization on a Twitter dataset that is based on Long Short-Term Memory (LSTM) networks. Preprocessing is done comprehensively, feature extraction uses a bag-of-words method, and the data is split 80-20 into training and testing sets. The experimental findings demonstrate that the LSTM model outperforms conventional models such as SVM and Naïve Bayes, with an F1-score of 99.46%, accuracy of 99.13%, precision of 99.45%, and recall of 99.25%. Additionally, AUC-ROC and PR curves validate the model's effectiveness. Although it performs well, the model consumes heavy computational resources and requires longer training time. In summary, the results show that deep learning performs well in sentiment analysis and can be applied to social media monitoring, customer feedback evaluation, market sentiment analysis, and more.
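The bag-of-words feature extraction and 80-20 train/test split described above can be sketched in plain Python; the tiny corpus is illustrative, not the Twitter dataset:

```python
from collections import Counter

def bag_of_words(texts):
    """Build a sorted vocabulary and per-document word-count vectors."""
    vocab = sorted({w for t in texts for w in t.lower().split()})
    index = {w: i for i, w in enumerate(vocab)}
    vectors = []
    for t in texts:
        v = [0] * len(vocab)
        for w, n in Counter(t.lower().split()).items():
            v[index[w]] = n
        vectors.append(v)
    return vocab, vectors

def train_test_split_80_20(items):
    """First 80% for training, last 20% for testing (shuffle beforehand)."""
    cut = int(len(items) * 0.8)
    return items[:cut], items[cut:]

vocab, X = bag_of_words(["good movie", "bad movie", "great plot", "bad plot", "good plot"])
train, test = train_test_split_80_20(X)
```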
Review Article
Open Access December 27, 2021

An Analysis of Crime Prediction and Classification Using Data Mining Techniques

Abstract
Crime is a serious and widespread problem in society, so preventing it is an essential task. A significant number of crimes are committed every day, and they are costly to society in many ways as well as a major source of frustration for its members. Data mining is one tool for modeling and dealing with crime, and crime detection is a major area of machine learning research. This paper analyzes crime prediction and classification using data mining techniques on a crime dataset spanning 2006 to 2016. The approach begins with cleaning and extracting features from raw data for data preparation. Then, machine learning and deep learning models, including RNN-LSTM, ARIMA, and Linear Regression, are applied. The performance of these models is evaluated using metrics such as Root Mean Squared Error (RMSE) and Mean Absolute Percentage Error (MAPE). The RNN-LSTM model achieved the lowest RMSE of 18.42, demonstrating superior predictive accuracy among the evaluated models. Data visualization techniques further unveiled crime patterns, offering actionable insights for crime prevention.
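The two evaluation metrics named above, RMSE and MAPE, are straightforward to compute; a sketch with illustrative actual/predicted counts:

```python
import math

def rmse(actual, predicted):
    """Root Mean Squared Error: penalizes large deviations quadratically."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def mape(actual, predicted):
    """Mean Absolute Percentage Error, in percent (actual values must be nonzero)."""
    return 100 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# Illustrative monthly crime counts vs. model predictions
err_rmse = rmse([100, 120, 80], [110, 115, 90])
err_mape = mape([100, 120, 80], [110, 115, 90])
```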
Article
Open Access December 27, 2020

Optimizing Unclaimed Property Management through Cloud-Enabled AI and Integrated IT Infrastructures

Abstract
With unclaimed property assets reaching record levels, businesses have become, in some cases, overwhelmed and hamstrung by stagnant, unoptimized processes. That sentiment is compounded by ever-evolving regulatory changes, resulting in organizations struggling to hit compliance deadlines while delivering an optimal claimant experience. Often, early systems had periods of short-term success but are on the verge of obsolescence, resulting in stressed workflows and cumbersome integrations. Deploying an integrated IT infrastructure, supported by cloud-enabled AI, represents the quickest path to modernizing unclaimed property management. A fully integrated IT infrastructure is crucial to optimize the management of unclaimed property [1]. When lone solutions exist across an organization, companies miss out on automation opportunities generated through the interconnectedness of systems and data. AI presents organizations with the opportunity to traverse these gaps, enabling a vast library of applications to improve the perturbed workflows of unclaimed property teams. Automated data extraction, document comparison, fraudulent claim detection, and workflow completion analysis are just a few popular applications well suited for the unclaimed property space. In addition to the lagging technology currently deployed by many organizations, the unclaimed property landscape itself is evolving. Compliance issuance, asset availability, rates, the ability to collect fraudulently posted claims, and the claimant experience have all become hot-button items that are now front of mind for regulation agencies and businesses alike. Issuing duplication letters in a compliant manner, accommodating claimant inquiries regarding held assets, and managing, processing, and understanding the operational impact of rate changes are vexing problems many organizations now find themselves playing catch-up to address. 
The opportunity posed by cloud-enabled AI is furthered by economic, regulatory, and report-cycle pressures on unclaimed property teams to do more with the same or fewer resources. It is no longer simply a case of hitting the audit deadline and checking off a box, but an emerging priority for businesses on all sides of the market, from Fortune 500 to mid-market firms. In-house shared service teams are comfortable in areas of monitoring and curating business data; however, unclaimed property is unknown territory with a learning curve, compliance gaps, and operational holes that, if ignored, stand to scale up exponentially. The combined fallout from regulatory changes and the recent pandemic has only made the situation riskier, with increased volatility in balancing time-sensitive tasks against stringent regulatory deadlines and growing claimant outreach.
Review Article
Open Access December 29, 2020

Enhancing Government Fiscal Impact Analysis with Integrated Big Data and Cloud-Based Analytics Platforms

Abstract
While several application domains are exploiting the added value of analytics over various datasets to obtain actionable insights and drive decision making, the public policy management domain has not yet taken advantage of the full potential of such analytics and data models. To this end, in this paper the authors present the overall architecture of a cloud-based environment that facilitates data retrieval and analytics, as well as policy modelling, creation and optimization. The environment enables data collection from heterogeneous sources, linking and aggregation, complemented with data cleaning and interoperability techniques. An innovative approach for analytics as a service is introduced and linked with a policy development toolkit, which is an integrated web-based environment to fulfil the requirements of the public policy ecosystem stakeholders [1]. Large information databases on various public issues exist, but their usage for public policy formulation and impact analysis has been limited so far, as no cloud-based service ecosystem exists to facilitate their efficient exploitation. With the increasing availability and importance of both public big and traditional data, the need to extract, link and utilize such information efficiently has arisen. Current data-driven web technologies and models are not aligned with the needs of this domain, and therefore, potential candidates for big data, cloud-based and service-oriented public policy analysis solutions should be investigated, piloted and demonstrated [2]. This paper presents the conceptual architecture of such an ecosystem based on the capabilities of state-of-the-art cloud and web technologies, as well as the requirements of its users.
Review Article
Open Access November 24, 2022

Bridging Traditional ETL Pipelines with AI Enhanced Data Workflows: Foundations of Intelligent Automation in Data Engineering

Abstract
Machine Learning (ML) and Artificial Intelligence (AI) are having an increasingly transformative impact on all industries and are already used in many mission-critical use cases in production, bringing considerable value. Data engineering, which combines ETL pipelines with other workflows managing data and machine learning operations, is also significantly impacted. The Intelligent Data Engineering and Automation framework offers the groundwork for intelligent automation processes. However, ML/AI are not the only disruptive forces; new Big Data technologies inspired by Web2.0 companies are also reshaping the Internet. Companies having the largest Big Data footprints not only provide applications with a Big Data operational model but also source their competitive advantage from data in the form of AI services and, consequently, impact the cost/performance equilibrium of ETL pipelines. All these technologies and reasons help explain why the traditional ETL pipeline design should adapt to current and emerging technologies and may be enhanced through artificial intelligence.
Article
Open Access December 24, 2022

Cloud Native ETL Pipelines for Real Time Claims Processing in Large Scale Insurers

Abstract
Cloud native ETL pipelines support the extract and transform phases of real time claims processing in large scale insurers. The cloud native approach offers dramatic improvements in scalability, reliability, resiliency and agility as well as seamless integration with the diverse set of data sources, destinations and technologies characteristic of large scale insurers. The ETL process extracts data from source systems such as core transaction, fraud, customer and accounting processes, transforms the data to create a usable format for analytics and other applications, and loads the resulting tables into business intelligence or data lake systems for subsequent storage and analysis. By addressing these two phases of the overall ETL process, cloud native ETL pipelines can provide timely, reliable and consistent data to data scientists, actuaries, underwriters and other analysts. Real time processing represents a key priority within the overall claims process: faster, more accurate claim approvals reduce insurer costs, improve customer service and enhance premium pricing. As a result, a variety of claims related use cases are moving from batch to real time.
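The transform phase described above normalizes raw claim records into an analytics-ready shape before loading. A minimal sketch; the field names, record layout, and fraud-score threshold are illustrative assumptions, not part of any specific insurer's schema:

```python
def transform_claim(raw):
    """Normalize a raw claim record into an analytics-ready row:
    typed amount, canonical status, and a derived fraud flag."""
    return {
        "claim_id": raw["id"],
        "amount_usd": round(float(raw["amount"]), 2),
        "status": raw["status"].strip().upper(),
        "fraud_flag": raw.get("fraud_score", 0.0) > 0.8,
    }

# Illustrative raw records as extracted from source systems
rows = [transform_claim(r) for r in [
    {"id": "C-1001", "amount": "2499.99", "status": " open ", "fraud_score": 0.91},
    {"id": "C-1002", "amount": "130.0", "status": "closed"},
]]
```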
Review Article
Open Access December 21, 2021

Optimizing Data Warehousing for Large Scale Policy Management Using Advanced ETL Frameworks

Abstract
Data warehousing is a technique for collecting, managing, and presenting data to help people analyze and use that data effectively. It involves a large database designed to support management-level staff by providing all the relevant historical data for analysis. This chapter begins with a definition of data warehousing, followed by an overview of large-scale policy management to highlight the need for data warehousing. Next, an overview of an ETL framework is presented, along with a discussion of advanced ETL techniques. The chapter concludes with an outline of performance optimization techniques for data warehousing. Data warehousing is considered a key enabler for efficient reporting and analysis, with implementation choices ranging from cost-effective desktop systems to large-scale, mission-critical data marts and warehouses containing petabytes of data. Extract, transform, and load (ETL) systems remain one of the largest cost and effort areas within data warehouse development projects, requiring significant planning and resources to build, manage, and monitor the flow of data from source systems into the data warehouse. The technology and techniques used for ETL can greatly influence the success or failure of a data warehouse. Complex business requirements for data cleansing, loading, transformation, and integration have intensified, while operational plans for real-time and near-real-time reporting add additional challenges. Parallel loading mechanisms, incremental data loading, and runtime update and insert strategies not only improve ETL performance but also optimize data warehousing performance, particularly for large-scale policy management.
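The runtime update-and-insert strategy mentioned above can be sketched as a simple merge (upsert) of an incremental change batch into a warehouse table; the key name and rows are illustrative:

```python
def incremental_load(warehouse, batch, key="policy_id"):
    """Merge a change batch into a warehouse table: update rows whose
    key already exists, append rows that are new (a simple upsert)."""
    index = {row[key]: i for i, row in enumerate(warehouse)}
    for row in batch:
        if row[key] in index:
            warehouse[index[row[key]]] = row   # update existing row
        else:
            warehouse.append(row)              # insert new row
    return warehouse

# Illustrative policy table and an incremental change batch
table = [{"policy_id": 1, "premium": 500}]
incremental_load(table, [{"policy_id": 1, "premium": 550},
                         {"policy_id": 2, "premium": 300}])
```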