Open Access June 02, 2025

Residual Sets and the Density of Binary Goldbach Representations

Abstract A residual-set framework is introduced for analyzing additive prime conjectures, with particular emphasis on the Strong Goldbach Conjecture (SGC). For each even integer E_n ≥ 4, the residual set R(E_n) = {E_n − p : p < E_n, p prime} is defined, and the universal residual set E = ⋃_n R(E_n) is constructed. It is shown that E contains infinitely many primes. A nontrivial constructive lower bound is derived, establishing that the number of Goldbach partitions satisfies G(E_n) ≥ 2 for all E_n ≥ 8, and that the cumulative partition count satisfies ∑_{E_n ≤ N} G(E_n) ≳ N²/log⁴ N. An optimized deterministic algorithm is implemented to verify the SGC for even integers of up to 16,000 digits. Each computed partition E_n = p + q is validated using elliptic curve primality testing, and no exceptions are observed. Runtime variability observed in the empirical tests corresponds with known fluctuations in prime density and modular residue distribution. A recursive construction is formulated for generating Goldbach partitions, using residual descent and leveraging properties of the residual sets. The method extends naturally to Lemoine's Conjecture, which asserts that every odd integer n ≥ 7 can be expressed as n = p + 2q, where p and q are primes. A corresponding residual formulation is developed, and it is proven that at least two valid partitions exist for all n ≥ 9. Comparative analysis with the Hardy-Littlewood and Chen estimates is provided to contextualize the cumulative growth rate. The residual-set methodology offers a deterministic, scalable, and structurally grounded approach to additive problems in prime number theory, supported by both theoretical results and large-scale computational evidence.
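The residual-set construction described in the abstract can be sketched in a few lines. The fragment below is an illustrative toy only: it uses trial-division primality testing on small inputs, not the paper's optimized deterministic algorithm or its elliptic-curve validation step, and the function names are ours.

```python
# Toy sketch of the residual-set view of Goldbach partitions.
# Trial division only; the paper's optimized algorithm and ECPP
# validation for 16,000-digit inputs are not reproduced here.

def is_prime(n):
    """Deterministic trial-division primality test (small n only)."""
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:
        if n % d == 0:
            return False
        d += 2
    return True

def residual_set(e):
    """R(E_n) = {E_n - p : p < E_n, p prime}."""
    return {e - p for p in range(2, e) if is_prime(p)}

def goldbach_partitions(e):
    """Unordered partitions E_n = p + q, found as primes in R(E_n)."""
    return sorted((e - r, r) for r in residual_set(e)
                  if is_prime(r) and e - r <= r)
```

For example, `goldbach_partitions(10)` yields the pairs (3, 7) and (5, 5); a prime q lies in the residual set exactly when E_n − q is also prime, which is what makes the residual set a natural search space for partitions.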
Article
Open Access January 11, 2025

Exploring LiDAR Applications for Urban Feature Detection: Leveraging AI for Enhanced Feature Extraction from LiDAR Data

Abstract The integration of LiDAR and Artificial Intelligence (AI) has revolutionized feature detection in urban environments. LiDAR systems, which utilize pulsed laser emissions and reflection measurements, produce detailed 3D maps of urban landscapes. When combined with AI, this data enables accurate identification of urban features such as buildings, green spaces, and infrastructure. This synergy is crucial for enhancing urban development, environmental monitoring, and smart city governance. LiDAR, known for its high-resolution 3D data capture capabilities, paired with AI, particularly deep learning algorithms, facilitates advanced analysis and interpretation of urban areas. This combination supports precise mapping, real-time monitoring, and predictive modeling of urban growth and infrastructure. For instance, AI can process LiDAR data to identify patterns and anomalies, aiding in traffic management, environmental oversight, and infrastructure maintenance. These advancements not only improve urban living conditions but also contribute to sustainable development by optimizing resource use and reducing environmental impacts. Furthermore, AI-enhanced LiDAR is pivotal in advancing autonomous navigation and sophisticated spatial analysis, marking a significant step forward in urban management and evaluation. The reviewed paper highlights the geometric properties of LiDAR data, derived from spatial point positioning, and underscores the effectiveness of machine learning algorithms in object extraction from point clouds. The study also covers concepts related to LiDAR imaging, feature selection methods, and the identification of outliers in LiDAR point clouds. Findings demonstrate that AI algorithms, especially deep learning models, excel at analyzing high-resolution 3D LiDAR data for accurate urban feature identification and classification. These models leverage extensive datasets to detect patterns and anomalies, improving the detection of buildings, roads, vegetation, and other elements. Automating feature extraction with AI minimizes the need for manual analysis, thereby enhancing urban planning and management efficiency. Additionally, AI methods continually improve as more data become available, leading to increasingly precise feature detection. The results indicate that the pulse emitted by continuous-wave LiDAR sensors changes when encountering obstacles, causing discrepancies in measured physical parameters.
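As an illustration of the outlier-identification step the abstract mentions, the sketch below implements a generic statistical outlier filter for a point cloud: points whose mean distance to their k nearest neighbours is far above the global average are flagged. The function name and the `k` and `std_ratio` parameters are illustrative choices, not values from the reviewed paper.

```python
# Generic statistical outlier removal for a small LiDAR point cloud.
# Illustrative only; parameters are not taken from the reviewed paper.
import numpy as np

def flag_outliers(points, k=8, std_ratio=2.0):
    """points: (N, 3) array. Returns a boolean mask, True = outlier."""
    # Pairwise distances (O(N^2) memory; fine for a demonstration).
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    dists.sort(axis=1)
    # Mean distance to the k nearest neighbours, skipping self (distance 0).
    mean_knn = dists[:, 1:k + 1].mean(axis=1)
    threshold = mean_knn.mean() + std_ratio * mean_knn.std()
    return mean_knn > threshold
```

Production pipelines would use a spatial index (e.g. a k-d tree) rather than the dense distance matrix, but the statistical criterion is the same.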
Article
Open Access January 10, 2025

Artificial Immune Systems: A Bio-Inspired Paradigm for Computational Intelligence

Abstract Artificial Immune Systems (AIS) are bio-inspired computational frameworks that emulate the adaptive mechanisms of the human immune system, such as self/non-self discrimination, clonal selection, and immune memory. These systems have demonstrated significant potential in addressing complex challenges across optimization, anomaly detection, and adaptive system control. This paper provides a comprehensive exploration of AIS applications in domains such as cybersecurity, resource allocation, and autonomous systems, highlighting the growing importance of hybrid AIS models. Recent advancements, including integrations with machine learning, quantum computing, and bioinformatics, are discussed as solutions to scalability, high-dimensional data processing, and efficiency challenges. Core algorithms, such as the Negative Selection Algorithm (NSA) and Clonal Selection Algorithm (CSA), are examined, along with limitations in interpretability and compatibility with emerging AI paradigms. The paper concludes by proposing future research directions, emphasizing scalable hybrid frameworks, quantum-inspired approaches, and real-time adaptive systems, underscoring AIS's transformative potential across diverse computational fields.
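The Negative Selection Algorithm named in the abstract can be sketched briefly: random detectors are generated and kept only if they fail to match any "self" sample, and surviving detectors then flag non-self (anomalous) inputs. Binary strings with Hamming-distance matching are one common textbook instantiation, chosen here for brevity; the function names and parameters are ours, not the paper's.

```python
# Toy Negative Selection Algorithm (NSA) over binary strings.
# Hamming-distance matching is an illustrative choice, not the
# only matching rule used in the AIS literature.
import random

def hamming(a, b):
    """Number of positions at which two bit tuples differ."""
    return sum(x != y for x, y in zip(a, b))

def train_detectors(self_set, n_detectors, length, radius, seed=0):
    """Keep random detectors that match no self sample (negative selection)."""
    rng = random.Random(seed)
    detectors = []
    while len(detectors) < n_detectors:
        d = tuple(rng.randint(0, 1) for _ in range(length))
        if all(hamming(d, s) > radius for s in self_set):
            detectors.append(d)
    return detectors

def is_anomalous(sample, detectors, radius):
    """A sample is non-self if any detector matches it within the radius."""
    return any(hamming(sample, d) <= radius for d in detectors)
```

By construction no detector matches a self sample, so self inputs are never flagged; detection coverage of non-self space grows with the number of detectors.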
Article
Open Access November 01, 2023

Individual Wave Component Signal Modeling, Parameters Extraction, and Analysis

Abstract The accurate estimation of Individual Wave Components (IWC) is crucial for automated diagnosis of the human digestive system in a clinical setting. However, this process can be challenging due to signal contamination by other sources in the body, such as the lungs and heart, as well as environmental noise. To address this issue, various denoising techniques are commonly employed in bowel sound signal processing. While denoising is important, it increases computational complexity, making it challenging for portable devices. Therefore, signal processing algorithms often require a trade-off between fidelity and computational complexity. This study aims to evaluate a previously developed IWC parameter extraction algorithm and to reconstruct the IWC without denoising, using synthetic and clinical data. To that end, a reliable model for creating synthetic data is paramount: rigorous testing of the algorithm is limited by the quality and quantity of available recorded data. To overcome this challenge, a mathematical model is proposed to generate synthetic bowel sound data that can be used to test new algorithms. The proposed algorithm's robust performance is evaluated using both synthetic and clinically recorded data. We perform time-frequency analysis of original and reconstructed bowel sound signals in various digestive system states and characterize the performance using Monte Carlo simulation when denoising is not applied. Overall, our study presents a promising algorithm for accurate IWC estimation that can be useful for predicting anomalies in the digestive system.
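A damped-sinusoid burst is one simple way to simulate a short bowel-sound event. The sketch below is a hypothetical stand-in for the paper's synthetic-data model: the function names and the sampling rate, frequency, and decay parameters are illustrative assumptions, not values from the study.

```python
# Hypothetical synthetic IWC generator: a damped sinusoid burst,
# a common template for short transient biomedical sounds.
# Parameters are illustrative, not taken from the paper.
import numpy as np

def synthetic_iwc(fs=4000, duration=0.05, f0=300.0, decay=60.0, amp=1.0):
    """One IWC burst: amp * exp(-decay*t) * sin(2*pi*f0*t)."""
    t = np.arange(int(fs * duration)) / fs
    return amp * np.exp(-decay * t) * np.sin(2 * np.pi * f0 * t)

def place_bursts(n_samples, bursts, offsets):
    """Sum bursts into a zero signal at the given sample offsets."""
    x = np.zeros(n_samples)
    for b, o in zip(bursts, offsets):
        x[o:o + len(b)] += b[:max(0, n_samples - o)]
    return x
```

Adding measured or simulated noise (heart, lung, environment) to the output of `place_bursts` yields a controlled test signal for which the true IWC parameters are known, which is what makes Monte Carlo evaluation of an extraction algorithm possible.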
