Open Access April 10, 2025

Advancements in Pharmaceutical IT: Transforming the Industry with ERP Systems

Abstract
The pharmaceutical industry is undergoing a profound transformation driven by advancements in Information Technology (IT), with Enterprise Resource Planning (ERP) systems playing a pivotal role in reshaping operations. These systems offer integrated solutions that streamline key business processes, such as production, inventory management, supply chain optimization, regulatory compliance, and data integration, contributing significantly to operational efficiency and organizational agility. This paper explores the evolution and impact of ERP systems within the pharmaceutical sector, highlighting their contributions to overcoming the industry’s inherent challenges, including complex regulatory requirements, the need for accurate and real-time data, and the demand for supply chain resilience. The integration of cloud-based ERP solutions, the incorporation of emerging technologies like Artificial Intelligence (AI), Machine Learning (ML), and the Internet of Things (IoT), and enhanced data analytics capabilities have revolutionized pharmaceutical IT. These advancements not only reduce operational costs, improve forecasting accuracy, and enhance collaboration but also ensure compliance with stringent global regulations, such as Good Manufacturing Practices (GMP) and FDA guidelines. Moreover, ERP systems have been instrumental in managing the pharmaceutical supply chain, ensuring product traceability, and improving inventory control and order fulfillment processes. This manuscript examines how ERP systems enable pharmaceutical companies to maintain high standards of product quality, improve decision-making, and ensure the safety and efficacy of drugs through robust tracking and auditing mechanisms. A case study of a pharmaceutical company that implemented an ERP system demonstrates the tangible benefits, including increased operational efficiency, improved compliance rates, and enhanced customer satisfaction. 
However, despite the clear advantages, challenges such as customization complexities, data integration issues, and resistance to change remain. As the pharmaceutical industry continues to evolve, ERP systems will remain a cornerstone of digital transformation, facilitating smarter decision-making, better resource management, and enhanced collaboration across global operations. This paper also identifies future trends, including the potential of AI and blockchain technologies in further strengthening ERP systems and transforming the pharmaceutical landscape.
Review Article
Open Access March 22, 2025

Enhancing Scalability and Performance in Analytics Data Acquisition through Spark Parallelism

Abstract
Data acquisition serves as a critical component of modern data architecture, with REST API integration emerging as one of the most common approaches for sourcing external data. This study evaluates the efficiency of various methodologies for collecting data via REST APIs and benchmarks their performance. It explores how leveraging the Spark distributed computing platform can optimize large-scale REST API calls, enabling enhanced scalability and improved processing speeds to meet the demands of high-volume data workflows.
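The partitioned-fetch pattern the study benchmarks can be sketched on a single machine: split the request space into partitions and let each worker drain one partition, which is the same shape a Spark `mapPartitions` job takes at cluster scale. This is a hedged illustration only, not the paper's benchmark code; `fetch_page` is a hypothetical stand-in for a real REST call.

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_page(page_id):
    # Hypothetical stand-in for a REST call; a real fetcher would use an
    # HTTP client and handle retries, paging, and rate limits.
    return {"page": page_id, "rows": [page_id * 10 + i for i in range(3)]}

def fetch_partition(page_ids):
    # One worker drains a whole partition of page ids, mirroring how a
    # Spark executor processes a partition inside mapPartitions.
    return [fetch_page(p) for p in page_ids]

def parallel_acquire(page_ids, num_partitions=4):
    # Split the id space into interleaved partitions.
    partitions = [page_ids[i::num_partitions] for i in range(num_partitions)]
    with ThreadPoolExecutor(max_workers=num_partitions) as pool:
        results = list(pool.map(fetch_partition, partitions))
    # Flatten per-partition result lists into one dataset.
    return [row for part in results for row in part]

data = parallel_acquire(list(range(8)))
```

In Spark the same structure would distribute `fetch_partition` across executors, which is where the scalability gains the abstract describes come from.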
Review Article
Open Access October 15, 2022

Big Data and AI/ML in Threat Detection: A New Era of Cybersecurity

Abstract
The unrelenting proliferation of data, entwined with the prevalence of mobile devices, has given birth to an unprecedented growth of information obscured by noise. With the Internet of Things and myriad endpoint devices generating vast volumes of sensitive and critical data, organizations are tasked with extracting actionable intelligence from this deluge. Governments and enterprises alike, even under pressure from regulatory boards, have strived to harness the power of data and leverage it to enhance safety and security, maximize performance, and mitigate risks. However, the adversaries themselves have capitalized on the unequal battle of big data and artificial intelligence to inflict widespread chaos. Therefore, the demand for big data analytics and AI/ML for high-fidelity intelligence, surveillance, and reconnaissance is at its highest. Today, in the cybersecurity realm, the detection of adverse incidents poses substantial challenges due to the sheer variety, volume, and velocity of deep packet inspection data. State-of-the-art detection techniques have fallen short of detecting the latest attacks after a big data breach incident. On the other hand, computational intelligence techniques such as machine learning have reignited the search for solutions for diverse monitoring problems. Recent advancements in AI/ML frameworks have the potential to analyze IoT/edge-generated big data in near real-time and assist risk assessment and mitigation through automated threat detection and modeling in the big data and AI/ML domain. Industry best practices and case studies are examined that endeavor to showcase how big data coupled with AI/ML unlocks new dimensions and capabilities in improved vigilance and monitoring, prediction of adverse incidents, intelligent modeling, and future uncertainty quantification by data resampling correction. 
All of these avenues lead to enhanced robustness, security, safety, and performance of industrial processes, computing, and infrastructures. A view of the future is presented, discussing the potential threats posed by the misuse of new technologies, from bandwidth to IoT/edge, blockchain, AI, quantum, and autonomous fields. Cybersecurity is again playing out at a pace set by adversaries with low entry barriers and debilitating tools. The need for innovative solutions to defend against the emerging threat landscape, harnessing the power of new technologies and collaboration, is emphasized.
Article
Open Access October 29, 2022

Neural Networks for Enhancing Rail Safety and Security: Real-Time Monitoring and Incident Prediction

Abstract
The growth in demand for rail transportation systems within cities, together with high-speed and long-distance transportation running on a rail network, raises the issues of both rail safety and security. If an accident or an attack occurs, its consequences can be extremely severe. To mitigate the impact of such events, real-time monitoring of the rail system is required, and improvements in monitoring can be achieved using artificial intelligence algorithms such as neural networks. Neural networks have been used to achieve real-time incident identification when monitoring track quality, by classifying the graphical outputs of an ultrasonic system inspecting the rails and track bed; to predict incidents on the rail infrastructure caused by transmission channels becoming blocked; and to schedule preemptive and preventative maintenance. For forecasting incidents and accidents on board trains, neural networks have been used to model passenger behavior and optimize responses during a train station evacuation. To tackle incidents and accidents occurring on rail transport, we contribute two methodologies to detect anomalies in real-time and identify the level of security risk: one at the maintenance level, with personnel operating along the railways, and one onboard passenger trains. These methodologies were evaluated on real-world datasets and shown to achieve high accuracy. The results from these case studies also reveal the potential for network-wide applications, which could enhance security and safety on railway networks by better managing network disruptions and more rapidly identifying security issues. The speed and coverage of the information generated by these methodologies have implications for using prediction in decision support and for enhancing safety and security across the rail network.
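One generic way to frame the real-time anomaly detection described above is a rolling z-score over a sliding window of recent sensor readings, with thresholds mapped to risk levels. This is a hedged sketch of that standard technique, not the paper's actual methodology; the window size and thresholds are illustrative assumptions.

```python
from collections import deque
from statistics import mean, stdev

class StreamingAnomalyDetector:
    """Flags readings that drift far from a sliding window of recent values."""

    def __init__(self, window=20, warn=2.0, alert=3.0):
        self.window = deque(maxlen=window)
        self.warn, self.alert = warn, alert

    def update(self, value):
        # Collect a small baseline before scoring anything.
        if len(self.window) < 5:
            self.window.append(value)
            return "baseline"
        mu, sigma = mean(self.window), stdev(self.window)
        self.window.append(value)
        if sigma == 0:
            return "ok" if value == mu else "alert"
        z = abs(value - mu) / sigma
        if z >= self.alert:
            return "alert"
        if z >= self.warn:
            return "warn"
        return "ok"

det = StreamingAnomalyDetector()
labels = [det.update(r) for r in [10, 11, 10, 12, 11, 10, 11, 12, 50]]
# The final reading (50) is far outside the window and is flagged.
```

A deployed system would feed such a detector from live track or onboard sensors and escalate "warn"/"alert" labels to operators.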
Review Article
Open Access October 30, 2022

Towards Autonomous Analytics: The Evolution of Self-Service BI Platforms with Machine Learning Integration

Abstract
Self-service business intelligence (BI) platforms have become essential applications for exploring, analyzing, and visualizing business data in various domains. Here, we envisage that the business intelligence platform will perform automatic and autonomous data analytics with minimal to no user interaction. We aim to offer a data-driven, intelligent, and scalable infrastructure that amplifies the advantages of BI systems and discovers hidden and complex insights from very large business datasets, which a business analyst can miss during manual exploratory data analysis. Towards our future vision of autonomous analytics, we propose a collective machine learning model repository with an integration layer for user-defined analytical goals within the BI platform. The proposed architecture can effectively reduce the cognitive load on users for repetitive tasks, democratizing data science expertise across data workers and enabling less experienced users to develop and use advanced machine learning and statistical algorithms.
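The model repository with an integration layer for user-defined goals can be caricatured as a registry that maps goal names to models, so a user states *what* they want and the platform resolves *how*. This is a hypothetical sketch under invented names, not the proposed architecture itself; the registered models are deliberately naive stand-ins.

```python
# Hypothetical model repository: analytical goals registered by name,
# resolved by an integration layer inside the BI platform.
MODEL_REPOSITORY = {}

def register(goal):
    def wrap(fn):
        MODEL_REPOSITORY[goal] = fn
        return fn
    return wrap

@register("forecast_sales")
def moving_average(series, window=3):
    # Naive stand-in for a real forecasting model.
    return sum(series[-window:]) / window

@register("detect_outliers")
def iqr_outliers(series):
    # Flag values outside 1.5 * IQR of the sorted series.
    s = sorted(series)
    q1, q3 = s[len(s) // 4], s[3 * len(s) // 4]
    iqr = q3 - q1
    return [x for x in series if x < q1 - 1.5 * iqr or x > q3 + 1.5 * iqr]

def run_goal(goal, data, **kwargs):
    # Integration layer: the user names a goal; the platform picks the model.
    return MODEL_REPOSITORY[goal](data, **kwargs)

forecast = run_goal("forecast_sales", [10, 12, 14, 16, 18])
```

In a full platform the repository would hold trained models with metadata, and the integration layer would also handle data access, validation, and result visualization.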
Review Article
Open Access November 05, 2022

Application of Neural Networks in Optimizing Health Outcomes in Medicare Advantage and Supplement Plans

Abstract
The growing complexity and variability in healthcare delivery and costs within Medicare Advantage (MA) and Medicare Supplement (Medigap) plans present significant challenges for improving health outcomes and managing expenditures. Neural networks, a subset of artificial intelligence (AI), have shown considerable promise in optimizing healthcare processes, particularly in predictive modeling, personalized treatment recommendations, and risk stratification. This paper explores the application of neural networks in enhancing health outcomes within the context of Medicare Advantage and Supplement plans. We review how deep learning models can be leveraged to predict patient risk, optimize resource allocation, and identify at-risk populations for preventive interventions. Additionally, we discuss the potential for neural networks to improve claims processing, reduce fraud, and streamline administrative burdens. By integrating various data sources, including medical records, claims data, and demographic information, neural networks enable more accurate and efficient decision-making processes. Ultimately, this approach can lead to better patient care, reduced healthcare costs, and improved satisfaction for beneficiaries of these programs. The paper concludes by highlighting the current limitations, ethical considerations, and future directions for AI adoption in the Medicare Advantage and Supplement sectors.
Review Article
Open Access February 22, 2023

Navigating the Pharmaceutical Supply Chain: Key Strategies for Balancing Demand and Supply

Abstract
The pharmaceutical industry is fundamental to global healthcare, providing essential medicines that improve health outcomes and quality of life. However, the demand and supply dynamics within this sector are highly complex, shaped by various factors including demographic changes, evolving disease burdens, technological advancements, regulatory challenges, and economic pressures. This manuscript explores the intricate relationship between pharmaceutical medicine demand and supply, focusing on key strategies that can help companies effectively navigate these challenges. The demand for pharmaceutical products is driven by several factors, such as population growth, the aging population, the rise of chronic diseases, and the emergence of new health threats. Additionally, healthcare accessibility, affordability, and policy changes significantly impact the consumption of medicines, while innovations in medical technologies and therapies create new treatment needs. On the supply side, pharmaceutical companies face challenges related to manufacturing capacity, raw material availability, distribution logistics, and compliance with ever-evolving global regulatory frameworks. To address these challenges, the manuscript discusses strategic approaches to managing both demand and supply in the pharmaceutical sector. Key strategies include advanced demand forecasting through data analytics, optimizing supply chains for efficiency and resilience, implementing just-in-time inventory models, and investing in flexible manufacturing systems. Furthermore, global collaboration and partnerships, as well as effective risk management practices, are highlighted as essential to ensuring the availability of medicines, particularly in times of crisis or global health emergencies. This manuscript also delves into the role of policy advocacy and regulatory harmonization in stabilizing the pharmaceutical market, ensuring that medicines are accessible to all populations. 
In conclusion, the pharmaceutical industry must continually adapt to meet the evolving challenges of demand and supply, embracing innovation and collaboration while maintaining a focus on patient access and global healthcare equity. Through strategic planning and adaptive solutions, the pharmaceutical sector can ensure the continuous availability of critical medicines worldwide, meeting both current and future health needs.
Case Report
Open Access November 19, 2022

Analyzing Behavioral Trends in Credit Card Fraud Patterns: Leveraging Federated Learning and Privacy-Preserving Artificial Intelligence Frameworks

Abstract
We investigate and analyze the trends and behaviors in credit card fraud attacks and transactions. First, we perform logical analysis to find hidden patterns and trends, then we leverage game-theoretical models to illustrate the potential strategies of both the attackers and defenders. Next, we demonstrate the strength of industry-scale, privacy-preserving artificial intelligence solutions by presenting the results from our recent exploratory study in this respect. Furthermore, we describe the intrinsic challenges in the context of developing reliable predictive models using more stringent protocols, and hence the need for sector-specific benchmark datasets, and provide potential solutions based on state-of-the-art privacy models. Finally, we conclude the paper by discussing future research lines on the topic, and also the possible real-life implications. The paper underscores the challenges in creating robust AI models for the banking sector. The results also showcase that privacy-preserving AI models can potentially augment sharing capabilities while mitigating liability issues of public-private sector partnerships [1].
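The privacy-preserving setup described above typically follows the federated learning pattern: each institution trains on its private transaction data and only model parameters, never raw records, leave the premises. The following is a hedged, minimal sketch of FedAvg-style aggregation, not the study's actual solution; the institutions, gradients, and dataset sizes are invented for illustration.

```python
def local_update(weights, gradient, lr=0.1):
    # Each institution takes one gradient step on its private data;
    # only the resulting weights leave the institution.
    return [w - lr * g for w, g in zip(weights, gradient)]

def federated_average(client_weights, client_sizes):
    # The coordinator aggregates weights, weighting each client by its
    # local dataset size (FedAvg-style aggregation).
    total = sum(client_sizes)
    dim = len(client_weights[0])
    return [
        sum(w[i] * n for w, n in zip(client_weights, client_sizes)) / total
        for i in range(dim)
    ]

# Three hypothetical institutions with private gradients.
global_w = [0.0, 0.0]
grads = [[1.0, -1.0], [0.5, 0.5], [2.0, 0.0]]
sizes = [100, 300, 100]
local_models = [local_update(global_w, g) for g in grads]
new_global = federated_average(local_models, sizes)
```

Production systems layer secure aggregation or differential privacy on top of this loop so that even the shared weight updates leak as little as possible.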
Review Article
Open Access December 27, 2019

The Role of Neural Networks in Advancing Wearable Healthcare Technology Analytics

Abstract
Neural networks are bringing a transformation in wearable healthcare technology analytics. These networks can analyze vast amounts of data to support decisions concerning patient care. Advancements in deep learning have brought neural networks to the forefront, making data analytics a straightforward process. This study helps unveil the use of ICT and AI in medical healthcare technology, surveying the practices of several industry giants. Wearable healthcare technologies are becoming more popular every day. These technologies facilitate collecting, monitoring, and sharing every vital aspect of the human body necessary for diagnosing and treating an ailment. With global digitization, health data storage and systematic analysis are taking shape to ensure better diagnostics and preventive and predictive healthcare. Healthcare analytics powered by neural networks can significantly improve health outcomes, maximizing individuals' potential and quality of life. The breadth and possibilities of connected devices are widening: from personal activity monitoring to quantifying every bit of health statistics, connected devices are making an impact on measurement, management, and manipulation. In healthcare, early diagnosis can be a lifesaver, and data analytics can help in a big way to make predictions that save lives. We are in another phase of the digitization era: neural network and wearable healthcare technology analytics. A neural network can be conceived as an adaptive system made up of a large number of neurons connected in multiple layers, processing data in a way similar to the human brain. Using a collection of algorithms, most neural networks are composed of an 'input' layer and an 'output' layer, with one or more hidden layers in between.
Review Article
Open Access December 27, 2019

Revolutionizing Patient Care and Digital Infrastructure: Integrating Cloud Computing and Advanced Data Engineering for Industry Innovation

Abstract
This work details how the integration of cloud computing and advanced data engineering can innovate and reshape patient care and digital infrastructure. In the healthcare sector, cloud services offer the necessary support to generate digitally-oriented services and service kits. These services can offer high availability, low latency, and on-demand scaling capabilities, while following the strictest data protection laws and regulations. In turn, these services can be combined with data engineering techniques to construct an ecosystem that enhances and adds an optimized data layer on any cloud environment. This ecosystem includes technologies to acquire, process, and manage healthcare data while respecting all regulatory and institutional obligations, and can be part of a comprehensive digitalization strategy. The objective is to augment the healthcare services that the industry offers by leveraging healthcare data and AI technologies. Designed services, processes, and technologies can be described either as industry-agnostic services or as healthcare-specific services that process and manage electronic healthcare records (EHR). Industry-agnostic services offer a set of tools and methodologies to conduct optimized data experiments; the goal is to exploit any variety, velocity, volume, and veracity of medical data. Healthcare-specific services offer a set of tools and methodologies to connect to any common EHR vendor in a privacy-preserving manner. Participating companies are thus able to hold, share, and make use of healthcare data in real-time. The proposed architecture can be transformative for the healthcare industry, opening up and facilitating experimentation on new and scalable service models. The transition to a more digital health approach would help overcome the limits encountered in traditional settings.
Limitations in the availability of healthcare facilities and healthcare professionals have underpinned the increasing share of telemedicine in the care process. However, record-keeping for patients who undergo care outside traditional healthcare facilities is often missing and can severely affect the continuity of treatment. Identifying new methods to implement disease prevention and early-intervention processes is crucial to avoid more extensive treatment and to support those on multiple lines of therapy. For chronic patients, a service that monitors their state of health and intervenes when parameters drift outside the desired range is crucial. These same patients are also the most dependent on the decisions of care providers; a second opinion might be given remotely, which the patient can access on demand at any time. To address these different kinds of services, an ecosystem built around a comprehensive data layer is outlined, able to live and operate seamlessly in any cloud environment. The envisioned outcome of this future work is the rapid evolution and redefinition of the European healthcare landscape.
Review Article
Open Access December 27, 2019

Data-Driven Innovation in Finance: Crafting Intelligent Solutions for Customer-Centric Service Delivery and Competitive Advantage

Abstract
Innovations in computing and communication technologies are reshaping finance. The seismic changes are casting uncertainty over the future of financial services. On one hand, fintech evangelists project a rosy future, asserting that fast-moving algorithms can deliver low-cost financial services intuitively, customized to meet robust consumer expectations. On the other hand, many finance veterans fret that the traditional banking model could be disintermediated, bleeding banks via a 'death by a thousand cuts' and reducing them to passive portfolio holders with no direct customer relationship, eclipsed by digital giants that use their enormous troves of customer data to offer banking as an added service at nearly zero cost. Amid the upbeat technological promises and apocalyptic forebodings, there are two constant, mostly agreed-upon truths. The first is the vital importance of data. Advances in the internet, cloud computing, and record-keeping technologies are producing an 'exponential growth in the volume and detail of data'. Some of this big data is personal information. Smartphones are deployed in almost all developed and emerging economies, serving as little spies that track and record the location histories, social networks, and app usage of their unsuspecting owners, often with a great degree of precision. 'People are walking data-factories' in this 'mobile digital society'. Data are the byproduct of these global exchanges, of electronic commerce and communication, and of financial transactions. To take Facebook as just one example, its users share some 30 million updates and posts a day, and the platform hosts personal information on 2.23 billion users. To the alarm of the uninformed public, much of this information is available for commercial harvest. The second constant is the rise of intelligent solutions.
Consumers today, whether it is disclosed to them or not, are fed tailored clothes, music, films, and holiday packages, almost anything they might like, along with dynamic pricing that varies according to individual profiles and personalized search results. The availability of powerful computers has enabled comparable applications intended to make systems more responsive to customer profiles and desires, or to capitalize on competitive business opportunities. Such changes will transform the financial industry and occupy a prominent position among the mechanisms of policy competition, reshaping the way financial services are delivered and consumed.
Review Article
Open Access December 27, 2020

Enhancing Regulatory Compliance in Finance through Big Data Analytics and AI Automation

Abstract
This paper shows how Big Data Analytics (BDA) and Artificial Intelligence (AI) automation facilitate regulatory compliance in finance. Regulatory compliance is essential in helping institutions mitigate reputational, litigation, and financial risk. Existing literature reveals several preconditions for compliance. However, much of the literature has adopted an internal view of compliance without considering external regulatory frameworks. This research draws on the cognitive model of regulation, which treats regulatory compliance as a social construct. It uses a triangulation research method comprising a literature review, interviews with trade compliance experts, and a questionnaire survey of compliance practitioners to understand how regulation affects compliance and what role ICTs play in implementing it. The findings present a regulatory compliance framework comprising four cognitive stages and a conceptual regulatory compliance system that shows how BDA and AI automation are applied to mitigate regulatory complexity and enhance regulatory compliance. The conceptual system shows how BDA and AI enable institutions to dynamically assess regulatory risk, automatically monitor compliance, and intelligently predict risk violations, mitigating regulatory complexity and preventing the production of unnecessary documents. It provides theoretical contributions to understanding regulatory evolution and compliance, and practical implications for understanding how regulation grows more complicated and how elements of a regulatory compliance system mitigate proliferating regulations. Additionally, it provides avenues for future research into the relationship between competing regulatory mandates and how institutions cope with them. Regulations are important for ensuring compliance and governance in finance and for curbing systemic risk. Complying with regulations is difficult due to their growing volume, complexity, and fragmentation.
Institutions use large-scale Information and Communication Technologies (ICTs), such as Big Data Analytics (BDA) and Artificial Intelligence (AI) automation, to monitor compliance and mitigate regulatory complexity. However, less is known about how firms comply with regulation. Most literature does not thoroughly investigate regulatory elements nor explicitly relate them to compliance.
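The automated compliance monitoring discussed above can be reduced to its simplest form: a set of machine-checkable rules evaluated against each transaction, with violations rolled up into a risk tier. This is a hedged toy sketch, not the paper's conceptual system; the rule names, thresholds, and country codes are all invented for illustration.

```python
# Hypothetical rule set: each rule flags a transaction attribute that an
# automated compliance system would monitor continuously.
RULES = [
    ("large_cash", lambda t: t["type"] == "cash" and t["amount"] > 10_000),
    ("sanctioned", lambda t: t["country"] in {"XX", "YY"}),
    ("structuring", lambda t: 9_000 <= t["amount"] < 10_000),
]

def assess(transaction):
    # Return the names of all rules the transaction violates; an empty
    # list means the transaction passes automated monitoring.
    return [name for name, check in RULES if check(transaction)]

def risk_level(violations):
    # Simple illustrative mapping from violation count to risk tier.
    return ["low", "medium", "high"][min(len(violations), 2)]

txn = {"type": "cash", "amount": 12_500, "country": "XX"}
flags = assess(txn)
```

Where BDA and AI enter is in learning and updating such rules from data at scale, and in predicting likely violations before they occur, rather than hand-coding static checks.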
Review Article
Open Access December 26, 2021

Scalable Data Warehouse Architecture for Population Health Management and Predictive Analytics

Abstract
Scalable architecture principles for data warehousing are introduced to support population health management and predictive analytics. These principles are validated through the design of an accompanying data pipeline that allows the integration of non-traditional data sources, the use of real-time data for descriptive analytics dashboards, and support for the generation of supervised machine learning models. Several analytical capabilities have been implemented to exemplify the practical application of the principles, including predictive models for risk stratification in healthcare. Cost-effectiveness and performance considerations ensure the practical relevance of the architectural principles and the associated data pipeline. In recent years, the availability of low-cost data storage services and the increasing popularity of streaming technologies have opened new possibilities for storing and processing streaming data on a near-real-time basis. These technologies can help developing countries tackle many relevant issues, such as urban planning, environmental management, and migration policies. A multi-tier approach combining cloud-based storage with data warehousing and data mining technologies offers an attractive architecture for exploiting big data related to populations.
Review Article
Open Access December 26, 2021

Designing Scalable Healthcare Data Pipelines for Multi-Hospital Networks

Abstract
Healthcare is increasingly recognized as a data-intensive industry. Multi-hospital networks, among other organizations, face mounting operational and governance challenges because of rigid data-integration pipelines that support all data sources and destinations in the network. These pipelines have become difficult to modify, causing them to lag behind the changing needs of the clinical operation. Scalable data-pipeline architectures better support clinical decision making, optimize hospital operations, ease data quality and compliance concerns, and contribute to improved patient outcomes. Meeting scalability goals requires breaking up monolithic data-integration pipelines into smaller decoupled components and aligning service-level agreements of pipeline components and source systems. Parallelization and adoption of distributed data-warehouse technology mitigate the burden of ingesting data into a multi-hospital network. However, latency requirements still warrant the construction of separate pipelines for data ingress from clinical devices, electronic health records, and external laboratory-information systems. Healthcare associations recommend near real-time data availability for a growing list of clinical and operational applications. Mishandling the real-time ingestion of data from clinical devices, in particular, compromises availability and performance. Scalable architectural patterns for real-time streaming ingestion from heterogeneous data sources, transport processes, and back-end processing structures are detailed.
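Breaking a monolithic pipeline into decoupled stages usually means connecting them with a buffer so each stage can scale or fail independently. The sketch below is a hedged single-process illustration of that pattern using a bounded queue between an ingest stage and a transform stage; the record fields and stage logic are hypothetical, and a real deployment would use a durable message broker rather than an in-memory queue.

```python
import queue
import threading

SENTINEL = None  # signals end of stream

def ingest(records, out_q):
    # Stage 1: pull raw device/EHR records and hand them off.
    for rec in records:
        out_q.put(rec)
    out_q.put(SENTINEL)

def transform(in_q, results):
    # Stage 2: normalize records independently of the ingest rate,
    # so either stage can be scaled or replaced on its own.
    while True:
        rec = in_q.get()
        if rec is SENTINEL:
            break
        results.append({"patient": rec["patient"], "bpm": round(rec["bpm"])})

raw = [{"patient": "p1", "bpm": 71.6}, {"patient": "p2", "bpm": 88.2}]
q = queue.Queue(maxsize=100)  # bounded buffer applies backpressure
out = []
t1 = threading.Thread(target=ingest, args=(raw, q))
t2 = threading.Thread(target=transform, args=(q, out))
t1.start(); t2.start(); t1.join(); t2.join()
```

The bounded queue is the key design choice: when the transform stage falls behind, `put` blocks and the ingest stage slows down instead of overwhelming downstream systems, which is the availability concern the abstract raises for clinical-device data.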
Keyword:  Data Analytics
