Open Access December 27, 2019

Revolutionizing Patient Care and Digital Infrastructure: Integrating Cloud Computing and Advanced Data Engineering for Industry Innovation

Abstract
This work details how the integration of cloud computing and advanced data engineering can innovate and reshape patient care and digital infrastructure. In the healthcare sector, cloud services offer the necessary support to build digitally oriented services and service kits. These services can offer high availability, low latency, and on-demand scaling while complying with the strictest data protection laws and regulations. They can also be combined with data engineering techniques to construct an ecosystem that adds an optimized data layer to any cloud environment. This ecosystem includes technologies to acquire, process, and manage healthcare data while respecting all regulatory obligations and institutions, and can be part of a comprehensive digitalization strategy. The objective is to augment the healthcare services that the industry offers by leveraging healthcare data and AI technologies. The designed services, processes, and technologies fall into two categories: industry-agnostic services, and healthcare-specific services that process and manage electronic healthcare records (EHRs). Industry-agnostic services offer a set of tools and methodologies for conducting optimized data experiments, with the goal of exploiting any variety, velocity, volume, and veracity of medical data. Healthcare-specific services offer a set of tools and methodologies to connect to any common EHR vendor in a privacy-preserving manner, so that participating companies can hold, share, and make use of healthcare data in real time. The proposed architecture can be transformative for the healthcare industry, opening up and facilitating experimentation on new and scalable service models. The transition to a more digital health approach would help overcome the limits encountered in traditional settings.
Limitations in the availability of healthcare facilities and healthcare professionals have underpinned the growing share of telemedicine in the care process. However, record-keeping for patients who undergo care outside traditional healthcare facilities is often missing, which can severely affect the continuity of treatment. Identifying new methods to implement disease prevention and early intervention is crucial to avoid more extensive treatment and to support patients on multiple lines of therapy. For chronic patients, a service that monitors their state of health and intervenes when parameters drift outside the desired range is essential. These same patients also depend most heavily on care providers' decisions; a second opinion delivered remotely, which the patient can access on demand at any time, can mitigate this. To address these different kinds of services, an ecosystem built around a dedicated data layer is outlined, able to operate seamlessly in any cloud environment. The envisioned outcome of this work is the rapid evolution and redefinition of the European healthcare landscape.
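The privacy-preserving EHR sharing the abstract describes can be illustrated with a minimal sketch. The field names, the salting scheme, and the `pseudonymize` helper below are illustrative assumptions, not the architecture's actual design:

```python
import hashlib

def pseudonymize(record, salt):
    """Replace the direct patient identifier with a salted hash so the
    record can be shared across organizations without exposing identity."""
    out = dict(record)
    patient_id = out.pop("patient_id")
    out["pseudonym"] = hashlib.sha256((salt + patient_id).encode()).hexdigest()[:16]
    return out

record = {"patient_id": "P-1042", "systolic_bp": 152, "unit": "mmHg"}
shared = pseudonymize(record, salt="per-consortium-secret")
```

In practice the salt would be a managed secret and the identifier mapping would be governed by the applicable data protection regulations; the sketch only shows the shape of the idea.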
Review Article
Open Access December 27, 2019

Data Engineering Frameworks for Optimizing Community Health Surveillance Systems

Abstract
A Changing World Demands Optimized Health Surveillance Systems – and How Data Engineering Can Help. There is growing urgency to manage public health and emergency response practices effectively today, in light of complex and emerging health threats. Fortunately, a host of new tools has emerged, including big and streaming data sources, methods such as machine learning, trust technologies such as blockchain and hardware secure enclaves, and new means of data storage and retrieval. With these innovations, however, comes a grand challenge: how to blend them with, and adapt them to, traditional public health practices. The long-standing infrastructures and protocols that protect and ensure the welfare of communities need to change, or at least be updated, to sustain their impact on the health outcomes and community wellbeing they were designed to fortify. It is in this vein that this essay is written. The essay investigates what the contributions of the growing range of new data engineering frameworks might be, whether developed specifically for health surveillance and wellness or co-opted from devices and services already thriving in the current market and research milieu. Understanding these contributions could help shape their uptake and spread, ensuring their beneficial impact on the communities that stand to gain the most. The essay is divided into several key segments. After this introduction, section two details the research methods. The section that follows reviews the maximum health-outcome potential of these novel frameworks. Part four takes a more critical approach, addressing how the success of these methods may be hindered and outlining future research avenues.
Lastly, the conclusion suggests actions to support the implementation of these frameworks, along with thoughts on further research beyond these inquiries [1].
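As a hedged illustration of the streaming-data surveillance methods the essay surveys, the following sketch implements a simple rolling-mean aberration-detection rule over daily case counts. The window, threshold, and counts are invented for the example and are not drawn from the essay:

```python
from collections import deque

def make_detector(window=7, threshold=2.0):
    """Return a step function that flags a day's case count when it
    exceeds the rolling mean of recent days by `threshold` standard
    deviations -- a simple aberration-detection rule."""
    history = deque(maxlen=window)

    def step(count):
        alert = False
        if len(history) >= 3:  # need a few days of baseline before alerting
            mean = sum(history) / len(history)
            var = sum((x - mean) ** 2 for x in history) / len(history)
            alert = count > mean + threshold * var ** 0.5
        history.append(count)
        return alert

    return step

detect = make_detector()
daily_cases = [4, 5, 4, 5, 5, 21]   # illustrative counts with a spike on day 6
flags = [detect(c) for c in daily_cases]
```

Production surveillance systems use more robust statistics (seasonality adjustment, regression baselines), but the windowed-stream structure is the same.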
Case Report
Open Access November 24, 2022

Bridging Traditional ETL Pipelines with AI Enhanced Data Workflows: Foundations of Intelligent Automation in Data Engineering

Abstract
Machine Learning (ML) and Artificial Intelligence (AI) are having an increasingly transformative impact on all industries and are already used in many mission-critical production use cases, bringing considerable value. Data engineering, which combines ETL pipelines with other workflows managing data and machine learning operations, is also significantly affected. The Intelligent Data Engineering and Automation framework offers the groundwork for intelligent automation processes. However, ML/AI are not the only disruptive forces; new Big Data technologies inspired by Web 2.0 companies are also reshaping the Internet. Companies with the largest Big Data footprints not only provide applications with a Big Data operational model but also derive their competitive advantage from data in the form of AI services, and consequently shift the cost/performance equilibrium of ETL pipelines. Together, these technologies and forces explain why traditional ETL pipeline design should adapt to current and emerging technologies and can be enhanced through artificial intelligence.
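One way to picture an AI-enhanced ETL transform, under the assumption (not stated in the abstract) that the enhancement is learned imputation inside the transform stage, is the sketch below; `MeanImputer` is a stand-in for a real trained model, and the field names are hypothetical:

```python
class MeanImputer:
    """Stand-in for a learned model: predicts a missing amount as the
    running mean of the amounts observed so far."""
    def __init__(self):
        self.total, self.n = 0.0, 0

    def observe(self, value):
        self.total += value
        self.n += 1

    def predict(self, row):
        return self.total / self.n if self.n else 0.0

def transform(row, imputer):
    # Traditional rule-based step: normalize field names.
    out = {k.lower(): v for k, v in row.items()}
    # "AI-enhanced" step: impute a missing value from the model
    # instead of dropping the row or using a static default.
    if out.get("amount") is None:
        out["amount"] = imputer.predict(out)
    else:
        imputer.observe(out["amount"])
    return out

raw = [{"Amount": 10.0}, {"Amount": 30.0}, {"Amount": None}]
imputer = MeanImputer()
loaded = [transform(row, imputer) for row in raw]  # extract -> transform -> load
```

The design point is that the pipeline's structure stays a classic extract/transform/load chain; only one step swaps a fixed rule for a model.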
Article
Open Access December 27, 2020

Improving Data Quality and Lineage in Regulated Financial Data Platforms

Abstract
Data quality and data lineage are critical concerns for organisations mandated to comply with stringent regulatory regimes. This paper analyses the latest developments in the governance of data quality and data lineage within a regulated financial services organisation. It sets out the underlying regulatory context, describes the concepts employed in the business environment, summarises how data quality is captured and monitored, examines the artefacts that record data lineage, reviews the roles and responsibilities of the staff who implement the necessary processes, and maps areas where improvements are possible. The internal organisation and processes of regulated data platforms are shaped not only by the capabilities prescribed by their technical architecture but also by the regulatory regimes under which they operate. These mandates require, in particular, rigorous examination of four aspects of data quality (accuracy, completeness, consistency, and timeliness) and detailed documentation of how data arrives in its final form in the repository. Although data monitoring, alerting, assessment, and remediation are well established, provenance capture remains an area ripe for further investment.
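The four mandated quality dimensions can be sketched as record-level checks. The schema, field names, and thresholds below are hypothetical, and "accuracy" is approximated by a range check, since true accuracy requires comparison with an authoritative source:

```python
from datetime import datetime, timedelta, timezone

def quality_report(record, schema, max_age):
    """Check one record against the four mandated quality dimensions."""
    now = datetime.now(timezone.utc)
    low, high = schema["amount_range"]
    return {
        "completeness": all(record.get(f) is not None for f in schema["required"]),
        "accuracy": low <= record["amount"] <= high,       # range-check proxy
        "consistency": record["currency"] in schema["currencies"],
        "timeliness": now - record["as_of"] <= max_age,
    }

schema = {
    "required": ["trade_id", "amount", "currency", "as_of"],
    "amount_range": (0.0, 1e9),
    "currencies": {"EUR", "USD", "GBP"},
}
record = {
    "trade_id": "T-7",
    "amount": 2500.0,
    "currency": "USD",
    "as_of": datetime.now(timezone.utc) - timedelta(hours=1),
}
report = quality_report(record, schema, max_age=timedelta(days=1))
# A minimal lineage entry records which step produced which verdict.
lineage = [{"step": "quality_check", "record": record["trade_id"], "result": report}]
```

Real platforms attach such reports to every pipeline stage so the lineage artefacts document how each record arrived in its final form.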