Virtual Humans

The Virtual Physiological Human (VPH) project aimed to enable collaborative investigation of the human body across all the relevant scales, from the level of molecules through cells and organs to the whole body. It introduced multiscale methodologies (that is, working across different scales within the body) into medical and clinical research – for example, in predictive disease modelling or in optimizing cancer treatment for an individual.

The VPH community is developing a framework, through projects such as the personalized medicine “p-medicine” project and VPH-Share, that will provide the organisational fabric for this. It will be realised as a series of services offered within an integrated framework that will enable researchers and clinicians to share and manage data, information and tools. This will facilitate collaboration between the members of the VPH community and also enable researchers to create and run new VPH workflows.

The eventual outcome will be better disease diagnosis and treatment, along with improved prevention tools in healthcare.

The Scientific Challenge

The virtual humans are based on data collected from real patients, including biological, imaging, clinical and genomic data. As the data is unique to each patient, it will enable academic, clinical and industrial researchers to improve their understanding of human physiology and pathology, to derive predictive hypotheses and simulations, and to develop and test new therapies. One of the major challenges faced by the VPH initiative is handling patient data and the data generated by simulating patient reactions to particular treatments. Modelling, storing, sharing and processing large volumes of data, and visualizing the results, will play a central role in achieving VPH objectives, which opens up clear areas of collaboration with the EUDAT initiative.

Who benefits and how?

The VPH Research Community will be able to build on the generic data services provided by EUDAT to create rich, community-specific analysis platforms. The fact that many EUDAT partners are also large HPC centres participating in PRACE will make it easy for VPH researchers to co-locate their data with high performance computing resources.

Technical Implementation

VPH is using EUDAT services to store data safely and to move it between storage and the supercomputing resources where complex calculations are performed. More specifically, we are using B2SAFE to ingest VPH data sets (that is, to read the data and save it) into EUDAT resources for preservation, and to replicate (that is, make copies of) our data across multiple EUDAT nodes, which both makes the data easier to access and protects against the loss or corruption of any single copy. B2STAGE is being used to stage (that is, move) data from EUDAT storage resources to PRACE work areas (high performance computing systems) and vice versa, and to help run scientific applications on those PRACE resources with our VPH data.
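The ingest, replicate and stage steps above can be sketched at the command line. This is a minimal sketch only: it assumes an iRODS-based B2SAFE deployment (B2SAFE is built on iRODS, so the standard iRODS icommands illustrate the pattern), and the zone, collection, file and resource names are hypothetical. In a real deployment, replication is typically driven by B2SAFE policy rules rather than issued by hand, and B2STAGE mediates the transfers to and from PRACE systems.

```shell
# Ingest a VPH data set into an EUDAT collection for preservation.
# -K verifies the checksum after transfer (zone/path are hypothetical).
iput -K patient_scan.nii /eudatZone/vph/patient_scan.nii

# Replicate the data object to a second storage resource, so that a
# lost or damaged copy can be recovered (resource name is hypothetical).
irepl -R backupResc /eudatZone/vph/patient_scan.nii

# Stage the data from EUDAT storage to a PRACE work area before a
# compute run (B2STAGE exposes such transfers over GridFTP;
# globus-url-copy is one common client; hosts are hypothetical).
globus-url-copy gsiftp://eudat-node.example.org/vph/patient_scan.nii \
                gsiftp://prace-hpc.example.org/scratch/patient_scan.nii
```

These commands require a live iRODS/GridFTP environment and valid credentials, so they are illustrative of the workflow rather than directly runnable.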

VPH plans to continue these data storage activities with the “p-medicine” project using B2STAGE, and also intends to promote EUDAT to new VPH users through the VPH Institute. We will also explore using B2FIND to make the VPH data searchable. In addition, we are aiming to integrate EUDAT’s services with future VPH-related projects, such as the CompBioMed proposal (submitted to a recent H2020 Centres of Excellence for Computing Applications call) and a VPH-related proposal submitted to the combined PRACE/EUDAT call.

Contacts

  • Peter Coveney, Centre for Computational Science (CCS), University College London (UCL), p.v.coveney(at)ucl.ac.uk

Further Information

To learn more about VPH and its collaboration with EUDAT, read the interview with Stefan Zasada, Senior Research Associate at University College London (UCL), on how EUDAT glues the Virtual Physiological Human framework together.
