High performance computing in multiscale cardiac modeling: Bridging proteins to cells to whole heart
The talk will cover current methods to simulate cardiac tissue at multiple spatial and temporal scales. The first part of the talk describes a mechanistic model for understanding the fundamental properties of the contractile proteins in cardiac cells. The model is computationally expensive and requires a supercomputer to simulate sub-cellular structures. While useful for answering biophysical questions, such detailed models are impractical for modeling the billions of cells that comprise a whole human heart. The second part of the talk covers efforts to bridge from cells to large organ-level anatomical structures with practical run times. Specifically, a method is developed to distribute the computational workload over 8196 computational nodes of an IBM Blue Gene system. Using optimal recursive bisection, the anatomical data of the Visible Man ventricles is distributed across the computational nodes. The segments are created with an overlap region so that spatial derivatives can be computed at segment boundaries. The anatomical data comprises up to 1.44 billion elements with a size of 0.1 mm, a spatial increment that approximates real cell dimensions. This decomposition method represents an increase of roughly two orders of magnitude over existing published methods, which are limited to hundreds of computational cores. Small numbers of computational cores limit spatial resolution or produce impractical runtimes. Hence, the results show that appropriate decomposition across large numbers of cores can make whole-heart simulations practical, with high spatial resolution and reasonable runtimes. The goal of this work is to foster wider use of cardiac models for research and clinical applications.
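The two decomposition ideas in the abstract, recursive bisection of the anatomical data and overlap regions for boundary derivatives, can be sketched as follows. This is an illustrative coordinate-based bisection under assumed simplifications, not the authors' actual implementation; the function names, the balance-by-cell-count criterion, and the single-cell halo width are all assumptions for the sketch.

```python
import numpy as np

def recursive_bisection(cells, n_parts):
    """Recursively split an array of (x, y, z) cell coordinates along the
    longest spatial extent until n_parts workload-balanced segments remain.
    Sketch only: balances by cell count, splitting along the longest axis."""
    if n_parts == 1:
        return [cells]
    # Choose the axis with the largest spatial extent to cut.
    axis = np.argmax(cells.max(axis=0) - cells.min(axis=0))
    order = np.argsort(cells[:, axis])
    left_parts = n_parts // 2
    # Place a proportional share of the cells on each side of the cut.
    split = len(cells) * left_parts // n_parts
    left, right = cells[order[:split]], cells[order[split:]]
    return (recursive_bisection(left, left_parts)
            + recursive_bisection(right, n_parts - left_parts))

def with_overlap(segment, all_cells, halo=1):
    """Extend a segment's bounding box by `halo` grid cells in every
    direction, so finite-difference stencils at the segment boundary
    have the neighboring values they need."""
    lo = segment.min(axis=0) - halo
    hi = segment.max(axis=0) + halo
    mask = np.all((all_cells >= lo) & (all_cells <= hi), axis=1)
    return all_cells[mask]
```

For example, a 4x4x4 block of cells partitioned into 8 segments yields 8 balanced pieces of 8 cells each, and each segment's overlap region is a superset of the segment itself.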