Being Chief Information Officer (CIO) of any organization is a challenge, but being the CIO of the National Heart, Lung, and Blood Institute (NHLBI) at the National Institutes of Health (NIH) layers on some unique factors. We are a federal government agency, with all that implies for a CIO: aligning IT investment with the Institute's strategy, ensuring that public funds are used effectively and efficiently, managing the growing challenges of IT security, and providing the right level of helpdesk support. Yet we are also an organization with many of the characteristics of academia: an environment where intellectual curiosity and the open exchange of ideas and knowledge can seem, at times, to challenge the responsibilities of a federal CIO.
"As the CIO of a biomedical research institute, I am constantly challenged to be a strategic partner with our scientists in order to provide the innovative tools they need"
Over the past few decades, IT has been seen as a useful adjunct to biomedical research: storage for data, computers on which to write and edit papers, and so forth. Today, biomedical research increasingly is IT, and today's discoveries are impossible without extensive IT resources. As with any CIO, I have to understand my organization's business; I am not a scientist, but increasingly I find myself collaborating with scientists to help them find IT solutions that facilitate their research. This collaboration requires me to think much more like a scientist than a traditional CIO, applying the scientific method to experiment with and develop the new IT tools and techniques researchers need. Today, this work is often merged with one of the most common IT themes: the use of sophisticated cloud resources.
One excellent example of this has been NHLBI scientist Dr. Michael Hansen's research into advanced computational methods for Magnetic Resonance Imaging (MRI). Hansen's research team developed a new software tool for MRI called the Gadgetron, which takes raw data from an MRI scanner and quickly reconstructs it into images a clinician can review and use to diagnose and treat disease. MRI is a relatively slow technique and one that is sensitive to motion, which means that imaging the heart is a slow examination and patients have to hold their breath in order to capture an artifact-free image, which is particularly challenging when the patient is a child. Hansen's work now allows those examinations to be conducted in 10-15 minutes rather than the 30-40 minutes that is the norm, and without breath holding. This both reduces stress for the patient and reduces the cost of a cardiac MRI.
Making the Gadgetron work requires a lot of IT resources, including up to 50 high-powered computers to process the data. Yet these resources are needed only while an examination is taking place. We could have built a cluster of computers in our data center for this, but 90 percent of the time they would have sat idle while still consuming power and requiring cooling, which would have been both expensive and wasteful. Hansen saw the potential of using the cloud to provide the resources: the pattern of usage in short, intense bursts fit very well with the elasticity of the cloud and, practically speaking, the costs of the cloud fit his research budget. After some initial experimentation to see if it could work, Hansen came to me asking to use the cloud for the Gadgetron. The CIO in me quickly saw the cost benefits, but also saw that sending sensitive patient data to the cloud might pose significant security and privacy risks. However, my team was able to work with Hansen to understand what actually happens as the Gadgetron runs: there is no personally identifiable information (PII) in the raw data; the data is sent to the cloud computer cluster, processed, and sent back without ever being stored in the cloud; and we could secure the communications channels with strong encryption. We were then able to provide the resources he needed. Initially, that meant the cloud resources themselves and the IT security expertise to ensure that all the appropriate security controls were in place to manage the risk. Later, as the potential for commercialization grew and direct expertise was required, we engaged technical experts from the cloud service providers to work with Hansen and his team to build the reliability and scalability a commercial product would need.
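The burst pattern described above can be sketched in a few lines. This is only an illustration of the idea, not the real Gadgetron pipeline: the worker count, chunking, and "reconstruction" step here are hypothetical stand-ins, and a production system would add the encrypted transport and cloud provisioning the text describes.

```python
# Minimal sketch of the burst-compute pattern: fan raw scan data out to
# short-lived workers, reconstruct in parallel, and return results without
# storing anything remotely. The reconstruct step is a placeholder, not
# actual MRI reconstruction.
from concurrent.futures import ThreadPoolExecutor

def reconstruct_chunk(chunk):
    # Stand-in for the compute-heavy image reconstruction of one data chunk.
    return [sample * 2 for sample in chunk]

def burst_reconstruct(raw_chunks, workers=8):
    # Workers exist only for the duration of the exam and are released when
    # the 'with' block exits -- the analogue of paying for cloud capacity
    # only while scanning, instead of keeping an idle on-premises cluster.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(reconstruct_chunk, raw_chunks))
```

The key property the security review relied on is visible even in this toy version: data flows in, is transformed, and flows straight back out, with nothing persisted on the processing side.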
As a direct result of this collaboration between scientists and IT professionals, the Gadgetron is now delivering the anticipated benefits for patients and hospitals, and at a cost far below what could have been achieved without the cloud. As a side benefit, some of the tools developed for the Gadgetron have proven to be of broader value and are being introduced as additional cloud products.
In our world of biomedical research, big data and big computation have become the norm. Simulating the atomic-level interactions between enzymes and DNA requires massive numbers of CPU cores. NHLBI's Trans-Omics for Precision Medicine (TOPMed) initiative is generating 2.5 petabytes (PB) of data containing 70,000 whole genomes, and it requires computational resources co-located with the data in order to perform analysis. Increasingly, this new science is only possible because of the capacity, power, security, and cost effectiveness of cloud services. But we have also found that those services increasingly have to be coupled with the expertise of the cloud service providers themselves to create the unique services research requires.
As the CIO of a biomedical research institute, I am constantly challenged to be a strategic partner with our scientists in order to provide the innovative tools they need. I also have to allow for the open exchange of ideas and data, and to do so in a way that meets the challenges I face as a federal government CIO: making every dollar count, doing more with less, and doing it more securely. The elastic scalability of the cloud allows me to make the best use of funds by paying only for what we use, while the level of investment cloud service providers make in security controls, monitoring, and response allows them to manage security far more effectively than we could in our own data center, and at a much lower cost to us. For these reasons, cloud services will increasingly be where advances in biomedical research are found.