CIOReview | June 2016

Leveraging Biomedical Big Data: Van Andel Institute's Hybrid Solution

By Bryon Campbell, Ph.D., CIO, Van Andel Institute

Big data can be a lifesaver...literally. The efficient handling and analysis of large, complex datasets in biomedical research plays an integral role in developing new ways to prevent, diagnose and treat diseases. Scientists engage in a wide range of data-intensive research projects, using high-resolution imaging, genomic sequencing instruments and molecular modeling simulations to detail processes such as gene expression and protein behavior. Because the research conducted in just one laboratory can produce billions of data points, and because research techniques evolve at a rapid pace, it is increasingly important for research facilities to architect solutions that can scale as requirements change.

At Van Andel Institute (VAI), a nonprofit biomedical research and science education organization in Grand Rapids, Michigan, we have managed these big data challenges by embracing cloud computing and implementing a hybrid OpenStack high-performance computing (HPC) system. This new infrastructure significantly improves our IT flexibility while providing users with cutting-edge computational resources. The solution saved us roughly two years of development time.

Anticipating Technological Change

The Institute is home to 28 principal investigators and their laboratories, which study epigenetics, cancer and neurodegenerative diseases such as Parkinson's, and are dedicated to translating those findings into effective therapies.

As VAI has increased collaboration with other research institutions and large-scale bioinformatics projects around the world, scientific investigations have become much more complex. In recent years, this has led to the formation of research groups that require the ability to work on terabyte- and petabyte-scale data projects.
In addition to having the storage and CPU capability to process and analyze scientific big data, we also wanted a creative approach to future-proofing against our inevitable need for more computational resources. In 2014, we realized that the science at the Institute was driving the need for exponentially higher-volume, higher-speed computational resources. We knew that cloud-based, high-performance computing would soon be the new standard. And although the big players in cloud computing had lowered their prices in recent years, we needed an in-house computing solution that gave our scientists direct access to higher speeds.

The continual on-boarding of big data-dependent scientists with very diverse system requirements and aggressive timelines meant that we had to explore alternative ways to deliver computing resources to our users. VAI's relatively small size, and the fact that there would be no legacy equipment to work around, made us agile enough to consider a