How will medicine leap to the next stage? Many are anticipating an era of “personalized” medicine in which our specific medical conditions can be treated individually. The answers are going to depend not only on advances made in the laboratory, but also on new ways of modeling and simulating medical conditions on large-scale, interconnected computers.
If doctors could watch how blood circulates through our arteries in a kind of 3-D real-time movie, they could see exactly where high blood pressure affects individual arterial systems. They could tailor medication to reduce the pressure exactly. They could also see potential blockages in important arteries and design ways to remove or reduce them without major surgery.
Climbing Your Arterial Tree
Calculations of blood flow in the human arterial “tree” are among the scientific projects that require the enormous resources of the NSF TeraGrid.
Exact blood flow determination is the challenge taken up by Dr. George Em Karniadakis, who is not an M.D. but a professor of computational fluid mechanics at Brown University. Computational is the important word here. Karniadakis and his team are well on their way to building an accurate three-dimensional computer model of arterial blood flow that will ultimately aid doctors who want to save us from arteriosclerosis, coronary artery disease and other circulatory diseases.
Their computer model tackles the human arterial “tree” as a series of bifurcating (splitting from one branch into two) pipes extending from the heart. As the computer “heart” pumps, the simulation calculates the speed of the viscous flow as it branches and re-branches.
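The bookkeeping behind that branching structure can be pictured as a simple binary tree in which flow is conserved at every split. The sketch below is a toy illustration of that idea only; the tree depth, the split fraction, and the function name are invented for illustration, and the real simulation solves the full three-dimensional equations of viscous flow:

```python
# Toy model: conservation of flow in a bifurcating arterial "tree."
# Each branch divides its incoming flow between two daughter branches.
# (Illustrative only -- the actual simulation solves 3-D viscous flow.)

def split_flow(flow, depth, fraction=0.55, max_depth=3):
    """Recursively divide flow at each bifurcation.

    `fraction` is an invented share sent down the larger daughter
    branch; the remainder goes down the smaller one.
    Returns the list of flows reaching the terminal branches.
    """
    if depth == max_depth:
        return [flow]
    left = split_flow(flow * fraction, depth + 1, fraction, max_depth)
    right = split_flow(flow * (1 - fraction), depth + 1, fraction, max_depth)
    return left + right

# Roughly 5 liters/minute of cardiac output entering the aorta:
terminal_flows = split_flow(5.0, depth=0)

# Flow is conserved: the terminal flows sum back to the input.
print(round(sum(terminal_flows), 6))   # 5.0
print(len(terminal_flows))             # 8 terminal branches after 3 splits
```

However many times the tree branches, whatever enters the root must come out of the leaves; it is at the branch points themselves, where the flow divides, that the diseases Karniadakis describes tend to begin.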
“Many arterial diseases begin in the places where arteries branch,” Karniadakis says, “where there is backflow or erosion of the arterial walls.”
How does his computer model work, and where does it get its power?
Modeling as a Way of Doing Science
Computer modeling, simulation and data analysis long ago joined laboratory experimentation and fieldwork in shaping and directing the course of the sciences. The Karniadakis computer model, for example, has to solve thousands of simultaneous equations to capture blood flow dynamics accurately. The first advance for Karniadakis and his group was to make the model run on a very large “supercomputer,” composed of many processors like the one in your laptop or desktop machine.
The difference between an ordinary computer like a laptop and today’s largest supercomputers is very much like the difference between a horse-drawn carriage and a jet plane, only more so. With thousands of processors, supercomputers can make trillions of calculations per second “in parallel” and then sum up the answers. A supercomputer today could perform, in a fraction of a second, all the arithmetic Galileo did in his lifetime.
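That "calculate in parallel, then sum up the answers" pattern can be seen in miniature on an ordinary multi-core machine. This sketch uses only Python's standard library to split one large sum across worker processes; the worker count and the size of the sum are arbitrary choices for illustration:

```python
# Minimal parallel computation: divide the work among processes, then
# combine the partial answers -- the same pattern a supercomputer
# applies across thousands of processors at once.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """Sum the integers in [start, stop) -- one worker's share."""
    start, stop = bounds
    return sum(range(start, stop))

def parallel_sum(n, workers=4):
    """Sum 0..n-1 by dividing the range among `workers` processes."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    # Matches the closed form n*(n-1)/2 for n = 1,000,000.
    print(parallel_sum(1_000_000))   # 499999500000
```

Each worker computes its share independently, and the answers are combined only at the end; the speedup comes from the shares running at the same time rather than one after another.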
Karniadakis needs more power than a single powerful supercomputer can deliver.
“I need to use multiple supercomputers in parallel,” he says. Only such massive, interconnected systems can support real-time, three-dimensional modeling.
Fortunately for Karniadakis and his team, and for thousands of other scientists and researchers, scientific computation is entering a new era in which the supercomputers themselves are interconnected through projects like the “TeraGrid,” a visionary effort funded by the National Science Foundation (NSF) in which The University of Texas at Austin plays a major role. The TeraGrid is a giant enterprise that will change the way research affecting society is done for decades to come.
Why a TeraGrid?
What does it mean?
“Tera” is a prefix, derived from Greek, meaning “trillion,” and it refers to the capacities of today’s largest supercomputers, which can do trillions of mathematical operations every second.
“Grid” is a concept derived from an analogy with power grids—the great, seamless electrical generation and transmission systems of every modern country.
In the context of supercomputing, a grid is an entire nexus of interconnected systems: terascale supercomputers, of course, but also computer-driven instruments (giant telescopes, particle accelerators, powerful electron microscopes) and networks of sensors that gather data from the oceans, air and earth.
Think of supercomputers as great looms on which all of the sciences taught in separate university departments for so many years are interwoven through the methodologies of modeling and simulation. Chemistry, physics, computer science and geosciences are woven together on the computational looms, and the TeraGrid is now enabling the looms themselves to communicate with each other.
The TeraGrid has grown into a national-scale initiative to deploy and operate an interconnected system of leadership-class computers that scientists and engineers can use to solve some of their most challenging problems. NSF has just made a set of awards to eight advanced computing centers across the nation to support and enhance the TeraGrid over the next five years. One of the centers is the Texas Advanced Computing Center (TACC) at The University of Texas at Austin. All of the centers are tightly linked over high-bandwidth networks to carry out the TeraGrid project.
The TeraGrid integrates powerful supercomputers and other equipment to provide high-capability computational services to the scientific community. The system is the nation's largest and most comprehensive distributed computational infrastructure for open scientific research.
The TeraGrid is likely to enhance research in almost every scientific discipline that requires intensive computing capabilities, from disease diagnosis to weather forecasting to discovery of the origins of the cosmos to aircraft design simulation.
“TeraGrid is a foundational building block to building a national cyberinfrastructure,” says TeraGrid director Charlie Catlett of the University of Chicago/Argonne National Laboratory. “TACC and the other participants are collaborating to enable resources to work together, introducing successively more advanced capabilities, while providing a stable system for scientists to use today.”
Says Catlett, “TACC is one of eight partners who are providing a host of resources and services, integrated with Grid technologies and common policies. Researchers all over the country have already used more than 29 million processor-hours on the TeraGrid’s linked supercomputers. Chemists, physicists and molecular and cell biologists account for about half the usage, with another two dozen specialties accounting for the remaining half. But they are not the only beneficiaries: the TeraGrid is building ways to reach out to educators and students at all levels, and even to the general public, so everyone can explore high-end knowledge generation.”
A Paradigm Shift from Traditional High-performance Computing
The TeraGrid resource providers are linked together by the world’s fastest optical fiber network.
Many new users will join the TeraGrid through a growing set of TeraGrid science gateways that use Web services and grid technologies to provide access to TeraGrid resources through familiar Web portal and even desktop application interfaces. These gateways will enable large numbers of researchers and educators with common types of scientific problems to use the TeraGrid in ways tailored to the unique requirements of their communities.
“This is one of the most exciting things we are doing, and I believe this kind of work will vastly increase the number of people able to take advantage of these high performance computing resources,” Catlett says. “The NSF grant is about enabling science by advancing and evolving the TeraGrid system, which at the moment has about 1,000 users. We’ll grow that number to what we hope will be in the 7,000 to 10,000 range.”
“The TeraGrid Science Gateways and portals are designed for entire communities. They may conduct research and even teach in a variety of disciplines, including atmospheric sciences, astronomy, life sciences, high-energy physics and nanotechnology,” Catlett says. “Some gateways are specialized for projects centered on major scientific instruments, like Oak Ridge’s Spallation Neutron Source. Two gateways are optimized for environmental modelers and emergency response personnel.”
Role of the University in the TeraGrid
“Our participation in the TeraGrid will allow researchers from Texas and across the nation to make new theoretical and experimental advances with direct benefit to society,” says Juan Sanchez, vice president for research at The University of Texas at Austin. “The TeraGrid makes collaboration across disciplines and institutions much easier, and it helps to make sure that, whatever the problem is, the best minds are working on it.”
“TACC is proud to represent The University of Texas at Austin in working with some of the nation’s leading institutions to develop, operate and evolve the TeraGrid, the most powerful cyberinfrastructure project in the world today,” says Dr. Jay Boisseau, TACC director and the principal investigator for the university on the project.
“TACC is also coordinating the development of TeraGrid’s User Portal, which will serve as an essential access point to TeraGrid users and is pioneering some of the technologies that will be used in science gateways.”
Also supplying resources to the TeraGrid are four other university entities: the Bureau of Economic Geology, the Center for Research in Water Resources, the Center for Space Research and the High Resolution X-ray Computed Tomography Facility. They are contributing large-scale databases of important geosciences data, along with a database of computed tomography (CT) images of biological and paleontological specimens.
Science on the TeraGrid
Using city maps and data acquired during Tropical Storm Allison, which flooded Houston in June 2001, scientists calculated and visualized the flooding from the storm surge. The TeraGrid enables predictions of storm surge heights and flash flood dangers in real time, through the Flood Modeling Science Gateway.
TeraGrid enables researchers to analyze terabytes—trillions of bytes—of data collected by scientific instruments, telescopes, satellites and remote sensors. They can manipulate and visualize enormous data sets in novel ways, gaining new insights into research questions in basic science that are of fundamental importance to society.
Emergency Response Systems
The Flood Modeling Science Gateway centered at TACC, for example, allows scientists to predict and respond to the multiple effects of severe weather events.
The principal investigators are Dr. Gordon L. Wells of the university’s Center for Space Research and Dr. David Maidment of the Center for Research in Water Resources.
“We’re working with collaborators in emergency management throughout Texas and with other researchers located at Purdue and at Oak Ridge National Laboratory to develop real-time flash-flood prediction and response capabilities,” Wells says.
The data streams that must be combined to obtain this capacity include land elevation and watershed descriptions, weather data from satellites and radar, and data on daily population movements and transportation routes.
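Combining those streams amounts to joining them on a common key, such as the watershed, and flagging the places where the ingredients of a flash flood coincide. The sketch below is a toy data fusion only; the basin names, the numbers, and the thresholds are all invented for illustration:

```python
# Toy data fusion: flag watersheds where intense rainfall meets
# low-lying terrain. All names, values, and thresholds below are
# invented purely for illustration.
elevation_m = {"Buffalo Bayou": 12, "Brays Bayou": 15, "Cypress Creek": 42}
rainfall_mm_per_hr = {"Buffalo Bayou": 75, "Brays Bayou": 30, "Cypress Creek": 80}

def flash_flood_risk(elevation, rainfall, rain_threshold=50, low_ground_m=20):
    """Return the watersheds where heavy rain is falling on low ground."""
    return sorted(
        basin
        for basin in elevation.keys() & rainfall.keys()
        if rainfall[basin] >= rain_threshold and elevation[basin] <= low_ground_m
    )

print(flash_flood_risk(elevation_m, rainfall_mm_per_hr))   # ['Buffalo Bayou']
```

The real gateway must do this at far larger scale, and in real time, across elevation models, radar feeds, population data and road networks at once, which is why it runs on the TeraGrid rather than on a desktop.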
“We need to know when and how to get people out of the way of something like a Hurricane Katrina or Rita or a slow-moving storm like Ophelia, and we need to predict the exact effects as fast as possible,” Wells says.
Blood Flow Model
The extraordinarily detailed model of blood flow in the human arterial system is so large that it depends on resources at four TeraGrid sites simultaneously: TACC, the Pittsburgh Supercomputing Center, the National Center for Supercomputing Applications in Illinois and the San Diego Supercomputer Center.
Using software technology from the Globus Alliance, with specialized tools developed at Northern Illinois University to manage the communication between computers, Karniadakis is able to spread his application across multiple computers. The total capacity of all the computers used is nearly 35 “teraflops”—trillions of calculations per second.
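To put 35 teraflops in perspective, a quick back-of-envelope calculation shows how fast an enormous workload melts away at that rate. The workload figure of 10^18 operations is an invented example, not a number from the article:

```python
# Back-of-envelope: how long would 10^18 floating-point operations
# (a hypothetical workload) take at a combined ~35 teraflops?
TERA = 10**12
rate_flops = 35 * TERA          # ~35 trillion operations per second
workload = 10**18               # invented operation count for illustration

seconds = workload / rate_flops
print(f"{seconds:.0f} seconds (~{seconds / 3600:.1f} hours)")
# 28571 seconds (~7.9 hours)
```

A quintillion operations, a workload no laptop of the era could hope to finish, completes in an afternoon, which is what makes a real-time, whole-body circulation model plausible at all.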
“We take advantage of all of that,” says Karniadakis, “and the fast network connections make it possible to synchronize and re-synchronize the calculations as necessary. Now we can investigate what happens when arterial flow is blocked at any point, as may happen in various disease processes, and we can design strategies to prevent or overcome the effects.”
TeraGrid: A Vision and a Promise
“The TeraGrid is defining a new vision of the potential offered by integrating computing resources, visualization systems, data collections, and instruments,” Boisseau says. “Scientists will tackle new challenges on the TeraGrid by developing applications that would have been impossible before now. TeraGrid will serve as an exemplar of what is possible, and The University of Texas at Austin and TACC are playing a leading role in this transformation.”