
2004 World Technology Awards Winners & Finalists

Rick Stevens

Please describe the work that you are doing that you consider to be the most innovative and of the greatest likely long-term significance.

For the past decade I’ve been working on the problem of bringing advanced computing, collaboration and visualization tools to communities of computational scientists. This work started in 1994 with the I-WAY project I led, which laid the foundation for current Grid computing research. Grid computing is the term applied to the concept of constructing a distributed collection of computing, data and instrumentation resources and integrating them via common software into a collective whole that provides an integrated capability to end users. Grid computing draws its name from an analogy to the power grid: the aim is ubiquitous access to computing, data and visualization resources in a location-independent fashion.

In some cases Grids are being constructed directly by end-user communities (e.g. high-energy physics) to serve their needs for large-scale distributed data analysis. In other cases Grids are being developed and deployed by resource providers, such as the National Science Foundation’s TeraGrid project I direct, whose aim is to provide new ways to access resources for a broad (not application-specific) scientific community. Grids are also being developed by industry to enable on-demand, or just-in-time, computing that could be sold to businesses as a service. Many types of Grids are possible. Some are focused on the needs of distributed projects to share and analyze large quantities of data in a seamless fashion; these are often referred to as “Data Grids” because their focus is on data sharing, data replication, data curation, data movement, and data analysis. Other Grids are more focused on creating a large-scale computing resource by tying together many smaller systems into a large computing farm.
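The core idea can be illustrated with a minimal sketch: a user hands work to an abstract pool of resources and the Grid middleware, not the user, decides where it runs, preferring a site that already holds the needed data. The names here (GridBroker, Resource, Job, the site names) are hypothetical illustrations of the concept, not the TeraGrid or Globus software interfaces.

```python
# Illustrative sketch only: location-independent job placement over a
# pool of distributed resources, the basic Grid computing idea.
from dataclasses import dataclass, field

@dataclass
class Resource:
    name: str
    free_cpus: int
    datasets: set = field(default_factory=set)

@dataclass
class Job:
    cpus_needed: int
    dataset: str

class GridBroker:
    """Chooses a resource for each job, hiding location from the user."""
    def __init__(self, resources):
        self.resources = resources

    def submit(self, job):
        # Prefer a site that already holds the input data (a "Data Grid"
        # concern); otherwise any site with enough free CPUs will do.
        candidates = [r for r in self.resources if r.free_cpus >= job.cpus_needed]
        candidates.sort(key=lambda r: job.dataset not in r.datasets)
        if not candidates:
            raise RuntimeError("no resource can run this job")
        chosen = candidates[0]
        chosen.free_cpus -= job.cpus_needed
        return chosen.name

if __name__ == "__main__":
    grid = GridBroker([
        Resource("site-A", free_cpus=128, datasets={"collider-run-7"}),
        Resource("site-B", free_cpus=512),
    ])
    print(grid.submit(Job(cpus_needed=64, dataset="collider-run-7")))  # -> site-A
```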

Not all Grids are focused on traditional tasks associated with computing. Some are being developed to support wide-area collaboration and real-time interactions. For example, the Access Grid project, which I started in 1998 (see www.accessgrid.org) and which is in use at more than 500 sites worldwide, has developed a software system for creating shared virtual workspaces in which the team members of distributed scientific projects can work together over high-bandwidth networks, share applications, and see and hear each other via multipoint audio and video. More importantly, the Access Grid enables groups to build up over time a shared context of data and applications that is persistently maintained on servers. The Access Grid has elements of both a Data Grid and a real-time peer-to-peer network. It is also being used to develop high-capability shared visualization applications that enable teams of scientists to visually analyze data streamed in real time from high-end visualization servers.
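The distinguishing feature is the persistent shared context: participants come and go, but the workspace’s data and applications remain on a server between sessions. The following sketch is a simplified, hypothetical data model for that idea (the Venue class and its store are illustrative assumptions, not the actual Access Grid software).

```python
# Illustrative sketch: a "venue" with a persistent shared context that
# outlives any individual session, while participation is transient.
import json
import time

class Venue:
    def __init__(self, name, store_path):
        self.name = name
        self.store_path = store_path          # persistent server-side store
        self.participants = set()             # transient: who is connected now
        self.shared_context = {"data": [], "applications": []}

    def join(self, person):
        self.participants.add(person)

    def leave(self, person):
        self.participants.discard(person)

    def add_data(self, item):
        # The shared context accumulates over time and survives sessions.
        self.shared_context["data"].append({"item": item, "added": time.time()})
        self.persist()

    def persist(self):
        with open(self.store_path, "w") as f:
            json.dump(self.shared_context, f)

venue = Venue("climate-team", "/tmp/climate-team.json")
venue.join("alice")
venue.join("bob")
venue.add_data("run-42-results.nc")
venue.leave("bob")  # the shared context remains for the next session
```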

Grids are tools for empowering scientists. They have the potential to further democratize science by giving each researcher, no matter their location, the power of entire institutions. This power can be used to access data and the tools needed to understand and transform data into knowledge and to form and test predictions. It can also be used to gain access to remote instruments that are more capable or more available than those available locally. Grids enable distributed teams of researchers to form virtual organizations that can act, interact and think like one integrated laboratory. Grids can also be applied to the problem of community annotation, curation and sharing of biological data; through peer-to-peer systems this can be done in a way that closely matches historical patterns of direct investigator-to-investigator sample sharing and collaboration. The Open Life Science Grid project, which I started in 2002, is one effort prototyping this type of peer-to-peer system.
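A rough sketch of that peer-to-peer pattern: each lab runs a peer that holds its own curated annotations and answers queries from collaborators it explicitly trusts, with no central database in between, mirroring direct investigator-to-investigator exchange. The LabPeer class and the example annotation below are hypothetical illustrations, not the Open Life Science Grid implementation.

```python
# Illustrative sketch only: peer-to-peer sharing of biological annotations
# between labs that trust each other directly.
class LabPeer:
    def __init__(self, lab_name):
        self.lab_name = lab_name
        self.annotations = {}      # gene id -> annotation text
        self.trusted_peers = []    # direct collaborators, as in sample sharing

    def annotate(self, gene_id, text):
        self.annotations[gene_id] = text

    def trust(self, peer):
        self.trusted_peers.append(peer)

    def query(self, gene_id):
        # Look locally first, then ask trusted collaborators directly.
        if gene_id in self.annotations:
            return self.lab_name, self.annotations[gene_id]
        for peer in self.trusted_peers:
            if gene_id in peer.annotations:
                return peer.lab_name, peer.annotations[gene_id]
        return None

argonne = LabPeer("Argonne")
chicago = LabPeer("UChicago")
chicago.annotate("gene-0002", "putative aspartate kinase")  # hypothetical record
argonne.trust(chicago)
print(argonne.query("gene-0002"))   # -> ('UChicago', 'putative aspartate kinase')
```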

Brief Biography

Professor Rick Stevens is director of the Mathematics and Computer Science Division at Argonne National Laboratory and director of the Computation Institute at the University of Chicago. He is internationally recognized for his work in high-performance computing, collaborative and visualization technologies, and computational science, including computational biology. He has a broad set of research interests best characterized by the idea that advanced computing and communications technology is a primary enabling tool for accelerating scientific research. His research has focused on a range of strategies for increasing the impact of computation on science, from architectures and applications for petaflops systems to Grid computing to advanced visualization and collaboration technology for improving scientific productivity of distributed teams. He is currently director of the NSF TeraGrid project and formerly chief architect of the National Computational Science Alliance. He has a long-standing interest in applying computing to problems in the life sciences and has been systematically focusing his energies in this direction during the past decade. He is professor of computer science at the University of Chicago, where he teaches and supervises graduate students in the areas of systems biology, collaboration and visualization technology, and computer architecture. He co-founded and directs the University of Chicago and Argonne Computation Institute. The Computation Institute was created to provide an intellectual home for large-scale interdisciplinary projects involving computation at the two institutions.