Data and Collaboratories in the Biomedical Research Community
September 16-18, 2002
Arlington, Virginia

Supported by:
National Center for Research Resources
Bethesda, Maryland

Hosted by:
National Biomedical Computation Resource
University of California, San Diego

Location:
Alliance Center for Collaboration, Education, Science, and Software
Ballston Metro Center Office Tower
901 North Stuart Street, Suite 800
Arlington, VA 22203

Preface

This meeting was supported through a supplement (NIH award RR08605) from the National Center for Research Resources (NCRR)—part of the National Institutes of Health (NIH) within the Department of Health and Human Services (DHHS)—to the National Biomedical Computation Resource at the University of California, San Diego (UCSD). UCSD gratefully acknowledges the contributions of the panelists as well as agency staff from NIH, the National Science Foundation (NSF), and the Department of Energy (DOE). In addition, we extend our thanks to the staff and management of the Alliance Center for Collaboration, Education, Science, and Software (ACCESS) for allowing us to host this meeting in their facility.

Executive Summary

Today is a time of great opportunity as biomedical research continues to harness trends in information technology, the grid, and emerging cyberinfrastructure. These trends empower teamwork and collaborative approaches that support large-scale and information-rich biomedical investigations, resulting in new insights into basic biological processes associated with health. In particular, trends in the biomedical research community include collaborative, data-driven scientific experiments, from data capture through refinement to dissemination, and data discovery via integration across spatial and temporal scales. Recognizing the importance of these trends, NCRR convened a workshop of experts to 1) review the state of prior multiagency investments in collaboratories; and 2) examine future directions of collaboratories and related technologies with a particular focus on the data-rich environment of biomedical research.

Workshop participants concluded that the time was right for a new round of investments in collaboratories focused on data and communities of scientists. One important and recurring opportunity of such collaboratories was the ability to produce cross-scale discoveries about disease processes by integrating data from distributed and diverse resources. Furthermore, such a data focus would: enable cross-disciplinary work; allow scientific activities to scale to a natural level (unlimited by physical constraints); promote greater data integration and access; build stronger research communities; and broaden the engagement in science. Collectively, these advances will drive the necessarily cross-scale, digitally enabled genomic and translational medicine of the future.

The workshop participants strongly encourage NCRR to build on its prior experience, and that of the broader collaboratory community, in order to sustain leadership in developing and maintaining shared infrastructure and technologies for enhanced collaborations in the biomedical research community. New collaboratory projects will face issues and need to overcome challenges involving technology, culture, and evaluation in order to reap the envisioned outcomes.

The workshop participants endorsed several programmatic themes around the following concepts. First, NIH must continue to develop and support novel collaborative projects in communities of biomedical research. Second, NIH must work with other agencies to create the fundamental tools and cyberinfrastructure that will enable collaboratories focused on data and instruments, moving beyond many early collaboratory efforts that mainly emphasized access to remote instruments. Third, data resources and tools need sustained support, maintenance, and commitment by NIH to ensure the full realization of the collaboratory concept. Fourth, human resource development is needed to support and cultivate expertise spanning biomedical, computer, and information science disciplines as well as to create new career tracks for researchers and professionals that span multiple disciplines, such as natural science, engineering, and social science. Finally, creation of successful new-generation collaboratories and related technologies will require broad and systematic evaluation to guide development and to record the impact on the organization and conduct of science.

1.  Introduction

Over the past nine years, NIH, along with other Federal agencies—notably the National Science Foundation (NSF) and the Department of Energy (DOE)—has supported development of network-based virtual laboratories, following the recommendations of the 1993 National Research Council (NRC) report: National Collaboratories: Applying Information Technology for Scientific Research. As described in the NRC report, collaboratories are expected to improve the speed and output of scientific research through Internet access to tools, data, and colleagues—independent of time and place. In practice, most early collaboratory projects focused on remote control of distant equipment with less emphasis on data and colleagues (Finholt, 2002). Changes since 1993 in both the character of research, particularly in the biomedical community, and in underlying computer and network technologies suggest an opportunity to update and elaborate the original collaboratory concept. Increased attention must be directed to the end-to-end flow of data from acquisition to deposition in a data repository, and from data reuse to integration across various repositories and databases. Specifically, the time is right for a new round of collaboratory investment that will: deal with the unprecedented volume of data generated by biomedical researchers; broaden access, via the grid, to increasingly available and powerful on-line research and training tools; produce cross-scale discoveries about disease processes; increase the throughput of useful data product generation and derivation; and accelerate the translation of basic research findings into clinical practice.
    1.1.  The Time Is Right

    Increasingly Collaborative Nature of Biomedical Research

    During discussions, the panel strongly emphasized the trend toward higher levels of collaboration in biomedical research. This trend reflects two imperatives in the organization of biomedical science. First, that progress in the understanding of fundamental disease processes demands cross-scale data. That is, results must be obtained and integrated at the molecular, cellular, and organism levels. To accomplish this requires collaboration across laboratories that specialize in these different kinds of data—such as beamlines used to obtain protein structures, electron microscopy used to reveal cellular structure-function relationships, and magnetic resonance imaging used to view organisms. Second, productive use of cross-scale data requires corresponding expertise from specialists trained in the acquisition and interpretation of these data. Therefore collaboration is driven both by the need to share data and to share knowledge about data.

    Growing Importance of Data and Capacity to Produce Data

    The panel noted that science in general, and biomedical science in particular, is being reorganized around the free availability and flow of data at unprecedented volumes and detail. This in part reflects the observation, described above, that research discoveries demand cross-scale data. However, an equally strong force driving increased data flow is the dramatic growth in the capacity to produce and transport data. For example, the human genome project illustrates the power of automation, in the form of computer-controlled gene sequencers, to expand and accelerate data acquisition. At the same time, the cost of storing and moving large amounts of data is falling rapidly, making data storage and exchange economically feasible as well. The capacity to produce and transfer greater amounts of biomedical data is occurring in a historical context that has stressed discipline-specific approaches to data generation and movement.

    For biomedical researchers, information technology has played and will continue to play a key role in standardizing access to data across multiple laboratories and inquiries. For example, the Protein Data Bank (PDB) represents a centralized approach by which investigators deposit three-dimensional macromolecular structures (initially as produced by crystallographic techniques and later expanded to include Nuclear Magnetic Resonance [NMR]-produced structures). From a modest enterprise established in the mid-1970s, the PDB has grown into a critical international resource operated by a consortium of institutions in the U.S. (Rutgers State University of New Jersey, San Diego Supercomputer Center at the University of California San Diego, and the National Institute of Standards and Technology) with mirror sites in Europe and Asia. The PDB currently contains over 12,000 structures, and the main PDB web site generates between 60,000 and 100,000 visits per day. The Biomedical Informatics Research Network (BIRN), recently funded by NCRR, represents a distributed approach by which local databases are federated, via hardware and software, into larger-scale resources. The initial BIRN implementation is in the area of neuroimaging. In this instantiation of BIRN, more than ten sites will join their data to tackle neuroimaging studies of greater scope and complexity and with greater geographic sampling than would be possible by operating independently of one another.
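
    As an illustration of the distributed approach, the following minimal sketch federates several site-level databases behind a single query interface. The site records, field names, and query functions are hypothetical illustrations of the concept, not BIRN software or its actual schema.

        # Minimal sketch of a federated query: each site keeps its own records
        # locally, and a thin federation layer fans one query out to every site
        # and merges the results. All data and field names are invented.
        from typing import Dict, List

        SITE_A = {"subj-001": {"site": "A", "modality": "MRI", "voxel_mm": 1.0}}
        SITE_B = {"subj-002": {"site": "B", "modality": "MRI", "voxel_mm": 2.0}}

        def query_site(db: Dict[str, dict], modality: str) -> List[dict]:
            """Return all records at one site that match the requested modality."""
            return [rec for rec in db.values() if rec["modality"] == modality]

        def federated_query(sites: List[Dict[str, dict]], modality: str) -> List[dict]:
            """Send the same query to every participating site and merge the answers."""
            merged: List[dict] = []
            for db in sites:
                merged.extend(query_site(db, modality))
            return merged

        if __name__ == "__main__":
            # One query now spans data that no single laboratory holds on its own.
            print(federated_query([SITE_A, SITE_B], modality="MRI"))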

    A key consequence of centralized or distributed data repositories is the emergence of new research that relies heavily on data from these resources. For example, as the volume of data increases and as data resolution improves, it becomes possible to derive secondary data products—such as simulation results—that have critical scientific, clinical, and economic significance (e.g., the use of visualizations made from structures in the PDB to improve drug discovery). Further, as network bandwidth increases, it becomes possible to use collected data to improve the performance of instruments gathering new data. For instance, electron microscopy is enhanced when real-time instrument operation is directed by the results of computational simulations, improving the yield of meaningful data.

    Emerging Cyberinfrastructure Initiatives

    Tremendous changes in underlying information technology for collaboratories have occurred in the nine years since the publication of National Collaboratories: Applying Information Technology for Scientific Research. Specifically, computing capacity has increased 64-fold (following Moore's law), storage has increased 512-fold, and network bandwidth has increased 4,096-fold (following Stix, Scientific American, January 2001). Together, these trends allow for integration of data, instruments, scientists, networks, software, and high-performance computing that are collectively characterized as "cyberinfrastructure." The importance of cyberinfrastructure for researchers is indicated both by the NSF's recent blue-ribbon commission on cyberinfrastructure (Atkins, et al., 2002) and by the proliferation of projects in the U.S.A., Europe, and Asia that are attempting to build tools to produce cyberinfrastructure (e.g., the Globus Project, the National Middleware Initiative, Access Grid, and the semantic web). As indicated above, a key requirement of successful cyberinfrastructure will be the ability to locate and move massive amounts of data.
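
    The fold increases cited above are consistent with fixed doubling periods applied over the nine-year interval; the short calculation below makes the arithmetic explicit. The doubling periods used (18, 12, and 9 months for computing, storage, and bandwidth, respectively) are an assumption chosen to reproduce the quoted figures, not values stated in this report.

        # Fold increase after 9 years for an assumed doubling period (in months).
        YEARS = 9
        for resource, months_to_double in [("computing", 18), ("storage", 12), ("bandwidth", 9)]:
            fold = 2 ** (YEARS * 12 / months_to_double)
            print(f"{resource}: {fold:.0f}-fold")   # prints 64, 512, and 4096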

    In particular, a main function of cyberinfrastructure will be to knit together the disparate communities of expertise and practice required to generate breakthroughs in the understanding of disease processes and the subsequent development of effective clinical approaches. To the extent that 1) investigations are increasingly cross-scale and data intensive, 2) the expertise to produce and interpret these cross-scale data is geographically distributed, and 3) data storage itself is dispersed, then, short of massive co-location, scientists will demand effective means for conducting research at a distance. This means researchers will need network-based tools for quickly and transparently locating and accessing data as well as applications for manipulating data, such as visualization and simulation codes. Similarly, researchers will need support for communicating about data and results—not just via publications and conferences—but also through informal interactions that do not depend on physical proximity (e.g., the virtual equivalent of unplanned hallway encounters). Requirements for richer interaction via the Internet, such as through continuous high-resolution video links, quickly exceed the bandwidth of most existing local network connections (i.e., "the last mile" problem) and may also require people to adjust their common ways of working and communicating.

    1.2.  Opportunities Around Data

    Scaling to a Natural Level of Scientific Activity

    An important opportunity produced by the development and use of cyberinfrastructure is the elimination of arbitrary limits on the scope and scale of research activity. Prior to the Internet, projects faced hard constraints on size, typically determined by the availability of physical space to house collaborators and equipment. Modern communication technologies, such as the phone and e-mail, and modern transportation have overcome most space constraints. However, media such as e-mail may lack sufficient richness to build and sustain the levels of interpersonal trust required for successful collaboration. Collaboratories, by augmenting and virtually bridging shared physical space (while preserving the functionality of colocation—e.g., presence awareness), open the possibility for larger collaborations. These virtual collaborations are expected to be more diverse, with the hope that joining scientists from multiple disciplines with varied backgrounds, expertise, and data will increase the pace and quality of research.

    Integrating and Accessing Data

    The panel strongly emphasized the importance of emerging cyberinfrastructure, such as computational and data grids, as enabling conditions for broader data integration and access in the biomedical research community. Specifically, BIRN was highlighted as an example of a project enabled through the creative use of cyberinfrastructure. The panel enthusiastically endorsed federating smaller databases into larger-scale systems, as BIRN does, as a first step toward more efficient and transparent data sharing. In particular, BIRN was viewed as a prototype form of a "biogrid." A biogrid includes focal resources, such as a specific data repository, surrounded by a rich environment where researchers can collaboratively visualize their data, link data from other resources, and insert data into working computational models of the cell or of multicellular units. An important characteristic of the biogrid vision is that it encompasses the entire data flow, from acquisition through processing and refinement, to deposition, curation and—ultimately—query, discovery, and publication (with advantages for a diverse set of users, including researchers, students, clinicians, and drug developers).
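
    The end-to-end data flow described above can be made concrete with a schematic sketch. The stage names, record fields, and repository structure below are illustrative placeholders for the acquisition-to-query pipeline, not an actual biogrid or BIRN interface.

        # Schematic acquisition -> refinement -> deposition/curation -> query flow.
        def acquire(instrument_id: str) -> dict:
            """Capture a raw observation from an instrument."""
            return {"instrument": instrument_id, "raw": [0.12, 0.98, 0.55]}

        def refine(record: dict) -> dict:
            """Process raw data into a derived product."""
            return {**record, "refined": sorted(record["raw"])}

        def deposit(record: dict, repository: list) -> None:
            """Curate and deposit the record into a shared repository."""
            record["curated"] = True
            repository.append(record)

        def discover(repository: list, instrument_id: str) -> list:
            """Query the repository for records from a given instrument."""
            return [r for r in repository if r["instrument"] == instrument_id]

        shared_repository: list = []
        deposit(refine(acquire("em-scope-01")), shared_repository)
        print(discover(shared_repository, "em-scope-01"))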

    Building Research Communities

    Cross-scale data integration greatly increases the potential for collaboratories to join disparate research communities. This form of the collaboratory vision represents the most ambitious aspect of the description in the original NRC collaboratory report and in other early papers on collaboratories (e.g., Wulf's 1993 article in Science). In the initial conception, collaboratories were framed as mechanisms to optimize access to scarce resources, such as unique instruments. There is a parallel to the early motivation for the Advanced Research Projects Agency Network (ARPANET), which was conceived primarily to distribute access to costly computer cycles. However, the key application of the ARPANET became communication in the form of e-mail. Similarly, collaboratories may be evolving into virtual locations where members of a community of practice can go to meet and collaborate with colleagues, as well as to access facilities and data. Enhancing the community aspect of collaboratories may offer huge rewards, much as the earlier widespread adoption of e-mail paid huge dividends for scientists in terms of information sharing, data exchange, and community building.

    Furthermore, research communities are not limited by national boundaries: in collaboratories, researchers can interact with colleagues from across the world, with the potential to work with the best minds on the planet and to leverage more resources for science. Just as the Internet gave us a global post office, a global shopping mall, a global library, and a global university (Friedman), the grid will enable a collaborative electronic science (e-science, Taylor), e-commerce, and e-health.

    Broadening Engagement in Science

    A critical by-product of conducting more science via the Internet is the increased potential for participation beyond researchers at mainstream institutions. For example, a number of collaboratory projects have demonstrated the value of collaboratories for outreach to K-12 audiences, such as the Bugscope system at the University of Illinois, which enables classrooms to reserve time on a scanning electron microscope to view insect specimens over the web. At the undergraduate level, collaboratories can be used to introduce students to authentic research experiences at an earlier stage, providing exposure to data and instruments that may be unavailable on a typical undergraduate campus. Collaboratories may have special appeal to faculty and students at colleges and universities that have scarce research resources, both in terms of equipment and activity. Finally, for graduate students, collaboratories may provide a flexible means to obtain needed data and instrument time, as well as a mechanism for diversifying collegial networks. For instance, in space physics, collaboratory use has allowed first-year students to engage immediately with experienced scientists in collecting data, compared to the more traditional model in which students collected data on their own and later in their graduate careers.

    Other applications of collaboratories involve interactions between scientists and policy makers (e.g., Centers for Disease Control) when there are issues of importance to public health.

    1.3.  Lessons from Prior Collaboratory Investment

    General Experience

    The Science of Collaboratories project at the University of Michigan has identified over 70 collaboratory projects launched since 1993. These projects fall roughly into five categories. First, the greatest proportion of projects focused on instrument sharing. For example, the NSF-funded Upper Atmospheric Research Collaboratory linked an international community of several hundred space physicists to instruments at an observatory in Greenland. Similarly, the DOE-funded Environmental Molecular Sciences Laboratory (EMSL) collaboratory at the Pacific Northwest National Laboratory allows remote use of NMR machines by distant researchers. Second, a significant number of projects used collaboratory technology to create virtual research centers, such as the NIH-funded Great Lakes Regional Center for AIDS Research, which joins clinicians and scientists at Northwestern, Minnesota, Wisconsin, and Michigan. Third, some of the most visible projects were community data systems, such as the PDB and BIRN projects described earlier. Fourth, many collaboratories have been used as virtual learning facilities, such as the Collaborative Visualization (CoVis) project at Northwestern. Finally, a number of collaboratory efforts can be described as oriented to product development, such as the use of collaboratories for software engineering within Bell Labs.

    A critical challenge for early collaboratory developers was developing tools for teleoperation of remote instruments. As early as 1992, the Collaboratory for Microscopic Digital Anatomy (a joint NSF- and NCRR-funded laboratory at UCSD) demonstrated successful operation of an intermediate-voltage transmission electron microscope in San Diego by operators in Chicago. In 1993, the Upper Atmospheric Research Collaboratory (UARC)—an NSF-funded project at the University of Michigan—demonstrated the first non-biomedical instance of teleoperation to control all-sky imagers and interferometers in Greenland from various locations in North America and Europe. Since these early demonstrations, teleoperation has become a routine element of instrument operation for many laboratories. For example, while early testbeds used custom-developed software, many instances of teleoperation can now be supported with commercial applications, such as Microsoft NetMeeting and Polycom videoconferencing units, or through public domain applications such as Virtual Network Computing (VNC) and the Access Grid.

    Although solutions are not yet universal, most technical barriers to teleoperation are understood and can be overcome—assuming cooperation from instrument manufacturers. Today, with modest effort, most interested laboratories can support remote or automatic control of instruments. Following a philosophy articulated at the Stanford Synchrotron Laboratory, if a device is computer controlled within an integrated systems architecture, a completely new mode of data-centric operation becomes possible, in which instrument instructions are executed by "smart" systems in response to the data being collected.
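
    A toy sketch of this data-centric mode of operation appears below: the controller chooses the next instrument instruction in response to the data just collected. The instrument model, the signal threshold, and the exposure rule are invented for illustration and do not describe any particular facility's control system.

        import random

        def collect(exposure_s: float) -> float:
            """Simulate one measurement whose signal grows with exposure time."""
            return exposure_s * 10.0 + random.uniform(-1.0, 1.0)

        def next_exposure(current_s: float, signal: float, target: float = 20.0) -> float:
            """'Smart' rule: lengthen the exposure until the signal reaches the target."""
            return current_s * 1.5 if signal < target else current_s

        exposure = 1.0
        for step in range(5):
            signal = collect(exposure)                  # data being collected ...
            exposure = next_exposure(exposure, signal)  # ... steer the next instruction
            print(f"step {step}: signal={signal:.1f}, next exposure={exposure:.2f}s")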

    The frontier for collaboratory development, then, has moved to address the broader aspects of the original collaboratory vision, such as data integration, data access, and tools to support collaboration with data. For example, the UARC system described above has evolved from a primary focus on real-time data gathering using remote facilities to a collaboration environment used largely to build views from multiple data sources over an interesting data interval.

    Implementing the capabilities implied by the broad collaboratory vision will mean attention to an additional set of requirements. First, because data are generated from procedures or sources that may be unknown or invisible (i.e., not monitored by a person—such as automatic data collection), there is a new need for assurance about data integrity and validity. Second, the increased emphasis on automated data and secondary use of data demands a much greater effort to define and implement metadata formats (i.e., the data about data—such as who collected the data, calibration values, etc.). Third, because by-products of data analysis can have significant economic and intellectual value, much greater attention must be placed on implementing security, such as access control and verification, as well as on authentication (i.e., assuring that a given data file contains what it is claimed to contain, and that its contents have not been altered). Finally, because data analysis and interpretation demand a much higher level of interaction among collaborators, tools to support collaboration at a distance need to meet expectations for scientific discourse, which are often formed from experience in colocated settings. For example, there is a higher demand in virtual environments to convey information that helps two distant collaborators successfully orient their attention to the same observation or region of data, since cues such as pointing and direction of gaze may be absent. In short, the most elegant remote control and data transfer capabilities will be wasted if collaboratories and related technologies do not allow convenient and normal human interaction. The importance of supporting normal interaction at a distance, such as for management and coordination of geographically distributed projects, is only likely to increase with the increased emphasis on projects that span institutions (e.g., NIH's recently announced "regional centers of excellence" for biodefense).
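
    Two of these requirements—recording metadata alongside a data file and later verifying that the file still contains what it claims to contain—can be illustrated with a minimal sketch. The metadata fields, file names, and use of a SHA-256 checksum are assumptions made for the example, not a prescribed format.

        import hashlib
        import json
        from pathlib import Path

        def write_with_metadata(path: Path, payload: bytes, collector: str, calibration: float) -> None:
            """Store a data file plus a sidecar metadata record that includes a checksum."""
            path.write_bytes(payload)
            metadata = {
                "collected_by": collector,   # who collected the data
                "calibration": calibration,  # calibration value at collection time
                "sha256": hashlib.sha256(payload).hexdigest(),
            }
            sidecar = path.parent / (path.name + ".meta.json")
            sidecar.write_text(json.dumps(metadata, indent=2))

        def verify(path: Path) -> bool:
            """Re-hash the data file and compare against the recorded checksum."""
            sidecar = path.parent / (path.name + ".meta.json")
            recorded = json.loads(sidecar.read_text())["sha256"]
            return hashlib.sha256(path.read_bytes()).hexdigest() == recorded

        data_file = Path("scan_0001.dat")   # hypothetical file name
        write_with_metadata(data_file, b"\x00\x01\x02", collector="lab-7", calibration=0.98)
        print("integrity ok:", verify(data_file))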

    NCRR Experience

    In the mid-1990s, NCRR hosted a community workshop that considered how new technologies could better connect the distributed research being supported by NIH. A panel of experts considered collaborative technologies as a vehicle to achieve this end. Based on that panel's recommendations, NIH initiated a request for supplement proposals to existing Research Resources (P41 centers) to experiment with the new technology in specific settings. Seven awards (see Appendix D) were made. Information about those awards can be found in the National Biomedical Collaboratories Workshop Report, which summarizes the lessons learned and the successes of those initial investments. The last year of support for those awards is 2002.

    Overall, the NCRR collaboratory program has benefited from the early experience with collaboratories funded by the DOE and NSF, which successfully demonstrated remote instrument control. The NCRR supplements have enabled a broader collaboratory vision, notably the creation of more complete collaboratory environments that combine features such as remote control with the data collaboration features discussed earlier. Examples of these more complete collaboratories are the University of California, San Francisco (UCSF) project, which includes development of tools for shared remote viewing of molecular and structural visualizations, and the University of Illinois at Urbana-Champaign (UIUC) project, which offers access to a broad range of local and remote technologies such as supercomputer centers, discipline-specific and general tools, data, and visualization solutions. More information on the entire NCRR collaboratory supplement program can be found in Appendix D.

    Components of a New Round of Collaboratory Investment

    The panel highlighted four key capabilities that must be present in a successful new round of collaboratory investments. These capabilities are: 1) communications and resource control, including teleoperation and teleobservation; 2) information sharing, including the creation and curation of data repositories, security and authentication controls, and tools for collaborative visualization and analysis of data; 3) coordination, including planning experiments and computer runs (e.g., scheduling scarce resources); and 4) technology development, including continued elaboration of hardware and software, particularly to incorporate collaboration capabilities and to ensure that new collaboration technologies are compatible with emerging middleware standards (e.g., the Globus toolkits).

2.  Summary of Recommendations
    2.1.  Goals of New Investment

    Accelerate Distributed and Cross-disciplinary Science

    The panel agreed that accelerated movement toward more distributed and cross-disciplinary biomedical research should be a basic goal of new collaboratory investment. Collaboratories and related technology can play a critical role in this movement by facilitating data integration, increasing awareness of related work, and reducing barriers to communication between laboratories. The panel does not argue, however, that collaboratories, alone, can produce more cross-cutting research. As noted below, collaborative tools are only one element among many that can influence progress toward greater levels of cross-disciplinary work; culture and incentives are among other key factors.

    Foster Interactions to Accomplish Larger Studies Than Can Be Carried Out Within a Single Laboratory

    As noted earlier, the panel speculated that collaboratories may provide a mechanism to expand research activity to an ideal scale—in terms of people and equipment—rather than an arbitrary scale imposed by constraints of physical space. Collaboratory projects should be selected to probe this hypothesis, with the caveat that the scale of scientific collaborations may involve many limitations other than available space (e.g., limits of effective coordination, attentional limits of individual participants in a collaboration)—some of which may not be affected by collaboratory use.

    Fund Infrastructure That Supports Cross-scale Work

    Endorsing the broad concept of cyberinfrastructure, the panel remarked that emerging and new scientific work practices (such as cross-scale data integration) will naturally require new kinds of infrastructure. Some of this infrastructure will take nontraditional forms, such as software rather than instruments or buildings. Specifically, a new round of collaboratory investment should push the biomedical research community to adopt emerging open-source standards for "middleware," or the software that lies between user-level applications and low-level network protocols (e.g., security, resource discovery/access, and data discovery/access—as realized in the NSF National Middleware Initiative).

    Within this concept of cyberinfrastructure is the notion of data resources. Data resources are an important form of supporting infrastructure to advance science (see Science and Engineering Infrastructure for the 21st Century) and collaboratories. However, this critical new form of infrastructure requires different funding and review mechanisms (compared to those used to fund and evaluate research) to ensure the availability, persistence, and stability of these resources.

    Preserve Vital Data and Tools

    The panel noted that as data and data tools become more central to biomedical investigations, preserving data and data tools becomes more important. A goal of additional collaboratory investment should be demonstration of data archiving and curation that assures the integrity and persistence of data over time. The panel also highlighted the concept of the "vitality" of data and tools, meaning that resources and applications should be updated to maintain their usefulness.

    Broaden and Diversify Participation from Institutions and the Community

    Because collaboratories may potentially reduce barriers to participation in biomedical research, a goal of new investment should be selection of collaboratory projects to explore the value and impact of diverse engagement in projects, spanning different kinds of institutions and individuals, particularly those institutions (e.g., resource-poor universities and colleges) and individuals (e.g., women and under-represented minorities) typically not included in front-line biomedical research. Examples of this might include mentorship of undergraduates at four-year colleges and universities by investigators at research universities, or continuing participation by faculty who have initiated collaborative work (e.g., during a sabbatical) after they return to their home institutions.

    Conduct Evaluations to Gain Systemic Understanding

    The panel commended the evaluation components in each of the first-round collaboratory supplements. However, the panel felt that coordinated evaluation efforts would significantly strengthen collaboratory impact and success. Specifically, coordination will reduce the variation in level of effort across projects and make direct comparisons across projects simpler to produce. Often, program managers want to review impact based on a standard and complementary set of outcome metrics, such as research output or quality of research, and process criteria, such as numbers of users and types of use. Subsequent collaboratory investment should include a systemic evaluation component to ensure that collaboratory impact can be assessed against common and complementary sets of outcome and process measures; it should also include innovative evaluation methods and measures that address limitations of current methods and measures.

    2.2.  Challenges

    The preceding sections have alluded to the panel's concerns about technological and cultural challenges that must be either overcome or accommodated to achieve success in a new round of collaboratory funding.

    Technology

      Integration of Data Across Multiple Scales/Instruments

      The panel identified cross-scale research as an important opportunity to be exploited through further collaboratory development. However, the panel also recognized several barriers to successful cross-scale data integration. First, data from multiple instruments will likely be produced in multiple formats. Agreements about how to interoperate between these formats, and which tools to use to accomplish this interoperation, are a prerequisite for developing higher-level applications (such as data visualizations) that can span several levels of investigation (i.e., allow a user to zoom out from the molecular level to the cellular level). Second, metadata descriptions must be sufficiently robust to allow nonspecialists to use data outside of their domain without violating assumptions underlying the data (e.g., applying data out of context). Third, there must be conventions to show the validity of data, particularly when nonspecialists may lack the expertise to evaluate the quality of a particular data source.
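
      A minimal sketch of the first two barriers follows: records from two hypothetical instruments arrive in different layouts and units and are normalized into one common representation that carries the spatial scale and units explicitly, so a nonspecialist cannot silently apply the data out of context. The formats and conversion factors are invented for illustration.

          def from_microscope(record: dict) -> dict:
              """Hypothetical cellular-scale format: sizes reported in micrometers."""
              return {"feature": record["label"], "size_nm": record["size_um"] * 1000.0,
                      "scale": "cellular"}

          def from_crystallography(record: dict) -> dict:
              """Hypothetical molecular-scale format: sizes reported in angstroms."""
              return {"feature": record["name"], "size_nm": record["size_angstrom"] * 0.1,
                      "scale": "molecular"}

          # A shared unit (nanometers) and an explicit scale tag let a higher-level
          # application "zoom" from the molecular to the cellular level consistently.
          common = [
              from_microscope({"label": "mitochondrion", "size_um": 1.2}),
              from_crystallography({"name": "hemoglobin", "size_angstrom": 55.0}),
          ]
          for rec in common:
              print(rec)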

      Uniform, Secure Access to Heterogeneous Distributed Information

      The panel noted that unfolding efforts to develop cyberinfrastructure, such as computational and data grids, are ambitious. However, they are also in a very early stage of development. Specifically, toolkits for developing grid applications have only recently been released in production versions (e.g., the Open Grid Services Architecture). As a result, many of the claims made for these middleware environments—including their ability to create secure, heterogeneous, and distributed data environments—are unproven in field settings. While there is strong evidence that the grid movement is gaining momentum, such as endorsement by various corporate players, the newness of grid-based architectures introduces a level of risk that must be addressed in forming plans for grid-centric collaboratories.

      Coordination of Middleware Development

      As middleware initiatives grow and move forward, there is a danger that the unique needs and concerns of biomedical researchers will not be met in the face of larger and less idiosyncratic applications (e.g., business applications). Wulf, in his early (1993) call for collaboratory development, cited the potential incompatibility between research requirements and commercial requirements as a primary motivation for launching projects tailored to scientific collaboration, rather than collaboration in general. Because these middleware initiatives are open-source efforts, the best way to influence their trajectory is to fund biomedical researchers and laboratories to do middleware development. This will simultaneously develop middleware programmers with biomedical backgrounds and ensure that the special concerns of biomedical researchers are brought to the broader development community surrounding the middleware projects.

      Preservation of Data Integrity Over Time

      As noted earlier, with the trend toward greater reliance on data, there is an urgent need to formulate metadata formats and curation guidelines to ensure that data survive over time. The tremendous advantage of paper as an archival medium, for example, is its extraordinary durability and its resistance to decay and corruption. By contrast, electronic data records created less than a decade ago (e.g., on 5.25-in. floppy disks) are often unreadable on contemporary systems.

      Discovery of Data and Resources

      As data are increasingly used beyond the boundaries of individual laboratories, there is a greater need for mechanisms to ensure that needed data can be discovered and retrieved by researchers. In federated data models, like BIRN, users must have uniform and timely access to distributed data.

    Cultural and Behavioral

    Beyond technological challenges, there are many cultural and behavioral challenges that must be overcome to conduct successful collaboratory-based research.

      Create an Arena for Combined Work Across Biomedical, Computer, and Information Science

      A significant effort must be made to develop more arenas for cross-fertilization between computer and information science research and domain science disciplines, such as the models that have emerged at the University of Michigan's School of Information and elsewhere. For example, this combined work may arise through identification of novel testbeds that push improved capabilities (e.g., increased bandwidth), leading to new application areas. It will also be important to identify biomedical researchers who have training in computer and information science, as well as computer and information scientists who have training in knowledge management and biomedicine. These individuals, spanning disciplines, will be in a better position to translate concepts and requirements between the biomedical and computer science communities.

      Create Behavior of Sharing as a Default

      The crystallographic community has developed, in conjunction with the funding agencies and publishers, a process by which data are deposited in the Protein Data Bank. This fosters open access, allows for use and reuse of data by other scientists, and ultimately benefits comparative cross-species analyses useful in drug design. BIRN, described earlier, will be addressing data access within a community that has not had a similar tradition of data sharing and reuse. Such projects need to be monitored to see how they address issues of publication and attribution, and encouraged so that they can take advantage of the full vision of a collaboratory.

      The panel notes the new NIH policy position on data sharing as a necessary step toward the behavioral change needed to achieve the potential of data-integrating collaboratories.

    Evaluation

      Integral Component of System Development (Quality and Quantity)

      The panel identified a central role for evaluation in collaboratory development. First, the panel highlighted the value of systems analysts and usability evaluators working closely with software developers. For example, this kind of close interaction can accelerate the delivery of prototype applications and the gathering of feedback from users. In spiral development models, user feedback can focus effort in subsequent iterations of development, ensuring that applications converge quickly on implementations that satisfy high-priority user requirements. Second, the panel recognized the need for a broad evaluation approach to assess the impact of new technologies on the organization and conduct of science, as well as on the evolution of collaboratory-based communities. For instance, this kind of evaluation would address outcome measures, such as papers and discoveries, and correlate these measures with collaboratory features and levels of collaboratory use, and perhaps collaboratory culture and overall work environment. In both forms of evaluation, the panel endorsed a multi-method approach combining both qualitative and quantitative techniques.

      Multilevel - Community, Group, Individual

      Effective evaluation will require analysis at several levels. First, the broader kind of evaluation described above will depend on data collected at the community and group levels. For instance, changes in patterns of cross-citation and coauthorship can be correlated with collaboratory use to show shifting patterns in cross-disciplinary work, or even the emergence of new disciplines. Second, usability evaluation will require data gathered mostly at the individual and group levels. For instance, user interface design can be improved by conducting laboratory tests, while the impact and performance of collaborative tools can be analyzed through case studies of groups using collaboratories.

      Support for Systemic Insights - Coordination Across Projects

      The panel strongly emphasized the need for similar and coordinated collaboratory evaluation. That is, to draw meaningful conclusions about the impact and value of collaboratories will require comparison across different research settings using complementary and innovative measures and methods. This would likely provide sufficient sample size and variation to reveal novel trends and potentially revolutionary research practices.

    Other Issues

    For collaboratories to reach their fullest potential, the many facets of data access will need to be addressed. In addition to the technical and cultural/behavioral issues above, there are three other interrelated factors that need to be considered. First, legal and policy issues are especially important in international collaborations. For example, policy issues in the health sciences often focus on privacy rules, such as the confidentiality of patient records, where regional differences in regulations (e.g., the U.S. versus the European Union) may have implications for what data can be shared in a collaboratory. Additionally, in light of the events of September 11, 2001, restrictions on access to data will increase (see Science, 20 September 2002). Second, institutional and managerial approaches to collaboratories need to reflect the fact that integrating heterogeneous data may require flexibility. For instance, this flexibility may be particularly important when integrating data from different disciplines, as recognized in the draft NIH Statement on Sharing Research Data. Finally, from a budgetary perspective, data should be viewed as an integral part of science, and community data resources, whether centralized or federated, should be considered a critical form of infrastructure. Agencies should be prepared to carry the costs of developing and maintaining this infrastructure.

    2.3.  Requested Investments

    Data-driven Experimentation

    Reflecting the panel's belief about the increasing importance of data-driven research, key investments will be needed in the following areas. First, at the data creation stage, applications must support uniform mechanisms for annotating and validating data. Second, at the archiving stage, support must exist for storing both intermediate and final results, with corresponding access control, authentication, and verification. Finally, at the dissemination stage, tools and infrastructure must be capable of supporting sophisticated visualization and analysis—along with data and resource discovery.

    Data-driven Discovery Via Integration

    The success of data-driven science will demand efficient and effective mechanisms for federating data across multiple sites—and the tools to curate and retrieve these data. Beyond this level of infrastructure integration, there will need to be support to achieve integration across different research approaches and perspectives, such as producing comprehensive models of cardiac function, combining chemical, electrical, and mechanical models of the heart. Finally, as noted frequently in the preceding sections, mechanisms must be developed to achieve integration across broad spatial and temporal scales, spanning molecular-scale phenomena to whole organisms. This will involve creating uniform querying systems and developing new approaches in areas such as semantic mediation.
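
    The idea of semantic mediation for uniform querying can be illustrated with a toy sketch in which a mediator translates one query term into each site's local vocabulary before dispatching the query. The term mappings and site schemas are invented examples, not an actual mediation system.

        # The mediator's vocabulary mapping: one shared term, per-site local terms.
        TERM_MAP = {"myocardium": {"site_x": "heart_muscle", "site_y": "cardiac_tissue"}}

        SITE_X = [{"tissue": "heart_muscle", "model": "electrical"}]
        SITE_Y = [{"tissue": "cardiac_tissue", "model": "mechanical"}]

        def mediated_query(term: str) -> list:
            """Translate the term for each site, run the query, and merge the results."""
            results = []
            for site_name, records in (("site_x", SITE_X), ("site_y", SITE_Y)):
                local_term = TERM_MAP[term][site_name]
                results += [r for r in records if r["tissue"] == local_term]
            return results

        # Combines electrical and mechanical heart models held under different names.
        print(mediated_query("myocardium"))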

    Next Steps

    The panel endorsed a number of programmatic themes around the following concepts. First, NIH must continue its leadership in developing and supporting novel collaboratory projects, such as the recent launch of BIRN. Attributes of this investment should be federation of data across multiple laboratories, integration across data scales, and support for distributed data discovery and analysis. Second, NIH must join other agencies, both in the United States and abroad, in supporting the creation of fundamental cyberinfrastructure that will enable a new generation of collaboratories. Specifically, the expanding computational and data grid activity will require involvement by biomedical researchers to ensure that grid technology meets the unique requirements of the biomedical research community. Third, NIH must be prepared to commit to the support and maintenance of critical data resources and tools over long periods, including ongoing research and development to preserve data integrity and ensure the vitality of data analysis tools. Fourth, the growing importance of cyberinfrastructure to the conduct of biomedical research requires the cultivation of interdisciplinary researchers who are comfortable spanning the biomedical and computer sciences. This may require a different emphasis in the training of biomedical researchers as well as the creation and support of new kinds of career tracks, such as bioinformatics professionals. Finally, activity to develop data-centric collaboratories and related technologies must be accompanied by broad and systematic evaluation of both individual collaboratories and the network of collaboratories, both to guide their development and to identify new ways of conducting and organizing science within the context of collaboratories.

 

 

References:

Atkins, D.E., Droegemeier, K.K., Feldman, S.I., Garcia-Molina, H., Klein, M.L., Messina, P., Messerschmitt, D.G., Ostriker, J.P., & Wright, M.H. (n.d.). Revolutionizing Science and Engineering through Cyberinfrastructure: Report of the National Science Foundation Blue-Ribbon Advisory Panel on Cyberinfrastructure, Draft 1.0. Retrieved November 11, 2002 from the National Science Foundation, Directorate for Computer and Information Science and Engineering, Advisory Committee for Cyberinfrastructure site.

Biomedical Informatics Research Network (BIRN). (n.d.). Retrieved November 11, 2002.

Berman, H.M., Westbrook, J., Feng, Z., Gilliland, G., Bhat, T.N., Weissig, H., Shindyalov, I.N., & Bourne, P.E. (2000). The protein data bank. Nucleic Acids Research, 28, 235-242.

Finholt, T.A. (2002). Collaboratories. In B. Cronin (Ed.), Annual Review of Information Science and Technology, Volume 36 (pp. 73-108). Medford, NJ: American Society for Information Science and Technology/Information Today, Inc.

Foster, I., Kesselman, C., Nick, J.M., & Tuecke, S. (2002). Grid services for distributed system integration. Computer, 36, 37-46.

Friedman, T. (2000). The Lexus and the Olive Tree: Understanding Globalization. New York: Farrar, Straus and Giroux.

The Globus Project. (n.d.). Retrieved November 11, 2002.

NCRR Biomedical Collaboratories Workshop Report. (n.d.). Retrieved November 11, 2002.

NIH Draft Statement on Sharing Research Data. (n.d.). Retrieved November 11, 2002.

National Research Council (1993). National Collaboratories: Applying Information Technology for Scientific Research. Washington, D.C.: National Academy Press.

National Science Board. Science and Engineering Infrastructure For the 21st Century: The Role of the National Science Foundation [NSB 02-190].

National Science Foundation Middleware Initiative (NMI). (n.d.). Retrieved November 11, 2002.

OECD Follow-up Group On Issues of Access to Publicly Funded Research Data: Interim Report. (n.d.). Retrieved November 11, 2002.

NAS Censors Report on Agricultural Threats. (2002, September 20). Science, 1973-1975.

Stix, G. (2001). Triumph of the Light. Scientific American, 284, 80-86.

Taylor, John, e-Science Core Programme: Welcome. (n.d.). Retrieved November 11, 2002.

Wulf, W.A. (1993). The Collaboratory Opportunity. Science, 261, 854-855.

 

 

Appendix A: Charge to Workshop


The charge to the workshop participants was:

To review the state and future directions of collaboratories and associated technologies, with a particular focus on data flow from capture to deposition, and on the use of data from data resources.

    The workshop will focus on biomedical applications and will involve members of the cyberinfrastructure (grid) and other information technology communities, as well as the biomedical and clinical research communities.

To produce a report and provide it to the National Center for Research Resources (NCRR).

    The report will identify opportunities for NCRR to consider in collaboratories that make effective use of current investments and extend those investments to the broader biomedical community.

In particular, the workshop and report will address the following three questions:

  1. What are the opportunities for using a collaboratory approach to deal with the problems associated with data in the biomedical community? What could such research catalyze?
  2. What could the biomedical community expect from another round of collaboratory investment by NCRR? (The time frame is 5 years.)
  3. What should be the goals of a collaboratory program? How should a collaboratory program be evaluated? Potential evaluation metrics could include:
    1. A measure of how small laboratories participate with larger laboratories in data sharing.
    2. A discussion of how to measure/ensure the availability of the products of a collaboratory program.
    3. Methods to ensure that a database of a certain size, with a projected growth rate, is available to the community. What funding is available for such database maintenance programs?

     

 

Appendix B: Agenda


Data and Collaboratories in the Biomedical Research Community
September 16-18, 2002
Alliance Center for Collaboration, Education, Science, and Software
Ballston Metro Center Office Tower
901 North Stuart Street, Suite 800
Arlington, VA 22203
Supported by NCRR


Monday, September 16, 2002

8:00 a.m. - 8:30 a.m.
    Continental Breakfast

8:30 a.m. - 10:15 a.m.
    Opening Session

    View from NCRR: Michael Marron and other NCRR Staff (15 Minutes)
    Introductions by Participants (30 Minutes)
    Experiences with Collaboratories: Tom Finholt (30 Minutes)
    Overview of Meeting: Peter Arzberger and Tom Finholt (15 Minutes)
    Discussion (15 Minutes)

10:15 a.m. - 10:30 a.m.
    Break

10:30 a.m. - 11:30 a.m.
    Collaboratory Experiences, Part 1

    The objective of these talks is to present vignettes of how the collaboratory enabled science that would not have happened otherwise, to indicate where the lack of available technology inhibited development, and to suggest where collaboratories focused around data flow and resources would enhance science.

    Experiences at PNL: James Myers (20 Minutes)
    Experiences in a Clinical Setting: Steven Wolinsky (20 Minutes)
    Q&A (20 Minutes)

11:30 a.m. - 12:45 p.m.
    Lunch

12:45 p.m. - 1:45 p.m.
    Collaboratory Experiences, Part 2: NCRR Collaboratories

    The objective of this panel discussion is to review, on a case-by-case basis (no more than 10 minutes/3 slides per project), the focus, the specific aims, and the results of NCRR-supported collaboratories. After the presentations, the discussion will address problems encountered, what collaboratories would do differently, and the evaluation criteria used by the projects.

    • Resource for Biocomputing, Visualization, and Informatics: Tom Ferrin
    • Interactive Graphics for Molecular Studies and Microscopy: Dianne Sonnenwald
    • High Performance Computing for Biomedical Research: David Deerfield
    • Macromolecular Modeling and Bioinformatics: Gila Budescu
    • National Center for Microscopy and Imaging Research: Mark Ellisman

1:45 p.m. - 4:15 p.m.
    Policy and Social Implications

    Data Sharing at NIH: What Should Be Considered Within the Collaboratory: Wendy Baldwin (20 Minutes)
    Evaluation: Gila Budescu (20 Minutes)
    Social Informatics: Geoffrey Bowker (20 Minutes)
    Q&A (30 minutes)

4:15 p.m. - 4:30 p.m.
    Break

4:30 p.m. - 5:45 p.m.
    Key Data Projects:

    The objective of this group of presentations is to discuss the types of activities around data and to outline how collaboratory technologies could enhance these facilities and the science conducted using them.

    Protein Data Bank: John Westbrook (20 Minutes)
    SLAC: Peter Kuhn (20 Minutes)
    Biomedical Informatics Research Network: Mark Ellisman (20 Minutes)
    Q&A (15 minutes)

5:45 p.m. - 6:00 p.m.
    Wrap-up and Overview of Day 2: Peter Arzberger and Thomas Finholt

6:00 p.m.
    Dinner


Tuesday, September 17, 2002

8:00 a.m. - 8:30 a.m.
    Continental Breakfast

8:30 a.m. - 10:00 a.m.
    Technical Trends: Part 1

    Over the next five years, what are the opportunities and limitations? What does the future hold, relative to collaborations around data resources and flows?

    Cyberinfrastructure/Digital Library: Dan Atkins (20 Minutes - phone)
    Grid: Computing and Data: Terry Disz (20 Minutes)
    Grid: Path Between Grid and Web Services: Ian Foster (20 Minutes - vtc)
    Data: Information Integration Technologies: Chaitan Baru (20 Minutes)
    Q&A (10 Minutes)

10:00 a.m. - 10:15 a.m.
    Break

10:15 a.m. - 11:50 a.m.
    Technical Trends: Part 2

    The Internet2 Commons H.323 Video Conferencing Service - What It Is and How to Use It: Bob Dixon (20 Minutes - vtc)
    Networking: Rick McMullen (20 Minutes)
    Collaborative Tools: Jonathan Grudin (20 Minutes)
    Semantic Web: James Hendler (20 Minutes)
    Q&A (15 Minutes)

12:00 noon - 1:00 p.m.
    Lunch

1:00 p.m. - 1:30 p.m.
    Overview and Instructions for Breakout Sessions

    Envisioned are three parallel sessions, to address the following issues:

    1. What are the opportunities for using a collaboratory approach to deal with the problems associated with data in the biomedical community? What could such research catalyze?
    2. What could the biomedical community expect from another round of collaboratory investment by NCRR? (The time frame is 5 years.)
    3. What should be the goals of a collaboratory program? How should a collaboratory program be evaluated? Potential evaluation metrics could include:
      1. A measure of how small laboratories participate with larger laboratories in data sharing.
      2. A discussion of how to measure/ensure the availability of the products of a collaboratory program.
      3. Methods to ensure that a database of a certain size, with a projected growth rate, is available to the community. What funding is available for such database maintenance programs?

    Each group will be asked to present its findings the following day in PowerPoint or equivalent format.

1:30 p.m. - 4:30 p.m.
    Working Groups

4:30 p.m. - 5:00 p.m.
    Preliminary Reports from Working Groups (problems, barriers, issues)
5:00 p.m.
    Dinner


Wednesday, September 18, 2002

8:30 a.m. - 9:00 a.m.
    Continental Breakfast

9:00 a.m. - 11:00 a.m.
    Full Feedback from Breakout Groups (30 minutes each) and Discussion

11:00 a.m. - 11:15 a.m.
    Break

11:15 a.m. - 12:00 noon
    Summary of Overall Findings

12:00 noon
    Close of General Meeting

12:00 noon - 5:00 p.m.
    Writing Group Convenes

 

 

Appendix C: List of Participants


Peter Arzberger, National Biomedical Computation Resource, UCSD

Tom Finholt, School of Information, University of Michigan

Dan Atkins, University of Michigan (phone)

Wendy Baldwin, Office of the Director, NIH

Chaitan Baru, San Diego Supercomputer Center, UCSD

Geoffrey Bowker, Department of Communications, UCSD

Gila Budescu, Resource for Macromolecular Modeling and Bioinformatics, University of Illinois

David Deerfield, High Performance Computing for Biomedical Research, PSC

Terry Disz, Argonne National Laboratory

Bob Dixon, Ohio State University (September 16, 18 - vtc)

Mark Ellisman, National Center for Microscopy and Imaging Research, UCSD

Tom Ferrin, Resource for Biocomputing, Visualization, and Informatics, UCSF

Ian Foster, Argonne National Laboratory and University of Chicago (vtc)

Jonathan Grudin, Microsoft

Ted Hanss, Internet2 (vtc)

James Hendler, University of Maryland (September 17-18)

Peter Kuhn, SLAC, Stanford

Donald F. (Rick) McMullen, Indiana University

James Myers, PNL

Steve Peltier, University of California San Diego (September 18)

Ralph Roskies, High Performance Computing for Biomedical Research, PSC (September 17-18)

Dianne Sonnenwald, Interactive Graphics for Molecular Studies and Microscopy, UNC

John Westbrook, PDB, Rutgers State University of New Jersey

Steven Wolinsky, Feinberg School of Medicine, Northwestern University

Appendix D: List of Current NCRR Collaboratory Awards


The material comes from several sources, one of which is the NCRR Biomedical Collaboratories Workshop Report.

Seven P41 Supplements

UCSD: This project extended earlier work on remote control of electron microscopes by incorporating computational and data grid elements. For example, by adding computational grid capabilities, microscope operators can combine real-time viewing with simultaneous results from computational simulations (e.g., complex visualizations produced on supercomputers).

UNC: This project extended earlier work on haptic, or force-feedback, interfaces to an atomic force microscope (the NanoManipulator) by building a collaborative interface to the NanoManipulator. Through computer control of the microscope's probe, users can "feel" nanoscale structures, such as the resistance of the boundary of a viral envelope or the stickiness of blood-clotting factors. A highlight of this project has been a rigorous experimental investigation of user experiences, showing that the quality of scientific work done remotely with the NanoManipulator does not differ from work done in a shared physical setting.

Stanford: This project has focused on teleobservation and teleoperation of experiments conducted on the beamlines at the Stanford Synchrotron Radiation Laboratory.

Wisconsin: This project has focused on scheduling applications to improve throughput on an array of NMR spectrometers.

UCSF: This project has extended earlier work on the Chimera application to make it collaborative. Chimera is a tool for creating graphical visualizations of molecules. The collaborative version incorporates mechanisms for shared pointing to recreate important features of colocated collaboration in a virtual context.

UIUC: This project has focused on the development of the Biological Collaborative Research Environment (BioCoRE), a collaborative work environment for biomedical research, research management, and training. BioCoRE offers scientists, working together or alone, a seamless interface to a broad range of local and remote technologies such as supercomputer centers, discipline-specific and general tools, data, and visualization solutions.
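
To make the idea of a seamless interface to local and remote technologies concrete, the sketch below shows, in Python, a toy resource descriptor that accepts the same command whether the work runs on the local workstation or on a remote machine reached over ssh. This is purely illustrative and is not BioCoRE code; the class, host name, and command are hypothetical placeholders.

    # Illustrative sketch only; not BioCoRE code. Names and hosts are hypothetical.
    import subprocess
    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class ComputeResource:
        """A compute resource addressed the same way whether local or remote."""
        name: str
        ssh_host: Optional[str] = None  # None means run on the local machine

        def run(self, command: List[str]) -> str:
            """Run a command on this resource and return its standard output."""
            if self.ssh_host is None:
                full_cmd = command
            else:
                # Remote execution via ssh; a production environment would also
                # handle authentication, file staging, and batch scheduling.
                full_cmd = ["ssh", self.ssh_host] + command
            return subprocess.run(full_cmd, capture_output=True, text=True,
                                  check=True).stdout

    if __name__ == "__main__":
        local = ComputeResource("workstation")
        print(local.run(["uname", "-a"]))
        # A remote resource would be used through the identical call, for example:
        # remote = ComputeResource("supercomputer", ssh_host="login.example-center.edu")
        # print(remote.run(["uname", "-a"]))

The point of the sketch is that the calling code does not change when work moves from a local tool to a remote center; only the resource descriptor does.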

PSC: This project has explored the use of collaborative tools to support distributed software development, including NetMeeting sessions for distributed code reviews, CVS for revision control, and PSC's web browser for visualizing remote data sets. The project also developed DocShare, which provides a shared virtual workspace to distributed users on diverse platforms via standard web browsers.
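
As a hedged illustration of the workflow such tools support, the Python sketch below stages a distributed code review by writing a unified diff between two CVS tags into a shared directory that remote reviewers can read. The working-copy path, tag names, and shared-workspace location are hypothetical placeholders, and the sketch is not part of the PSC tool set.

    # Illustrative sketch only; not part of the PSC collaboratory tools.
    # Paths and tag names are hypothetical placeholders.
    import datetime
    import pathlib
    import subprocess

    def stage_review(working_copy: str, old_tag: str, new_tag: str,
                     shared_dir: str) -> pathlib.Path:
        """Write a unified diff between two CVS tags into a shared workspace."""
        # 'cvs diff' exits nonzero when differences exist, so check=True is not used.
        result = subprocess.run(
            ["cvs", "-q", "diff", "-u", "-r", old_tag, "-r", new_tag],
            cwd=working_copy, capture_output=True, text=True)
        stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M")
        out_path = pathlib.Path(shared_dir) / f"review-{old_tag}-vs-{new_tag}-{stamp}.diff"
        out_path.write_text(result.stdout)
        return out_path

    if __name__ == "__main__":
        diff_file = stage_review("/home/dev/checkout/mytool", "REL_1_0", "REL_1_1",
                                 "/shared/reviews")
        print(f"Diff staged for review at {diff_file}")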


BioCoRE: A Collaboratory for Structural Biology, University of Illinois at Urbana-Champaign

Bhandarkar, Milind; Budescu, Gila; Humphrey, William F.; Izaguirre, Jesus A.; Izrailev, Sergei; Kalé, Laxmikant V.; Kosztin, Dorina; Molnar, Ferenc; Phillips, James C.; and Schulten, Klaus. "BioCoRE: A Collaboratory for Structural Biology." In Bruzzone, Agostino G.; Uhrmacher, Adelinde; and Page, Ernest H., editors, Proceedings of the SCS International Conference on Web-Based Modeling and Simulation, pp. 242-251, San Francisco, California, 1999.

Additional information can be found at: http://www.ks.uiuc.edu/Research/biocore.


Collaborative NanoManipulator, University of North Carolina at Chapel Hill

Sonnenwald, D.H.; Bergquist, R.; Maglaughlin, K.A.; Kupstas-Soo, E.; and Whitton, M. (2001). Designing to support collaborative scientific research across distances: The NanoManipulator example. In Churchill, E.; Snowdon, D.; Munro, A. (Eds.), Collaborative Virtual Environments (pp. 202-224). London: Springer Verlag.

Sonnenwald, D.H.; Maglaughlin, K.L.; Whitton, M.C. (June 2001). Using innovation diffusion theory to guide collaboration technology evaluation: Work in progress. 10th IEEE International Workshops on Enabling Technologies: Infrastructure for Collaborative Enterprises (WET ICE). NY: IEEE Press. 7 manuscript pages.

Hudson, T.; Sonnenwald, D.H.; Maglaughlin, K.; Whitton, M.C.; Bergquist, R. (2000). Enabling Distributed Collaborative Science. ACM 2000 Conference on Computer Supported Cooperative Work: Video Program.

Additional information can be found at: http://www.cs.unc.edu/Research/nano/index.html.


Collaboratories for Biomedical Research Software Development, Pittsburgh Supercomputing Center

Natrajan, A.; Crowley, M.; Wilkins-Diehr, N.; Humphrey, M.A.; Fox, A.D.; Grimshaw, A.S.; and Brooks, C.L., III. Studying Protein Folding on the Grid: Experiences Using Chemistry at HARvard Molecular Mechanics (CHARMM) on National Partnership for Advanced Computational Infrastructure (NPACI) Resources Under Legion. Keynote speech, 10th IEEE International Symposium on High-Performance Distributed Computing (HPDC), August 2001.

Gadd, C. (2002). Supporting Loose Confederacies of Collaboration: Experiences from Computational Biology and 3-D Imaging Collaboratories. Proceedings of the American Medical Informatics Association 2002 Annual Symposium, San Antonio, Texas.

Durka-Pelok, G.; Pomerantz, S.; Gadd, C.; Weymouth, T.; Gest, T.; Huang, J.; Nave, D.; Wetzel, A.; Lee, S.; and Athey, B. (2002). Evaluation of a Volume Browser: PSC-VB. Fourth Visible Human Conference, Keystone, Colorado.

Wetzel, A.W.; Pomerantz, S.M.; Nave, D.; Kar, A.; Sommerfield, J.; Mathis, M.; Deerfield, D.W., II; Bookstein, F.L.; Green, W.D.; Ade, A.; and Athey, B.D. (2002). A Networked Environment for Interactively Viewing and Manipulating Visible Human Data Sets. Fourth Visible Human Conference, Keystone, Colorado.

Additional information can be found at: http://collaboratory.psc.edu/.


Microstructure Image-based Collaboratory, San Diego Supercomputer Center

Additional information can be found at: http://ncmir.ucsd.edu/MIBC/.


3-D Image Visualization & Manipulation Collaboratory, University of California, San Francisco

Additional information can be found at: http://www.cgl.ucsf.edu/Research/collaboratory.


Collaboratory Testbed for Macromolecular Crystallography, Stanford Synchrotron Radiation Laboratory

Additional information can be found at: http://smb.slac.stanford.edu.


High-Resolution Biological NMR Spectroscopy, University of Wisconsin

Zolnai, Zs.; Lee, P.T.; Westler, W.M.; Volkman, B.F.; Livny, M.; and Markley, J.L., "SESAME - an Experiment Management System for NMR Spectrometers," Frontiers of NMR in Molecular Biology, Breckenridge, Colorado, January 9-15, 1999.

Lee, P.T.; Westler, W.M.; Chapman, M.R.; Markley, J.L.; and Zolnai, Zs., "SESAME - an Experiment Management System for NMR Spectrometers," 40th Experimental NMR Conference, Orlando, Florida, February 28 - March 5, 1999.

Lee, P.T.; Li, J.; Chapman, M.R.; Volkman, B.F.; Chae, Y.K.; Markley, J.L.; and Zolnai, Zs., "SESAME - an Experiment Management System for NMR Spectrometers," 41st Experimental NMR Conference, Asilomar, California, April 9-14, 2000.

Additional information can be found at: http://kamba.nmrfam.wisc.edu/Sesame/.

For further information, contact:

Director, Division for Biomedical Technology Research and Research Resources
National Center for Research Resources
National Institutes of Health
One Democracy Plaza, Room 962
6701 Democracy Boulevard, MSC 4874
Bethesda, Maryland 20892-4874
Telephone: 301-435-0755
FAX: 301-480-3659
e-mail: BTADIR@mail.nih.gov
