
Information Management at NTL: History, Evolution, and Insight

Issue: Spring 2006

- Mark Servilla (LNO)

Long-Term Dynamics of Lakes in the Landscape: Long-Term Ecological Research on North Temperate Lakes (J. J. Magnuson, T. K. Kratz, and B. J. Benson, editors). Oxford University Press, New York. 2006.

The recent 2006 publication of Long-Term Dynamics of Lakes in the Landscape: Long-Term Ecological Research on North Temperate Lakes by Oxford University Press includes a chapter on "Breaking the Data Barrier: Research Facilitation through Information Management." The chapter is organized around three central questions:

  1. "How have information technology and management changed the way we do science?"
  2. "What have been the pivotal decisions and principles that have shaped information management?"
  3. "What challenges does the future appear to hold for information management at North Temperate Lakes LTER?"

Analysis of the first question establishes the motivation for solid information management, beginning with an anecdote in which a study using historical research data (circa 1930) on lake pH had to contend with a lack of documentation for the analytical methods. Recreating the methodology required access to an oral history provided by members of the early research team - not the preferred method of documentation! The need for data and metadata standards, coupled with the rapid advances in computer technology since 1980 (the start of NTL), led to the development of an information management infrastructure that promotes a long-term legacy of information and supports interdisciplinary research among NTL and other LTER researchers. The analysis also presents a broader vision for a network-wide information system that provides centralized discovery of, and access to, distributed data by utilizing the latest tools that take advantage of semantic information embedded in metadata.

The second question explores key decisions made with respect to NTL's information management philosophy. At the crux of this issue, NTL principal investigators decided from the beginning that core data would be collected and managed centrally - that is, central resources and site-wide standards for data collection, documentation, and management were put in place at the start of the NTL LTER project. This strategy reinforced the development of information technology at NTL by defining two primary goals for the design of their information system:

  1. "To create a powerful and accessible environment for the retrieval of information that facilitates linkages among diverse data sets"
  2. "To maintain database integrity."

To achieve these goals, NTL utilized off-the-shelf software whenever possible to ensure industry standards and quality, while mitigating dependence on tailor-made or in-house solutions, which are often rendered useless when the development team is no longer available for support. To streamline data management, NTL is also developing an automated QA/QC process that will significantly reduce the manual review of data from the sensor network. NTL recognized early on the need for an information management infrastructure that integrates computational science with field- and laboratory-based ecology. This is especially true because information management at NTL employed both database and Internet technology for data archiving and access, and the ongoing collection of geospatial data required extensive storage solutions and expertise in both GIS applications and remote sensing imagery analysis. As such, NTL required a dual-role information manager - part ecologist and part information technologist.
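The chapter does not describe that QA/QC process in detail. As a minimal sketch of what an automated range-check pass over buoy sensor data might look like, the Python code below flags readings that fall outside expected bounds; the column names, acceptable ranges, and file layout are illustrative assumptions, not NTL's actual implementation.

    # Minimal sketch of an automated range-check QA/QC pass for buoy sensor data.
    # Column names, acceptable ranges, and the CSV layout are illustrative
    # assumptions, not NTL's actual implementation.

    import csv

    # Hypothetical acceptable ranges for a few common limnological variables.
    RANGES = {
        "water_temp_c": (-1.0, 40.0),    # degrees Celsius
        "dissolved_o2_mgl": (0.0, 20.0), # milligrams per liter
        "ph": (4.0, 11.0),
    }

    def flag_out_of_range(rows):
        """Yield (row_number, variable, value) for readings outside expected bounds."""
        for i, row in enumerate(rows, start=1):
            for var, (lo, hi) in RANGES.items():
                try:
                    value = float(row[var])
                except (KeyError, ValueError):
                    continue  # missing or non-numeric value; handled by a separate check
                if not (lo <= value <= hi):
                    yield i, var, value

    if __name__ == "__main__":
        with open("buoy_readings.csv", newline="") as f:
            for rownum, var, value in flag_out_of_range(csv.DictReader(f)):
                print(f"row {rownum}: {var}={value} outside expected range")

Flagged rows would still go to a human reviewer; the point of such automation is to narrow the manual review to the small fraction of readings that fail the checks.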

The third question addresses the future challenges of information management at NTL. Specifically, NTL faces both an increase in data volume from continuous monitoring by buoyed sensors and the vast array of Earth observation imagery sensors, and new types of data generated by the social scientists who are now recognized as fulfilling an important role in long-term ecological research. These diversified data holdings, together with other distributed data sets, will act as the raw inputs for synthetically derived products that are only now being considered. For this reason, NTL sees the importance of educating new and emerging information managers through cross-disciplinary programs that can provide the necessary foundation to link information technology with the ecological sciences.

From the challenges at the inception of the LTER program to a perspective on the future, this chapter provides a critical historical review of the development of the NTL information management infrastructure and can be viewed as a testament to the benefits of a planned information management system. As the Lead Scientist for the LTER Network Information System, I highly recommend this chapter as a "Good Read".