
Managing Bibliographic Citations for LTER

Issue: Spring 2014

John Porter (VCR)

Publications are the lifeblood of science and of academia in general, so keeping track of publications related to an LTER site is critical. The Virginia Coast Reserve (VCR) LTER has for some time used an aging version of EndNote, but we were curious whether something better was available. So we sent out a query to LTER Information Managers asking what they were using.

First, we defined what we thought a “perfect” system would look like:

  1. Ingestion
    • automated citation ingestion given a DOI
    • easy to import citations from other sources (e.g., Web of Science, Google Scholar)
    • easy to cut-and-paste citations emailed to us
  2. Display
    • easy to produce, up-to-date attractive web display
    • optional sorting criteria (author, title, year)
    • simple search
    • advanced search (field specific) (author, title, year, keyword)
    • filter by type
    • numbered display (to make counting easy)
    • includes place to link to PDFs of papers
    • includes place to link to associated datasets
    • clickable to display abstracts etc. (if available)
  3. Export
    • needs to be able to produce an EndNote Export file for importing into the LTERNet Cross-Site Bibliography
    • individual or lists of citations to standard citation exchange formats suitable for input into other systems
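The first wish-list item, automated citation ingestion given a DOI, is quite feasible today: the doi.org resolvers support content negotiation, so requesting a DOI with an `Accept: application/x-bibtex` header returns a ready-made BibTeX record. The sketch below illustrates the idea; the function names and the tiny field extractor are our own, not part of any site's actual system, and the fetch step of course requires network access.

```python
"""Sketch: automated citation ingestion given a DOI (wish-list item 1)."""
import re
import urllib.request


def fetch_bibtex(doi: str) -> str:
    """Fetch a BibTeX record for a DOI via content negotiation (needs network)."""
    req = urllib.request.Request(
        f"https://doi.org/{doi}",
        headers={"Accept": "application/x-bibtex"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")


def parse_bibtex_fields(record: str) -> dict:
    """Minimal BibTeX field extractor -- enough for title/author/year."""
    fields = {}
    for key, value in re.findall(r'(\w+)\s*=\s*[{"]([^}"]*)[}"]', record):
        fields[key.lower()] = value
    return fields


# A hypothetical record, standing in for what fetch_bibtex() would return:
sample = """@article{porter2014,
  author = {Porter, John},
  title = {Managing Bibliographic Citations},
  year = {2014}
}"""
print(parse_bibtex_fields(sample)["year"])  # -> 2014
```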

Here is a summary of the responses we received.

Corinna Gries at the North Temperate Lakes LTER reports that they also use EndNote for ingestion of bibliographic data, because it exports easily to both Drupal and the LTERNet Cross-Site Bibliography. Drupal provides many of the sorting, searching, and querying functions for general users.

Margaret O’Brien at the Santa Barbara Coastal LTER reports that they use Ecological Metadata Language (EML) for maintaining citations. The additionalMetadata element contains management-related information, including related dataset ids, the agency reported to, the project name/code and report year, and Booleans for completeness, online status, and acknowledgments. Stylesheets handle sorting and filtering for the web, and transformation to EndNote Export format for importing into the LTERNet Cross-Site Bibliography. Although this is adequate, she is in the process of migrating the bibliographic data to a PostgreSQL database to better facilitate queries for NSF reports and cross-links between people, research projects, and datasets. She will keep the citation itself in EML, using the PostgreSQL XML data type, so existing XSL templates can still be applied. She would definitely like to see something that could automatically export citations for Research.gov (a feature missed on the list of “perfect” features above)!
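To make the EML approach concrete, here is a small sketch of pulling management fields out of an additionalMetadata block. EML leaves that section free-form, so the element names below are invented for illustration; SBC's actual internal schema may differ.

```python
"""Sketch: extracting management fields from an EML additionalMetadata block.
The citationManagement structure here is hypothetical."""
import xml.etree.ElementTree as ET

eml = """<eml>
  <additionalMetadata>
    <metadata>
      <citationManagement>
        <relatedDatasetId>knb-lter-sbc.17</relatedDatasetId>
        <reportYear>2013</reportYear>
        <complete>true</complete>
        <online>true</online>
      </citationManagement>
    </metadata>
  </additionalMetadata>
</eml>"""

root = ET.fromstring(eml)
mgmt = root.find("./additionalMetadata/metadata/citationManagement")
record = {child.tag: child.text for child in mgmt}
print(record["relatedDatasetId"])  # -> knb-lter-sbc.17
```

In production these fields would feed XSL transforms (or, after the migration, PostgreSQL XML queries) rather than a Python dict, but the extraction logic is the same.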

She also noted that it would be useful to discuss various policy issues, such as what constitutes a journal paper (i.e., the level of peer review) and the criteria for deciding whether a publication is related to a site.

James Connors at the Palmer and California Current LTER sites reports using EndNote for basic bibliographic management, but they have a set of Python scripts that parse an EndNote Export file into a custom schema within their relational database. That database is then used to feed their web page. Using a relational database lets them easily link citations with personnel and the data catalog, while maintaining the original data in EndNote facilitates updates of the Cross-Site Bibliography.
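EndNote's Export style emits tagged, Refer-style records (%0 reference type, %A author, %T title, %D year, and so on) separated by blank lines, which makes them straightforward to parse. The sketch below is in the spirit of the scripts described above, but it is our own illustration, not the actual PAL/CCE code, and the sample records are invented.

```python
"""Sketch: parsing an EndNote Export (tagged Refer-style) file into records."""

def parse_endnote_export(text: str) -> list[dict]:
    """Split %-tagged records on blank lines; repeated tags (e.g. %A) become lists."""
    records = []
    for chunk in text.strip().split("\n\n"):
        record = {}
        for line in chunk.splitlines():
            if len(line) > 2 and line.startswith("%"):
                tag, value = line[:2], line[3:].strip()
                record.setdefault(tag, []).append(value)
        if record:
            records.append(record)
    return records


sample = """%0 Journal Article
%A Porter, John H.
%T A citation example
%D 2014
%J Hypothetical Journal

%0 Book
%A Doe, Jane
%T Another example
%D 2013"""

refs = parse_endnote_export(sample)
print(len(refs), refs[0]["%D"][0])  # -> 2 2014
```

From here, inserting each record into a custom relational schema is a routine step with any database driver.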

Finally, Jason Downing at the Bonanza Creek LTER site uses a custom MySQL schema to manage publications as part of their “meta-database,” which, as at the Palmer and California Current sites, allows bibliographic data to be easily linked to personnel and data packages. They use a ColdFusion service to dynamically format the data for web display, and simple PHP-based forms to provide a web interface for entering references, with additional cleanups applied directly within the database. They also have scripts for generating EndNote Export files for participation in the LTERNet Cross-Site Bibliography.

As you can see, a wide range of approaches is used within the LTER Network. However, there are some commonalities. One is that input to the bibliographic database flows primarily through a single portal (entry into a single EndNote database, or directly into EML), indicating that the Information Manager or other designated project personnel are responsible for data entry. Presumably this is because of the accuracy and duplication issues that arise when data come in directly from investigators via web forms. This mirrors our experience at the VCR LTER: we formerly used online forms to solicit publications from investigators, but soon found that many duplicate, or worse yet, near-duplicate, versions of citations accumulated in the database. The time required to clean up the citations exceeded the time required to input them consistently and correctly in the first place.

Another commonality is that bibliographies needed to be transformed into different versions for different uses (e.g., web display, reporting, and inclusion in the LTERNet Cross-Site Bibliography). A variety of tools were used for this, from content management systems to hand-written scripts.

At the VCR LTER we ultimately chose to make minor modifications to our old system. We updated our version of EndNote so that we could use the new version's expanded and improved features for linking to PDFs, generating XML output, and citing electronic publications. We added a new management module to our Drupal installation that allows us to rapidly, and completely, remove all existing publications (one-by-one or even screen-by-screen removal of publications is unwieldy) so that a fresh, “clean” import from EndNote can be performed. We use the unique Accession Number field to hold a “key” that links publications in the relational citation database created by Drupal with other data (e.g., personnel and datasets) in our relational database.
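The key-based linkage amounts to an ordinary relational join on the shared accession number. The sketch below uses SQLite to illustrate the idea; the table and column names, and the sample accession and dataset identifiers, are invented for illustration and do not reflect VCR's actual schema.

```python
"""Sketch: joining a citation table to datasets via an accession-number key.
All names and identifiers here are hypothetical."""
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE publication (accession TEXT PRIMARY KEY, title TEXT);
CREATE TABLE dataset_link (accession TEXT, dataset_id TEXT);
INSERT INTO publication VALUES ('VCR14001', 'A coastal study');
INSERT INTO dataset_link VALUES ('VCR14001', 'VCR.DATA.123');
""")

# The same key stored in EndNote's Accession Number field joins the
# Drupal-created citation table to personnel and dataset tables.
row = db.execute("""
    SELECT p.title, d.dataset_id
    FROM publication p JOIN dataset_link d USING (accession)
""").fetchone()
print(row)  # -> ('A coastal study', 'VCR.DATA.123')
```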