
Spring 1992

Spring is here, and so is Databits!

This issue features news from the sites, new ways to access LTERNET, some handy data manipulation utilities, a description of a GPS campaign and much more.

Featured Articles


cdrlib : a library of functions for scientific data manipulations.

Scientific data undergoes many transformations before it appears in a paper as figures, tables and text. Each of these elements is produced with one or more software applications. A library of C functions (cdrlib) was developed at Cedar Creek to expedite this task and to make data management for multi-investigator research projects easier. A survey conducted in 1991 to see which operating systems and applications are widely used at LTER sites showed that at most sites scientists use at least two operating systems and two or three applications to present their data in publishable form. The library cdrlib may prove useful in these settings, especially in a network environment.

The problem is that no single software or hardware product offers a complete solution for data management, data analysis and data presentation. Many LTER researchers are more comfortable with one application than another, yet they find themselves needing to share data that may have been generated with an application they are not familiar with, and perhaps on a different hardware platform.

Some researchers may need to alternate between several applications before reaching the final product. For example, a user who is familiar with the SAS package on an IBM PC may need to learn some UNIX commands in order to extract and manipulate the data on the Sun. A user who has Cricket Graph (a Macintosh application) files and wishes to run more elaborate statistics in SAS will need to convert the files to ASCII and go through a number of steps to create SAS data sets. Once the data is tabulated, the results may need to be presented graphically with yet another package, such as StatView. To complicate this example further, imagine that a site had to switch from one database to another and from one computer architecture to another in search of better hardware and software to meet its growing needs.

All of these steps are inefficient and prone to errors. Utilities that convert data from one format to another make it easier for scientists to analyze data and to use other researchers' intermediate files. Cdrlib can be helpful when transferring files from one site to another, or when researchers use different applications for their data. Cdrlib can also convert one database to another with minimal programming. Here are a few functions and test drivers that users may find very helpful:

Reformat:

----------

A utility that reformats ASCII files without programming. It's like a small version of awk but is tailored for our data.

Example : input file "f1" contains the following data:

1 2.56 up 2
33 45 down 12
44 1 left 234
...

This can be reformatted in many ways:

reformat -i f1 -o f2 -f "%2d|%6.1f|%5s| name1 20 %3d"

produces a new file "f2" that has the following content:

1| 2.6|up | name1 20 2
33| 45.0|down | name1 20 12
44| 1.0|left | name1 20 234

...

Note that the value of 2.56 was rounded to 2.6 because we specified %6.1f. Also note that any additional characters can be added to the file without resorting to a word processor or a programming language, a feature that some SQL users may find appealing. If no format string is passed, reformat simply determines the data types (integers, reals, etc.), then reformats the data according to the maximum field widths over all tuples.

This utility may be useful for users who need to convert data into the format expected by a stubborn application. It can also add separators to a file that is being converted into a database table.
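As a rough illustration of the default behavior described above (this is a sketch of the idea only, not the cdrlib source), a single pass over the file can classify each column and record the widest value seen, after which a second pass would rewrite the rows:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAXCOLS 32

/* Classify one token: 0 = integer, 1 = real, 2 = character string. */
static int token_type(const char *s)
{
    char *end;
    strtol(s, &end, 10);
    if (*end == '\0') return 0;
    strtod(s, &end);
    if (*end == '\0') return 1;
    return 2;
}

int main(void)
{
    char line[1024];
    int type[MAXCOLS] = {0}, width[MAXCOLS] = {0};
    int col;
    char *tok;

    /* First pass over stdin: promote each column's type (int < real < char)
       and remember the maximum field width seen in that column. */
    while (fgets(line, sizeof line, stdin)) {
        col = 0;
        for (tok = strtok(line, " \t\n"); tok != NULL && col < MAXCOLS;
             tok = strtok(NULL, " \t\n"), col++) {
            int t = token_type(tok), w = (int)strlen(tok);
            if (t > type[col]) type[col] = t;
            if (w > width[col]) width[col] = w;
        }
    }

    /* A second pass (omitted here) would reread the file and print every
       field padded to width[col], using %d, %f or %s as type[col] dictates. */
    for (col = 0; col < MAXCOLS && width[col] > 0; col++)
        printf("column %d: %s, width %d\n", col + 1,
               type[col] == 0 ? "int" : type[col] == 1 ? "float" : "char",
               width[col]);
    return 0;
}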

Header files:

-----------------

A function that documents ASCII files and reformats them so that documentation and data reside in a single file (as Walt Conley explained at the data managers' meeting in Snowbird, Utah in 1990) is important, since in most cases the original raw data are in flat ASCII files. These data can be automatically documented and put into our final format by a call to one of the following utilities:

make_hdr -i f1 -o f2 -l "plot biomass side nonsense key
stage" -d dictionary.89 -r docfile

This will take files f1, dictionary.89 and docfile as inputs (where dictionary.89 is a data dictionary and docfile is a documentation file) and outputs file "f2" that has the following format:

## File : f2 :
##
## Column1: plot : experimental unit (2mx2m) : int(%3d):
## Column2: biomass: Root biomass (g/m2) : float(%7.2f):
## Column3: side : Side of Marker : char(%5s):
## Column4: nonsens: No such thing : char(%5s):
## Column5: key : Plant key ID : int(%3d):
## Column6: stage : Plant Stage code : int(%4d):
##
## Other info:
##
## data:
##
1 2.56 up name1 20 2
33 45.00 down name1 20 2
44 1.00 left name1 20 234
...

There are utilities that will take files in this format, or in any other format such as that of the file "f1" we started with, and convert them to StatView, Cricket Graph, DIF (Data Interchange Format), SAS, INFORMIX files, etc. SAS does not support DIF files on Suns, so this method may be used to move large data sets between applications or simply to add documentation to ASCII files.
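To give a feel for why the embedded header is convenient, here is a minimal sketch (not one of the cdrlib utilities) of how a converter can recover the column definitions from the "## ColumnN:" lines before the data section that follows "## data:":

#include <stdio.h>
#include <string.h>

/* Read a self-documented file on stdin: report the column definitions
   found in "## ColumnN: name : description : type(format):" lines,
   then count the data rows that follow the "## data:" line. */
int main(void)
{
    char line[512], name[64], desc[128], fmt[32];
    int in_data = 0, rows = 0;

    while (fgets(line, sizeof line, stdin)) {
        if (!in_data && strncmp(line, "## Column", 9) == 0) {
            if (sscanf(line, "## Column%*d: %63[^:]: %127[^:]: %31[^:]:",
                       name, desc, fmt) == 3)
                printf("column '%s' (%s) stored as %s\n", name, desc, fmt);
        } else if (strncmp(line, "## data:", 8) == 0) {
            in_data = 1;              /* everything after this line is data */
        } else if (in_data && line[0] != '#' && line[0] != '\n') {
            rows++;
        }
    }
    printf("%d data rows\n", rows);
    return 0;
}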

Using these utilities from shell scripts on UNIX systems or batch files on DOS systems can decrease the amount of time required to port or reshape data sets. Nonprogrammers can use the utilities directly; programmers can link the functions from the library into their own applications, or modify them to suit their needs.

These utilities are bundled in a library (cdrlib) that is available through anonymous ftp. If you need more information or have any suggestions please send email to: aelhaddi@lternet.washington.edu.

-- A. Elhaddi and T. Nguyen, Institute of Technology and Cedar Creek LTER : University of Minnesota

Optical Disk Management on a Net

Read-write optical disks are a great way to stretch your disk space, but they can cause complications when they are accessed by several different computers. We operate our Sun computers as a network, and up to six computers may have access to our disks at any time. For the "fixed" disks this is no problem: we simply mount all the disks using the Network File System (NFS). For the optical disks it is a problem, for two reasons. The first is that we would like individual users to be able to change the disk, but normally only 'root' is allowed to mount or unmount disks. The second is that it's important that a disk not be removed while it is in use by another computer, but UNIX does not communicate to the host computer information about the status of NFS mounts.

To solve these problems, we wrote a set of C programs that take care of mounting and unmounting our Pinnacle optical disk, provide access to the disk for all users and protect against abrupt disk changes. The structure of the C programs is simple: they use the 'setuid' function to set the user to 'root', then run a shell script to do the actual manipulations. Why not use just the shell script? Basically, because shell scripts that run with 'root' privileges can constitute a security hole. After compilation we run the command 'chmod 4755 mountopt' as root to give the executable file the proper permissions. Here is the program for mounting a disk (note: in all program listings, lines have been wrapped to fit within the margins; everything inside a quoted string should be on one (long!) line):

#include <stdio.h>
main()
{ setuid(0);
execl ("/bin/sh", "sh", "-c",
"echo 'mounting on delmarva';rsh delmarva /usr/etc/mount -o
nosuid /dev/reo0c /opt; rsh delmarva /usr/etc/exportfs -a;echo 'mounting
on amazon';rsh amazon
/delmarva.usr/evsc/optical/reo/local/client/mountopt;echo 'mounting on
atlantic';rsh atlantic
/delmarva.usr/evsc/optical/reo/local/client/mountopt;",(char*)0);}

In the above example, the workstation "delmarva" is where the optical disk is physically connected. "amazon" and "atlantic" are client machines that use the optical disk via NFS. All the programs are stored under /delmarva.usr/evsc/optical/reo, which is on a conventional disk NFS-mounted on all machines. The "client" mountopt program does the mounting on individual workstations using NFS:

#include <stdio.h>
main()
{ execl ("/bin/sh", "sh", "-c",
"/usr/etc/mount -t nfs -o nosuid delmarva.evsc.Virginia.EDU:/opt
/opt ",(char*)0); }

Unmounting is more complicated because the disk status on each workstation must be checked. This is done using a file "umount.out" that stores the messages from individual attempts to unmount the disk. If the word "busy" shows up in any of them, the disk is remounted on all machines. Here is the "master" umountopt program:

#include <stdio.h>
main()
{ setuid(0);
execl ("/bin/sh", "sh", "-c",
"rsh delmarva 'echo
>/delmarva.usr/evsc/optical/reo/local/umount.out;chmod 766
/delmarva.usr/evsc/optical/reo/local/umount.out';echo 'unmounting
remote systems'; echo 'amazon';rsh amazon
/delmarva.usr/evsc/optical/reo/local/client/umountopt; echo 'atlantic';rsh
atlantic /delmarva.usr/evsc/optical/reo/local/client/umountopt; ifok=`grep
busy /delmarva.usr/evsc/optical/reo/local/umount.out`; if [ -z \"$ifok\" ];
then echo 'unmounting from delmarva';rsh delmarva
/delmarva.usr/evsc/optical/reo/local/client/umountopt; fi; ifok=`grep busy
/delmarva.usr/evsc/optical/reo/local/umount.out`; if [ -z \"$ifok\" ]; then
echo 'Done unmounting. You may remove the optical platter'; fi; if [ -n
\"$ifok\" ]; then echo 'Optical disk in use. Could not unmount. Try again
later.';echo ' ';echo 'Remounting optical
disk';/delmarva.usr/evsc/optical/reo/Bin/mountopt; fi",(char*)0); }

and here is the "client" version of the program.

#include <stdio.h>
main()
{ execl ("/bin/sh", "sh", "-c",
"/usr/etc/umount /opt 2>&1 |tee -a
/delmarva.usr/evsc/optical/reo/local/umount.out",(char*)0) ; }

These programs are far from elegant, but seem to do the job. Our users are able to change disks themselves, and can't mess up other users by doing so.
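If more scripts of this kind accumulate, the same pattern could be collapsed into a single generic wrapper. The sketch below is only a suggestion (the wrapper.sh path is invented for illustration), with the script path fixed at compile time so that the setuid binary never executes a caller-supplied command:

#include <stdio.h>
#include <unistd.h>

/* Hypothetical generic wrapper: the script path is a compile-time constant,
   never an argument, so a setuid-root binary cannot be tricked into running
   an arbitrary command. Install with 'chmod 4755' as root, as above. */
#define SCRIPT "/delmarva.usr/evsc/optical/reo/local/wrapper.sh"

int main(void)
{
    if (setuid(0) != 0) {
        perror("setuid");            /* not installed setuid root */
        return 1;
    }
    execl("/bin/sh", "sh", SCRIPT, (char *)0);
    perror("execl");                 /* reached only if the exec fails */
    return 1;
}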

-- John Porter, Virginia Coast Reserve LTER

GIS (GPS?) Corner

Tides and Satellites: Using a high-resolution Global Positioning System

The Virginia Coast Reserve LTER is nearing the end of the month-long data collection phase of its first Global Positioning System campaign. It is an ambitious campaign using navigational satellites and the LTER network's high-resolution GPS receivers to create a network of 50 horizontal and vertical benchmarks over an area of 250 square kilometers. Such a network is important to the VCR/LTER because it will allow us to relate study sites on different islands or the mainland sites to one another. This is especially critical for vertical control because so much of our site is subject to inundation by tides and storms.

During the course of our campaign, we've learned a lot about what goes into successful GPS work -- some of it the hard way! The most critical aspects we found are: planning, outreach, personnel and ongoing assessment.

Planning

Planning a campaign sounds simple, but in a tidally dominated site, it is not! Things that must be considered are: hours of satellite availability, accessibility and geometry of sites, personnel availability, logistics, condition of batteries and weather. Some of these can be considered in advance. Others (particularly weather) require that a high level of flexibility be retained.

Our planning began with compiling a list of sites of interest to our researchers. These sites were then prioritized and plotted on a map. We added numerous previously existing benchmarks so that our data could be tied in to a larger, existing framework. Where pre-existing benchmarks were lacking (about half our sites), new benchmarks were created by driving a stainless steel rod into the ground until refusal (typically 10-15 m in this sandy substrate) and protecting it with concrete around a PVC pipe. The monuments were constructed with the vertical rod independent of the horizontal control (a cement kickblock). The PVC sleeve lets the cement slide down the vertical rod, so that when the elevation of the surrounding sand drops because of erosion, the weight of the cement does not put pressure on the rod and entice it to "fall", which would lose the vertical control.

We initially started with a fixed schedule that projected several weeks into the future. This was abandoned almost immediately: bad weather, poor satellite reception, changes in the condition of the satellite network due to maintenance, and equipment failures all did their part to doom the initial schedule. Instead we've moved to a long-range schedule that emphasizes prioritization of sites, coupled with a flexible short-range schedule. Several SAS programs were written to produce an overall schedule and individual receiver worksheets (detailing station names, start and stop times) from a centralized database that could be changed at a moment's notice if logistics demanded abandoning our initial plans.

Developing consistent names for sites also became a priority. We initially used a 4-letter ID for each site, with a longer, more descriptive site name. However, as we quickly learned, some parts of the Trimble GPS processing software use the 4-letter ID and some use the longer name. The long names tended to cause problems: they were unwieldy to enter into the GPS receivers, and errors (omitted spaces, variant spellings) were all too common.

The 4-letter IDs also caused some problems. Computed baselines were named based on the last two letters of each of the component stations and keeping these unique posed a problem. Ultimately, we shifted to a scheme where a unique site number (0001 - 0050) was used as the 4-letter ID and our initial 4-letter site code was used as the longer site name. This system worked quite well and greatly reduced problems in post-processing the data.

Outreach

The VCR/LTER GPS campaign has generated significant interest among local government agencies. The Virginia Marine Resources Commission (VMRC) was so interested that it donated the services of a trained surveyor for the entire month-long campaign. Hank Badger, the surveyor in question, has been an invaluable addition, both because of his detailed knowledge of existing local monuments and his wealth of knowledge about surveying in general. In addition, VMRC arranged to have several additional GPS units made available to us for a week. Additional units greatly speed the development of a GPS network because the number of baselines grows combinatorially: n receivers observing together yield n(n-1)/2 baselines per session, so three units generate three baselines while five units generate ten in that same session! VMRC also cooperated with us in hiring a locally based GPS consultant to help with the planning and processing of the GPS data. Additionally, the local NOAA office contributed several hundred feet of stainless steel rods for benchmarks and the tools to drive them. UNAVCO also provided excellent support at the national level, sending Barb Perrin to us for several days of training at the start of the campaign and providing advice and material support.

Personnel

Ideally, we should have a stable staff of 10 for conducting the campaign, consisting of a survey director, a data manager/processor, an equipment supervisor, a boat operator and two people to work with each GPS unit. Unfortunately, the stable staff consists of 4-1/2, with usually two other "short-time" volunteers. This creates significant problems with fatigue and makes planning and standardization of procedures difficult. We have tried to compensate for the lack of time to hold daily coordination sessions by producing the individual receiver worksheets mentioned above. These are especially critical during times of intensive work where up to three sessions may be scheduled in a single day (or should that be night -- our window of satellite availability extends from 7 PM to noon!).

Ongoing Assessment

We have found that a rapid assessment of each day's results is a critical element of our campaign. Just because a receiver was set up properly at a site and collected data for an appropriate period is no guarantee that the data will be "good." Some aspects of the GPS system are beyond our control: atmospheric disturbances, local barriers to satellite reception, changes in the amount of error purposely introduced by the Department of Defense as part of "selective availability" and Murphy's Law (as cited in the Trimble documentation!) can all lead to data with significant problems. Processing the data on a daily basis gives us a way to detect these problems and go back to reoccupy "problem" stations. This requires a good, fast PC (a 386/33 or better) with a numeric coprocessor, lots of disk space (we anticipate that the raw data from this campaign will total 75 MB) and a knowledgeable operator. The latter is especially important because the Trimble software has lots of processing options, and the manuals are good at explaining "how" but less good at explaining "why."

We found our GPS consultant to be the key to getting over this hurdle. He was able to suggest options that would have taken weeks to ferret out of the manuals.

-- John Porter and Randy Carlson, Virginia Coast LTER

Accessing LTERnet through Sprintnet

LTERnet's connection to the national Sprintnet (formerly Telenet) phone access system has been operational for several months and has been used by LTERnet users from various U.S. locations.

For those LTER members who are away from their home machines or who cannot otherwise get access to LTERnet and the Internet, improved connectivity is now only a local phone call away from hundreds of locations in the U.S. and abroad. This gives you access to the entire national Internet (Telnet, FTP, etc., from LTERnet) as well as the information stored on-line at LTERnet (Personnel Directory, Data Catalog, Data-bits, etc.).

To use LTERnet's SprintNet connection, you need an account on LTERnet. Request your account by sending mail to "helper@LTERnet.Washington.edu".

Before you can connect to LTERnet via SprintNet, you need to know the local telephone number for SprintNet. In deciding what number is best to use, you also need to consider at what baud rate your modem will work. All major metropolitan areas in the US have a local telephone number to receive 300, 1200 and usually 2400 bits per second (bps) connections. Many metropolitan areas also have telephone numbers which are capable of receiving 9600 bps connections.

There are four ways in which one can find out what the local SprintNet number is:

  1. Obtain a copy of the US Sprint publication, "Worldwide Access Directory", and look up the number in your area. (Call US Sprint telemarketing at 800-736-1130, or send email to "helper@LTERnet.washington.edu" and ask that a copy of this brochure be mailed.)
  2. Look at the file "~ftp/pub/sprintnet/numbers" after signing on to LTERnet for a current list of all SprintNet local telephone numbers. The file can also be copied via anonymous ftp from LTERnet; it is in the directory "pub/sprintnet/numbers".
  3. Connect to the local SprintNet number and enter the command "mail", then answer "phones" and "phones" to the "User Name?" and "Password?" prompts. You will then be asked a number of questions which will help determine the best SprintNet number for any desired location.
  4. Call US Sprint telemarketing at 800-736-1130.

Your computer's communications software should be set so that it sends seven bits with even parity and one stop bit. Other settings may work and may even be necessary at certain times, but these settings will work best under most circumstances. If possible (and it almost always is), it is best to have the terminal emulator emulate a DEC VT102 terminal or, failing that, a VT100.

With the local SprintNet number determined, and the computer's communications program running with the proper settings, the local SprintNet number can be called.

If the modem is Hayes compatible, it should accept the command ATDT followed by the SprintNet telephone number and a carriage return to call the local SprintNet number. Some terminal emulation packages, such as Kermit, require that the connect command be entered before sending the ATDT command to the modem to dial the number.

For example, in order to call the Seattle SprintNet number, 623-9951, one would enter the command:

ATDT6239951c where c is a carriage return.

After your modem calls the number and SprintNet answers, SprintNet tries to figure out at what baud rate it should communicate. If your computer is communicating at 1200 bps, send two consecutive carriage returns to inform SprintNet that communication should be established at that rate. If your computer is communicating at either 2400 or 9600 bps, send an "@" followed by a carriage return to inform SprintNet that communication should continue at one of these higher speeds.

SprintNet will ask, in very general terms, what sort of terminal you have: it will display "TERMINAL=". Your response should be "D1" followed by a carriage return for a personal computer or CRT terminal.

After this point you are connected to SprintNet. You will be connected until you explicitly disconnect.

Until a connection to LTERnet is made, every command is prompted with the commercial at sign, "@". Normally, you would give only two commands to SprintNet: the connect command and the command instructing SprintNet to hang up the telephone line.

The command used to connect to LTERnet is the letter "C" (upper or lower case), followed by a space, LTERnet's SprintNet address (206455) and a carriage return. Specifically, assuming "^" stands for a space and "c" stands for a carriage return, the following command would connect to LTERnet: C^206455c

In response, LTERnet should display its login prompt on the screen. At this point, you would simply enter your login name and password and proceed with your work.

After you have completed work on LTERnet and logged out, you will be back at the SprintNet prompt, "@". Now disconnect from SprintNet by entering the "HANGUP" command followed by a carriage return. SprintNet will then promptly hang up the telephone.
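Putting the steps together, a complete session looks roughly like this (assuming the Seattle number and a 2400 bps Hayes-compatible modem; prompts and your responses are shown together, with "c" again standing for a carriage return):

ATDT6239951c       dial the local SprintNet number
@c                 tell SprintNet to communicate at 2400 or 9600 bps
TERMINAL=D1c       answer the terminal-type question with D1
@C 206455c         at the SprintNet prompt, connect to LTERnet
                   ... log in, work, and log out of LTERnet ...
@HANGUPc           back at the SprintNet prompt, hang up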

For further information contact Daniel Pommert at the LTER Network Office, 206-543-1135, dPommert@LTERnet.Washington.edu (Internet), or "dPommert@LTERnet" (Bitnet).

Meetings and Reports

1992 LTER Data Managers' Workshop Scheduled --

The LTER Data Managers' Workshop is scheduled for August 8-9 at the East-West Center in Honolulu, Hawaii. Rooms have been reserved for August 7-9, and stays can optionally be extended to include the Ecological Society of America meeting that follows.

Resource Management Meeting Held --

A workshop on "Improving Natural Resource Management Through Monitoring" was held March 10-11 at Oregon State University. Items of special interest to data managers were a talk, "Data, data everywhere, but not a byte to read: Managing monitoring information," presented by Susan Stafford, and talks on QA/QC and statistical approaches to monitoring.

Electronic Data Report Released --

A report "The Management of Electronically Collected Data Within the Long-Term Ecological Research (LTER) Program" was published by the LTER Network Office. Rick Ingersoll and Scott Chapal did an amazing job tabulating and summarizing virtually every aspect of how electronic data (mostly from meterological stations) were collected. Tables present information on variables measured, loggers used, probes and peripherals, problems encountered, processing hardware and software, storage formats and quality assurance/quality control procedures.

Recommendations included:

  1. That the evaluation of electronic data collection within LTER be considered an ongoing and regular process,
  2. That the report be distributed to every data manager with instructions to make its contents known to personnel involved in electronic data collection,
  3. That the "Standardized Meteorological Measurements for LTER" document be updated and modified,
  4. That the new "Standardized Meteorological Measurements" document be made available in ASCII format and online, and
  5. That LTER become active in providing feedback to the manufacturers of electronic data collection hardware and software.

Workshop Report Published

The long-awaited report from the 1990 workshop on data management at biological field stations and marine laboratories has been published.

Contact John Gorentz (JGORENTZ@LTERNET.WASHINGTON.EDU) for details.

From the Sites

BNZ -- Everyone's attention has been focused on writing the LTER renewal proposal. With that over, Phyllis Adams and I are working on moving the whole climate database from the PC to the Sun. The resulting SQL database will be part of the LTER distributed climate database. I am also continuing work on the site catalog. These efforts are punctuated by system administration duties: we now have our erasable optical jukebox on line (more about that later), and Release 6 of ARC/INFO showed up, so now we can switch our window manager on the Suns to OpenWindows since all of our software will now run under X Windows.

-- Mark Klingen, Bonanza Creek LTER

CWT -- Since December of 1991 we have been using the PathFinder Professional GPS units at the Coweeta Hydrologic Laboratory. Our primary concern was whether the forest canopy would interfere with the signal. It soon became apparent that the terrain was our major concern. Nevertheless, by using a 25-foot telescoping range pole and finding the optimal time window for the satellite constellation (usually 2-3 hours per day), we were able to collect good data.

The GPS units are currently being used to locate the coordinates of the corners of the vegetation plots and the centers of the gap plots for our gradient study. They are also being used to establish ground coordinate points to help us in the rectification of aerial photos. In addition, they will be used to "pin-point" climate stations and weirs.

-- Gildo Calabria, Coweeta LTER

HBR -- The Hubbard Brook Sample Archive system is now in operation. The archive system provides a central permanent storage place for samples, and allows users to access information about samples and also to subsample. At the present time, approximately 15,000 samples have been barcoded, stored in the Sample Archive building and entered into the system's database. Types of samples include precipitation and stream samples, various soil and vegetation samples, tree 'cookies', and tree cores.
 
The samples are stored in the Sample Archive Building located at Hubbard Brook. The building consists of a large, unheated room in which all samples except water samples are kept, and a smaller room, heated to just above freezing in winter, in which water samples are stored. Each sample is labeled with a barcode which uniquely identifies it and provides a link to a database record. Subsampling is limited to 10% of a sample, and is done in the archive building, unless permission is granted to remove the sample.
 
Information about samples is stored on a laptop computer and actually consists of two dBase databases. One database stores information for each individual sample, including collection title, collector, date of collection, location, type, and short description. The other database includes information on each 'collection' of samples, including preservation and handling of samples, analyses that have been done on the samples, and general information about the collection (e.g., publications). A dBase program links the two databases and allows the user to access all information about a sample by entering the barcode. Present efforts include obtaining copies of the data associated with sample collections and storing them in the archive building so they are available to users.
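In relational terms the arrangement is simply two tables joined on a collection identifier; the sketch below restates it in C (field names and values are invented for illustration, not taken from the actual dBase files):

#include <stdio.h>
#include <string.h>

/* Two record types mirroring the two databases, joined on collection_id. */
struct sample     { char barcode[16]; char collection_id[12]; char type[24]; };
struct collection { char collection_id[12]; char handling[40]; };

int main(void)
{
    struct sample samples[] = {
        { "000123", "STREAM-W1", "stream water" } };
    struct collection collections[] = {
        { "STREAM-W1", "stored just above freezing" } };
    const char *barcode = "000123";
    size_t i, j;

    /* Entering a barcode retrieves the sample record and, through the
       collection_id, the handling information for its whole collection. */
    for (i = 0; i < sizeof samples / sizeof samples[0]; i++)
        if (strcmp(samples[i].barcode, barcode) == 0)
            for (j = 0; j < sizeof collections / sizeof collections[0]; j++)
                if (strcmp(collections[j].collection_id,
                           samples[i].collection_id) == 0)
                    printf("%s: %s sample, %s\n", barcode,
                           samples[i].type, collections[j].handling);
    return 0;
}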

-- Cindy Veen, Hubbard Brook LTER

NTL -- We have been developing quality control screening for our climate data. Our site has maintained its own weather station at the local airport since 1988. The data are recorded by a Campbell CR-10 data logger. We also operate a more limited raft station on one of our study lakes for an evaporation study.

We plan to implement three levels of screening: (1) immediately after download from the data logger to detect instrument malfunctions, (2) prior to the production of summary tables and graphs to remove known problems, (3) after review of the tables and graphs to correct any further problems. We have also defined a set of data flags which alert the user to any uncertainty in values.
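As an illustration of what the first screening level might look like (a sketch only; our real programs, variables, ranges and flag codes differ), a range check run immediately after download can flag suspect values:

#include <stdio.h>

/* Minimal level-1 screen: read "hour temperature wind" records from the
   data logger download on stdin and flag values outside plausible ranges.
   The column layout, the ranges and the 'R' flag are illustrative only. */
int main(void)
{
    int hour;
    double temp_c, wind_ms;

    while (scanf("%d %lf %lf", &hour, &temp_c, &wind_ms) == 3) {
        char tflag = (temp_c < -45.0 || temp_c > 45.0) ? 'R' : ' ';
        char wflag = (wind_ms < 0.0 || wind_ms > 60.0) ? 'R' : ' ';
        printf("%02d %7.2f%c %6.2f%c\n", hour, temp_c, tflag, wind_ms, wflag);
    }
    return 0;
}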

We plan to design an INGRES database for the climate data this spring. Currently, we have all our water chemistry and fish data in INGRES databases. We have learned a lot about SQL and have been generally pleased with our experience with INGRES so far.

-- Barbara Benson, North Temperate Lakes LTER

PAL

  1. The Palmer Antarctic LTER (PAL) highlights:
     OCT 91: First field season began
     NOV 91: First annual time-series large-scale cruise
     DEC 91: Weekly small-scale sampling begun
     JAN 92: First automatic weather station installed
  2. The first field season is in progress (it began 11 October 91 and ends in March 92) and is going well, with sampling including hydrography, optics, nutrients, productivity, krill, Antarctic silverfish, skua and penguin field work. There were 10-12 LTER people on site except during the November cruise, when the team increased by 6. The ship cruise was held in November as planned, although large amounts of ice impacted the large-scale grid sampling. The routine zodiac sampling on the fine-scale grid began in December when ice cleared out of the harbor and continues on a weekly or twice-weekly schedule.
  3. Two GPS units (one LTER; one Dept. of Geography, UCSB) were used during the first field season to establish exact locations of the large- and fine-scale sampling grid sites.
  4. A Polar Automatic Weather Station was installed 5 Jan 92 near Bonaparte Point, approximately 1/4 mile from Palmer Station. This is the first of four planned weather stations in the vicinity. Data are collected by satellite and downloaded to the Palmer site satellite receiving system and later to the weather network at the University of Wisconsin.
  5. Palmer station weather watch data have been put into an accessible on-line form. Ice observations have been added to the weather record. Discussions continue with respect to upgrades to the Palmer station weather record, including the possibility of automatic recording of fine-scale wind and temperature.

-- Karen Baker, Palmer LTER

Name Searches of the LTER Personnel Directory by Electronic Mail

A new LTERnet function is now available enabling users to search the LTER Personnel Directory for names via electronic mail. I hope this will prove a convenient way to look up people's phone and FAX numbers, e-mail addresses, site, etc. Based on your input, I am planning to implement more extensive search capabilities in the coming months, including more key words for Personnel Directory searches and search addresses for other information available on-line at LTERnet (e.g. the LTER Catalog of Core Data Sets and the Satellite Image Catalog). Please send me your comments, suggestions and problem reports.

To retrieve from the Personnel Directory all entries whose first, middle or last name contains a certain character string, follow these steps:

1) Send your message to Address@LTERnet...

------------------------------------------

Depending on your network, send the message to

Internet: Address@LTERnet.Washington.edu
Bitnet: Address@LTERnet
Forest Service: LTERNET:X400 (FORWARD: Address)
Omnet: (site:internet, id:<Address(a)LTERnet.washington.edu>)
Usenet: uw-beaver!LTERnet!Address
NASAmail: (site:internet, id:<Address(a)LTERnet.washington.edu>)
Ignore the "Subject:" line or any other message header lines</li>your mail system may ask you for.

2) Prepare the message

---------------------------------------------------

That message needs to contain only one line, the first line; all other lines are ignored.

The FIRST LINE OF THE MESSAGE MUST START WITH THE KEY WORD "Name: " (the word Name followed by a colon (:) and a space ( )), followed by a search string.

The search will retrieve all Personnel Directory entries whose first, middle or last name contains your search string.

The search is not case sensitive (there is no difference between upper case and lower case letters). The character string may contain the WILDCARD CHARACTER asterisk (*), which matches ANY string. A wildcard (*) is always appended to your search string, so a trailing asterisk in your search string is redundant. Because of this, however, if you provide NO STRING, ALL ENTRIES will be returned to you. The result of the search will be returned to you by electronic mail (automatic mail reply).

Examples:

A) name: go

will return entries for all names that start with "go" or "Go", such as Gordon Grant and James Gosz.

B) name: *ha*

will return entries for all names that contain "ha" (or "Ha", "hA", "HA"), such as James Callahan or Hank Shugart

C) name: *f*in*

will return all names that contain "f" followed somewhere by "in", such as Jerry Franklin or Debra Coffin
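For readers curious about how the matching behaves, the following sketch (an illustration only, not the program running on LTERnet) implements the same rule: case-insensitive matching against the start of each name, '*' matching any string, and an implicit trailing '*':

#include <stdio.h>
#include <ctype.h>
#include <string.h>

/* Does pattern p (which may contain '*') match the start of name s,
   ignoring case?  An exhausted pattern acts as a trailing '*'. */
static int match(const char *p, const char *s)
{
    if (*p == '\0') return 1;                 /* implicit trailing '*' */
    if (*p == '*') {
        for (;; s++) {                        /* let '*' absorb 0..n chars */
            if (match(p + 1, s)) return 1;
            if (*s == '\0') return 0;
        }
    }
    if (*s != '\0' &&
        tolower((unsigned char)*p) == tolower((unsigned char)*s))
        return match(p + 1, s + 1);
    return 0;
}

/* Try the pattern against the start of each name in a full name. */
static int entry_matches(const char *pattern, const char *fullname)
{
    char buf[128], *word;
    strncpy(buf, fullname, sizeof buf - 1);
    buf[sizeof buf - 1] = '\0';
    for (word = strtok(buf, " "); word != NULL; word = strtok(NULL, " "))
        if (match(pattern, word)) return 1;
    return 0;
}

int main(void)
{
    printf("%d\n", entry_matches("go", "Gordon Grant"));      /* 1 */
    printf("%d\n", entry_matches("*ha*", "Hank Shugart"));    /* 1 */
    printf("%d\n", entry_matches("*f*in*", "Debra Coffin"));  /* 1 */
    printf("%d\n", entry_matches("go", "Debra Coffin"));      /* 0 */
    return 0;
}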

-- Rudolf Nottrott, LTER Network Office

Commentary


Professor Gordon's Rule of Evolving Bryographic Systems

While bryographic plants are typically encountered in substrata of earthy or mineral matter in a concreted state, discrete substrata elements occasionally display a roughly spherical configuration which, in the presence of suitable gravitational and other effects, lends itself to combined translatory and rotational motion. One notices in such cases an absence of the otherwise typical accretion of bryophyta. We therefore conclude that a rolling stone gathers no moss.

Theory Of International Society Of Philosophic Engineering --

In any calculation, any error which can creep in will.