Water is the natural capital of a growing world population. Services built on our natural capital are the currency of the 21st century. The timing and spatial distribution of surface water quantity—and the variability in quality of that water—define how we design and build the infrastructure necessary for our energy, agriculture, mining, transportation, and industrial sectors.
Meeting these needs depends on five best practices for hydrometric monitoring:
- Quality management system
- Network design
- Technology selection
- Training
- Data management
1. Quality Management System
A quality management system (QMS) is a set of standard operating procedures that govern the data production process, ensuring that the data are of consistent, known quality. Every monitoring program requires clear objectives for (1) data quality, (2) service, and (3) security that are closely linked with the needs of the end users. The QMS provides the rules to direct and control an organization toward meeting these quality management objectives. Standard operating procedures need not be written from scratch; authoritative guidance includes:
- US Geological Survey (USGS) Techniques and Methods
- USGS Techniques of Water-Resources Investigations
- ISO Technical Committee 113
- World Meteorological Organization (WMO) operational hydrology reports
Achieving the desired service objectives is primarily a function of the balance among the following factors (a rough availability calculation follows the list):
- Staffing (e.g., response time for instrument failure);
- Equipment specifications (i.e., instrument reliability);
- Life-cycle management of equipment (i.e., calibration and control procedures);
- Efficiencies in data production (e.g., automated notifications, auto-corrections, and auto-publication); and
- Feedback from the data production process (e.g., sufficient metadata to support a continuous improvement process).
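The interaction between staffing and equipment reliability can be made concrete with a standard availability calculation. The following is a minimal sketch; the MTBF (mean time between failures) and response-time values are illustrative assumptions, not vendor or agency figures.

```python
# Sketch: how staffing response time and instrument reliability interact
# to determine data availability. All numbers are illustrative assumptions.

def availability(mtbf_days: float, mttr_days: float) -> float:
    """Steady-state availability = MTBF / (MTBF + MTTR)."""
    return mtbf_days / (mtbf_days + mttr_days)

# MTTR here bundles travel/response time (a staffing decision) with
# on-site repair time (an equipment and training decision).
for mtbf in (90.0, 365.0):          # instrument reliability options
    for response in (2.0, 14.0):    # staffing response-time options
        repair = 0.5                # assumed on-site repair time (days)
        a = availability(mtbf, response + repair)
        print(f"MTBF {mtbf:5.0f} d, response {response:4.1f} d -> "
              f"expected record completeness {a:.1%}")
```

Even a highly reliable instrument yields a poor record if failures go unaddressed for weeks, which is why service objectives must be set jointly for equipment and staffing.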
2. Network Design
Network design is an ongoing process: new stations are established and existing stations are discontinued as program priorities and funding evolve. This process must be managed with selective thinning and pruning, while nurturing new growth to fill data voids. Updating the design of a network is fundamentally a sampling problem. The challenge is to find the right balance between hydrometric monitoring objectives and site desirability, which encompasses factors such as the following (a simple site-scoring sketch follows the list):
- Data persistence (i.e., a well-selected location should produce data for generations to come)
- Data quality (e.g., conformance with underlying assumptions)
- Data representativeness (i.e., relevance to ungauged locations)
- Operational costs (e.g., site access)
- Liability risks (i.e., occupational and/or public safety)
- Selection of methods (e.g., use of rating curve vs. index velocity method)
- Reliability risks (e.g., exposure to vandalism)
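One common way to operationalize this balance is a weighted scoring of candidate sites. The sketch below is illustrative only; the criteria, weights, and candidate scores are hypothetical placeholders that a real program would derive from its own objectives and site surveys.

```python
# A minimal sketch of ranking candidate gauging sites with a weighted
# score. Criteria, weights, and scores are hypothetical placeholders.

CRITERIA_WEIGHTS = {          # higher weight = more important to the program
    "persistence": 0.25,      # likelihood the site produces data long-term
    "quality": 0.25,          # conformance with method assumptions
    "representativeness": 0.20,
    "access_cost": 0.15,      # inverted: cheaper access scores higher
    "risk": 0.15,             # inverted: lower liability/vandalism scores higher
}

def site_score(scores: dict[str, float]) -> float:
    """Weighted sum of 0-1 criterion scores."""
    return sum(CRITERIA_WEIGHTS[k] * scores[k] for k in CRITERIA_WEIGHTS)

candidates = {
    "bridge_reach": {"persistence": 0.9, "quality": 0.6,
                     "representativeness": 0.7, "access_cost": 0.9, "risk": 0.5},
    "canyon_reach": {"persistence": 0.8, "quality": 0.9,
                     "representativeness": 0.8, "access_cost": 0.3, "risk": 0.8},
}

for name, scores in sorted(candidates.items(),
                           key=lambda kv: site_score(kv[1]), reverse=True):
    print(f"{name}: {site_score(scores):.2f}")
```

Scoring makes trade-offs explicit: a hydraulically excellent but remote canyon site can be compared directly against an accessible but less representative bridge site.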
3. Technology Selection
Selecting the best technology for a given location is more complex than ever before. Even when choosing a simple pressure transducer, a hydrologist must consider the sensor type (e.g., piezoelectric, capacitive, inductive, potentiometric, vibrating wire, vibrating cylinder, or strain gauge) and the method of deployment (e.g., bubbler, vented, or compensated). For each combination of these technologies there are numerous vendors and products available, and each product has a performance specification that can be characterized by an error band, hysteresis, resolution, sensitivity, and time constant.
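Those specification terms can be rolled up into a rough error budget. Below is a minimal sketch assuming independent error sources combined in quadrature (root-sum-of-squares); all numbers are placeholders rather than any vendor's specification.

```python
import math

# Sketch: combining independent sensor specification terms into a single
# error budget by root-sum-of-squares. All numbers are placeholders.

spec = {
    "error_band_m": 0.003,   # combined non-linearity/repeatability over range
    "hysteresis_m": 0.002,   # worst-case up/down reading difference
    "resolution_m": 0.001,   # smallest reportable change
}

rss = math.sqrt(sum(v ** 2 for v in spec.values()))
print(f"Approximate stage uncertainty: +/-{rss * 1000:.1f} mm")

# A first-order sensor with time constant tau reaches ~95% of a step
# change after 3*tau, which bounds how fast a stage rise can be tracked.
tau_s = 5.0
print(f"Settling time to ~95%: {3 * tau_s:.0f} s")
```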
Beyond the performance specification, selection depends on several practical criteria:
- Reliability requirements: An acceptable mean time between failures.
- Accuracy in the deployed setting: The blanking distance of some acoustic Doppler current profilers, for example, may be too great to measure discharge correctly for some stream geometries.
- Cost of site access: For remote sites, the incremental cost of an acoustic Doppler velocity meter used with an index-velocity model (see the sketch after this list) may easily be recouped through reduced site visits.
- Local site factors: High sediment transport, algal blooms, and river ice all warn against deploying expensive submersible technology.
- Instrument sensitivity and precision: These determine the time and effort spent on post-processing the data.
- Training and familiarity: Limiting the variety of products deployed in a region can greatly reduce both the training burden and the likelihood of blunders caused by a lack of familiarity with a specific device.
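For readers unfamiliar with the index-velocity method mentioned above, the sketch below shows its basic arithmetic: discharge is the product of a stage-derived cross-sectional area and a mean velocity predicted from the instrument's sampled index velocity. The channel geometry and rating coefficients here are hypothetical; in practice both relations are calibrated from repeated field measurements.

```python
# Sketch of the index-velocity method: discharge Q = A(stage) * V_mean,
# where V_mean is predicted from the instrument's sampled index velocity.
# Geometry and coefficients below are hypothetical calibration results.

def cross_section_area(stage_m: float) -> float:
    """Stage-to-area relation for an assumed trapezoidal channel."""
    bottom_width_m, side_slope = 10.0, 2.0   # hypothetical geometry
    return stage_m * (bottom_width_m + side_slope * stage_m)

def mean_velocity(v_index_ms: float) -> float:
    """Linear index rating: V_mean = a + b * V_index (calibrated)."""
    a, b = 0.05, 0.92                        # hypothetical coefficients
    return a + b * v_index_ms

stage, v_index = 1.8, 0.65                   # example sensor readings
q = cross_section_area(stage) * mean_velocity(v_index)
print(f"Estimated discharge: {q:.1f} m^3/s")
```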
4. Training
No investment in technology can compensate for poor choices in data collection and data handling. Errors caused by procedural blunders are the most difficult to detect and correct in data post-processing. Training accelerates the rate at which competencies are gained while simultaneously reducing the frequency of blunders. Training is, arguably, more important than ever: the demographics of many monitoring agencies today show a double hump of new recruits and pre-retirees, creating an urgent need to compensate for the loss of experience with improvements in knowledge. Useful training resources include:
- USGS Surface Water Training
- World Hydrological Cycle Observing System (WHYCOS)
- University of Idaho
- Humboldt College
- COMET training
5. Data Management
Improvements to hydrological monitoring programs often focus on field-based technologies; how the data are managed after acquisition is frequently overlooked. Hydrologic data are complex, and stream hydrographers are responsible for storing, validating, analyzing, and reporting on vast amounts of water data. Automating routine screening is one way to keep that workload manageable (a minimal sketch follows).
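Below is a minimal sketch of automated screening for a stage time series, assuming simple range, spike, and flatline checks. The thresholds are hypothetical and site-specific in practice, and production systems flag suspect values for human review rather than deleting them.

```python
# Sketch: automated screening of a stage time series. Thresholds are
# hypothetical; production systems flag suspect values for human review.

RANGE_M = (0.0, 10.0)     # plausible stage bounds for the site
MAX_STEP_M = 0.5          # largest believable change between readings
FLATLINE_N = 6            # identical consecutive readings suggesting a fault

def screen(series: list[float]) -> list[str]:
    flags = ["ok"] * len(series)
    for i, v in enumerate(series):
        if not RANGE_M[0] <= v <= RANGE_M[1]:
            flags[i] = "out_of_range"
        elif i > 0 and abs(v - series[i - 1]) > MAX_STEP_M:
            flags[i] = "spike"
    for i in range(FLATLINE_N - 1, len(series)):
        window = series[i - FLATLINE_N + 1 : i + 1]
        if len(set(window)) == 1:
            flags[i] = "flatline"
    return flags

readings = [1.20, 1.21, 1.22, 2.10, 1.23, 1.23, 1.23, 1.23, 1.23, 1.23]
for t, (v, f) in enumerate(zip(readings, screen(readings))):
    print(t, v, f)
```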
Good data management also pays off in rating-curve development. When rating curves are grounded in known channel geometry and hydraulic principles, the benefits include (a fitting sketch follows the list):
- Improved confidence in extrapolation (within the range of known channel geometry);
- Improved agreement on a solution (i.e., different hydrographers will independently produce similar results); and
- Improved defensibility of results (i.e., rating curve parameters help to constrain the solution).
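As a concrete illustration, here is a minimal sketch of fitting the standard power-law rating form, Q = C(h - h0)^b, by least squares in log space. The gaugings and the assumed cease-to-flow offset h0 are hypothetical.

```python
import math

# Sketch: fitting the standard power-law rating Q = C * (h - h0)^b by
# least squares in log space. Gaugings and the offset h0 are hypothetical.

h0 = 0.20                                  # assumed cease-to-flow stage (m)
gaugings = [(0.50, 0.8), (0.90, 3.1),      # (stage m, discharge m^3/s)
            (1.40, 8.9), (2.10, 21.5)]

x = [math.log(h - h0) for h, _ in gaugings]
y = [math.log(q) for _, q in gaugings]
n = len(x)
b = (n * sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y)) / \
    (n * sum(xi ** 2 for xi in x) - sum(x) ** 2)
c = math.exp((sum(y) - b * sum(x)) / n)
print(f"Q = {c:.2f} * (h - {h0}) ^ {b:.2f}")

# Fitted parameters have hydraulic meaning: b near 5/3 is consistent with
# Manning's equation in a wide channel, which constrains extrapolation.
print(f"Extrapolated Q at h = 3.0 m: {c * (3.0 - h0) ** b:.1f} m^3/s")
```

Because the fitted exponent has a hydraulic interpretation, an implausible parameter value is an immediate signal that the curve, not the data, needs attention.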