An Analysis of the Potential Impacts on Groundwater Quality
of On-Site Wastewater Management Using
Alternative Management Practices

David Venhuizen, P.E., waterguy@venhuizen-ww.com
5803 Gateshead Drive.
Austin, Texas 78745 USA
tel. 512-442-4047
fax 512-442-4057

Copyright 1995 David Venhuizen


1. Introduction  
2. Overview of Soil Disposal Practices  
3. NR 140 Requirements  
4. Improving upon the Conventional System  
5. Pollutant Attenuation Mechanisms in the Soil  
  5.1 Organics and Solids  
  5.2 Nitrogen  
  5.3 Phosphorus  
  5.4 Bacterial Pathogens  
  5.5 Viral Pathogens  
  5.6 Trace Organics  
  5.7 Trace Inorganics  
  5.8 Summary  
6. Town of Washington Project Disposal Field Performance  
7. Modified At-Grade LPD Disposal Field Design  
8. Drip Irrigation — The Ultimate Disposal Field  
9. Summary and Conclusions  
Citations  
Images  

 

1. INTRODUCTION

The Wisconsin Department of Industry, Labor and Human Relations (DILHR) is in the process of revising the codes which regulate on-site/small-scale wastewater management in the State of Wisconsin. In formulating the revisions, DILHR's general strategy is to implement a code which prescribes a level of performance in regard to the quality of percolate delivered to groundwater. It is therefore important to examine the factors which dictate the quality of that percolate. This report is a review of those factors.

The original version of this document was produced in support of the Town of Washington's effort to gain approval for the use of "alternative" on-site wastewater systems with disposal fields installed where there was less than the code-specified soil depth to a limiting condition. Therefore, some of the discussion and examples use as a context the conditions on Washington Island. However, the underlying principles apply to any area of Wisconsin where limited soil resources pose a problem for on-site/small-scale wastewater management.

 

2. OVERVIEW OF SOIL DISPOSAL PRACTICES

The focus of attention when discussing on-site wastewater management is often making the water go "away". But it is obvious that there really is no "away." Wherever it goes, the water remains in some part of the hydrologic cycle. What is really desired is to prevent the effluent -- or more correctly, the pollutants it contains -- from causing water quality and public health problems when it gets to wherever "away" is. The point is to recycle water and nutrients back into nature in a manner which does not upset the local ecology.

In a conventional on-site system, schematically illustrated in Figure 1, there are only two routes by which the water can get "away." One route is percolation down through soil under the trench. Hopefully, the effluent will receive adequate treatment as it filters through the soil. Percolation of septic tank effluent through some minimum depth of unsaturated soil is the foundation of all regulations governing conventional on-site systems. The "magic number" for required soil depth varies among the states. Wisconsin's code sets it at three feet. [1]

Figure 1 Fate of Water Discharged to an On-Site Wastewater System

The other route by which water can get "away" is less direct. Water can be "wicked" out of the trench and held in the soil by matric potential, the suction force exerted by air-filled voids in the soil, much as capillary action draws water up a tube. As the soil becomes wetter -- that is, as saturation increases -- matric potential decreases. At the degree of saturation called "field capacity," matric potential can no longer overcome the force of gravity, and water starts percolating downward. But if the soil were dry enough, some water could be wicked to the surface and lost to the atmosphere by evaporation. If the trench is shallow enough, or if plant roots run deep enough, water held in the soil by matric potential can be taken up by the roots, to be lost to the atmosphere by transpiration from the leaves.

The combined action of surface evaporation and plant transpiration is called evapotranspiration (ET). Other than nutrient uptake by plants, ET does not directly eliminate pollutants. If effluent water is retained in the soil for some time by matric potential rather than percolating on through in short order, a number of biological and chemical mechanisms are given a better chance to remove pollutants from the water. This matric potential is maintained by the soil moisture deficit created by ET losses. The result is a lower mass loading of pollutants percolating down into the groundwater over the annual cycle when ET is maximized.

With this as background, it is easy to see how conventional trenches fail. One mode of failure is effluent appearing on the surface in or near the disposal field area. This occurs when water cannot percolate very fast due to "tight" soils or a clogged infiltrative surface. When water is loaded at rates higher than can be lost by ET from the bed area -- typically the case under system sizing allowed by the Wisconsin code -- the trench will fill up and seepage to the surface will occur. This is the most recognized mode of system failure. Problems caused by such "hydraulic" failures include potential for spread of disease and a general nuisance, as well as water pollution when rainfall runoff washes surfacing effluent into streams, lakes and rivers.

The other, less recognized failure mode is percolation of effluent to a limiting condition (bedrock, groundwater, or impermeable barrier) without its having received adequate treatment. This is the mode of failure with which this report is mainly concerned. As shown in Figure 1, this can result in pollution of groundwater or surface waters.

As a general principle, the fact that conventional on-site system trenches may fail to function adequately in some soil and site conditions does not mean that a strategy of decentralized soil disposal systems must be abandoned. Examination of Figure 1 reveals that any soil disposal system consists first of pretreatment -- which, in all variants of on-site systems recognized in the current code, is only the septic tank -- followed by further treatment in the soil. This suggests two approaches to providing more environmentally sound management when dealing with limited soil resources:

(1) Provide better pretreatment than is provided by the septic tank before the effluent is routed to the soil disposal system.

(2) Use disposal methods which maximize the treatment capabilities of whatever soil resources are available.

This report concentrates on the latter proposition. As the discussions will make clear, however, enhanced pretreatment prior to disposal can not only reduce the "load" on the soil system, but can also help to maximize the pollutant assimilation/removal capabilities of whatever depth of soil is available. An example of a system providing enhanced pretreatment is the denitrifying sand filter which was the focus of the Town of Washington's demonstration project (and is also being studied in a DILHR-funded project at Black River Falls). This system employs a modified recirculating sand filter concept and produces an effluent with BOD5 and TSS concentrations typically below 10 mg/L. The concept is also capable of significantly reducing total nitrogen concentration in the effluent, typically to less than 20 mg/L, with how much less depending largely on the influent nitrogen concentration; the reduction observed was generally in the range of 60% to 90%. Effluent fecal coliform levels were mostly in the range of 10^3-10^4 CFU/100 mL, a 2-5 log reduction (99% to 99.999%) from levels typically observed in septic tank effluent. Based upon the results of the demonstration project, DILHR and the Town concluded that pretreatment before disposal is feasible.
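
To make the log-reduction arithmetic above explicit, the following minimal Python sketch computes the reduction between septic tank effluent indicator counts (typically 10^6-10^8 fecal coliforms per 100 mL, as reported later in this report) and the 10^3-10^4 CFU/100 mL range observed in the demonstration project. The pairings are illustrative only, not project data.

    import math

    def log_reduction(influent_per_100ml: float, effluent_per_100ml: float) -> float:
        """Log10 reduction between influent and effluent indicator organism counts."""
        return math.log10(influent_per_100ml / effluent_per_100ml)

    # Septic tank effluent fecal coliforms typically run 10^6-10^8 per 100 mL;
    # the demonstration filters produced roughly 10^3-10^4 CFU/100 mL.
    for influent in (1e6, 1e8):
        for effluent in (1e3, 1e4):
            logs = log_reduction(influent, effluent)
            percent = 100.0 * (1.0 - 10.0 ** (-logs))
            print(f"{influent:.0e} -> {effluent:.0e}: {logs:.0f} log ({percent:.5g}% removal)")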

The basic ideas of enhanced pretreatment before disposal and of improved disposal methods are already accepted in current Wisconsin codes. The mound system -- which is essentially a buried sand filter installed above the receiving soil -- is an example of the former, and the "at-grade" system is an example of the latter. It remains to explore what further improvements could most effectively enhance pollutant elimination in situations where soil resources are limited even more severely than these methods can accommodate.

 

3. NR 140 REQUIREMENTS

Before proceeding to investigate improved soil disposal systems, a look at the regulatory framework within which their performance would be evaluated is in order. While this report basically examines the potential impacts of the proposed decentralized management methods without regard to the specifics of regulatory constraints, it is recognized that the regulatory mechanism through which approval of these methods may be obtained is compliance with Ch. NR 140, Wis. Adm. Code, commonly referred to as simply "NR 140" and popularly known as the "groundwater law". [2] This section reviews the dictates of NR 140 and discusses how interpretation of this code would impact upon the evaluation of whether alternative management methods can be expected to comply with it.

NR 140 sets limits for many substances which may pollute groundwater. This report reviews expected performance of soil disposal systems for categories of pollutants which cover most of these substances. However, for small-scale wastewater systems treating mainly domestic type wastes, the two pollutants of major concern are nitrates and bacterial indicators.

Before proceeding, it is noted that recent action by the Wisconsin legislature has effectively exempted wastewater systems regulated by DILHR from nitrate standards. However, to assure protection of groundwater quality, there are some areas of the state -- e.g., Door County, the central sands -- where attention should still be paid to nitrogen loading. Therefore, this report gives due consideration to minimizing percolation of nitrates into groundwater.

NR 140 defines a "preventative action limit" and an "enforcement standard" for each pollutant. Table 1 in section NR 140.10 lists these standards for pollutants which may impact public health. For bacterial indicators, both the enforcement standard and the preventative action limit are, in essence, no detectable coliform bacteria. For nitrate, the enforcement standard is 10 mg/L and the preventative action limit is 2 mg/L. These two limits relate to the responses which the Wisconsin Department of Natural Resources (DNR) may take to their violation. Review of NR 140.24, relating to responses when a preventative action limit is exceeded, leads to the presumption that, for nitrates, assuring no violation of the enforcement standard is the "goal" of soil disposal systems.

In any case, NR 140 limits apply only at a "point of standards application", such points being defined in NR 140.22, quoted in part here:

NR 140.22 Point of standards application.

(1) Facilities, practices, or activities regulated by the department shall be designed to minimize the levels of substances in groundwater and to comply with the preventative action limits to the extent technically and economically feasible at the following locations:

(a) Any point of present groundwater use;

(b) Any point beyond the boundary of the property on which the facility, practice or activity is located; and

(c) Any point within the property boundaries beyond the 3-dimensional design management zone if one is established by the department at each facility, practice or activity ....

(2) The point of standards application to determine if a preventative action limit has been attained or exceeded is any point at which groundwater is monitored.

(3) The point of standards application to determine whether an enforcement standard has been attained or exceeded shall be the following locations:

(a) Any point of present groundwater use;

(b) Any point beyond the boundary of the property on which the facility, practice or activity is located;

(c) Any point within the property boundaries beyond the 3-dimensional design management zone if one is established by the department at each facility, practice or activity ....

Figure 2 provides an illustration of these points of standards application. The boundary of the "design management zone" is specified in NR 140.22(5). The horizontal distance from the edge of the "waste boundary" -- the edge of the disposal field in the present case -- to the limit of the design management zone is listed in Table 4 of that section. For the activity in question -- land disposal of wastewater -- the distance is 250 feet, as shown in Figure 2. Applying NR 140 strictly, only points of groundwater use (wells) and groundwater monitoring points beyond the property boundary or over 250 feet from the edge of the disposal field, whichever is nearer, can be used to determine compliance with the enforcement standard.

Figure 2 NR 140 Points of Standards Application

It must also be understood that NR 140 applies only to groundwater quality. The quality of percolating effluent in the vadose zone at any given distance below the point where effluent is injected into the soil will provide an indication of the quality of water which may flow into the groundwater. But this is not groundwater as defined in NR 140.05(9), and violations of the standards in percolate samples are not definitive indicators of failure to meet the dictates of NR 140.

In some cases, the circumstances of groundwater recharge and use are such that simply assuring strict compliance with NR 140 may not completely protect local groundwater quality. An example is the Eastern Dolomite aquifer in Door County. This aquifer is recharged through faults and fissures in the upper parts of these strata and in overlying formations. Effluent percolating down to the bedrock surface may flow through these conduits into the groundwater pool. Little additional pollutant attenuation should be expected once the water has passed from the overlying soil into the bedrock formations. If effluent is not adequately treated in the soil system above the bedrock surface, pollutants will discharge directly into groundwater.

With the restrictions on points of standards application set by NR 140, the only practical indicators of local groundwater quality would probably be wells in the immediate vicinity. Such factors as well depth, groundwater dynamics, and well placement relative to the disposal field (by code, at least 50 feet horizontally away from the edge) would determine the probability that any pollution issuing from on-site disposal fields would be detected in these wells at any given point in time. Therefore, non-compliance with NR 140 -- if it were to occur -- would likely not be detectable until many disposal fields were installed in thin soils over these bedrock formations, and then perhaps not for a considerable length of time.

Such an a posteriori approach to determining compliance is poor public policy, since remediation of groundwater quality problems can be very difficult. It remains, therefore, to offer reasonable assurance that NR 140 limits would not be attained or exceeded in the groundwater pool generally if alternative management practices were implemented, without regard to whether or not this could be unequivocally demonstrated at points of standards application defined by NR 140. This paper examines that proposition.

 

4. IMPROVING UPON THE CONVENTIONAL SYSTEM

The typical trench design as specified in the Wisconsin code is illustrated in Figure 3. [1] Effluent is placed deep in the soil. This depth to the infiltrative surface greatly reduces the potential for water loss through evapotranspiration and nutrient (mainly nitrogen) loss through plant uptake. Specifying deep placement of effluent in the soil reflects a focus on disposal over treatment.

Figure 3 DILHR Standard Conventional Disposal Trench System

It is essential, particularly in coarser soils, that a "clogging mat" forms at the gravel/soil interface of a conventional trench. [3] This mat creates a zone of restricted flow, helping to assure that flow through the soil beyond the clogged zone would be unsaturated. As will be detailed, unsaturated flow is necessary to attain good treatment for pollutants of critical concern, particularly bacterial and viral pathogens. It was noted previously that, under the theory governing conventional disposal field design, as long as this unsaturated flow path is long enough, adequate elimination of pollution is assumed.

Another feature of conventional trenches is that they are "gravity dosed"; that is, effluent flows by gravity through the drainfield pipe as it leaves the septic tank whenever water is run into that tank from the house. Typically this gravity dosing does not achieve uniform distribution of effluent throughout the field, resulting in localized loading rates far higher than the "design" loading rates, a circumstance which has been reported in numerous investigations. [4,5] Particularly in coarse-grained soils, this non-uniform distribution can result in saturated flow through the soil below those areas of the field which do receive effluent. Converse, et al. [6] state that sandy soils absolutely require pressure distribution, especially at system startup, because the high transmission rate of these soils inevitably results in poor distribution, and consequently poor treatment, under gravity dosing. As an example, Ver Hey and Woessner [7] report that, in a conventional drainfield installed in coarse-grained soils, little of the field was receiving effluent, and poor bacterial removal resulted.

As pointed out by Otis, Plews & Patterson [8], the benefits of sidewall absorption are broadly recognized. But little of the trench sidewall is engaged with gravity dosing. Flow peaks are attenuated by house plumbing and the septic tank, so flow surges are not large enough to pond water to any significant depth in the trench. (Indeed, as just noted, they are not large enough even to distribute effluent over the entire trench bottom.) Significant sidewall absorption would occur only if the entire trench bottom were on the verge of hydraulic failure, forcing effluent to pond in the trench all of the time.

These discussions highlight that maintenance of "proper" operating conditions in a conventional, gravity dosed trench is a delicate "balancing act" between a clogging mat sufficient to assure unsaturated flow in the underlying soil and hydraulic failure from too complete a clogging action. Often, due to localized overloading caused by non-uniform distribution, the portion of the trench receiving flow will become clogged, forcing effluent to flow further down the trench. After a time, this portion also becomes clogged, again because of localized overloading, and effluent is forced yet further down the trench. Finally, the entire trench becomes clogged. This progressive clogging of the trench is known as "creeping failure". [4,9]

A progressive range of modifications to the conventional trench can be entertained in an effort to enhance the soil's treatment capabilities. The first is pressure distribution of the effluent. Pumping effluent into the disposal field pipe would typically create a large enough flow surge that effluent would be distributed along its entire length. This would minimize localized overloading and the consequent potential for creeping failure. The benefits of pressure dosing are recognized by the current Wisconsin code. [1]

Inherent in any practical pressure distribution system would be a short-term "dose/rest" loading cycle. The pump would come on and run a dose of effluent to the field. No more effluent would be loaded on the field until another dose builds up in the effluent tank, which, by design, is many hours later. [1] This circumstance limits the amount of effluent loaded on the field at any one time to the dose volume. Intermittent dosing in this manner provides two primary benefits: (1) it minimizes the tendency toward continuous ponding in the trenches and consequent severe clogging, especially in finer soils; and (2) it minimizes the potential for development of saturated flow, especially in coarser soils, with consequent poor treatment of the percolating effluent. [4,10]

Treatment in the soil system benefits from enforcing lower localized loading rates. Canter & Knox [11] and Gerba & Goyal [12] provide indications that the efficiency of both straining/filtration and adsorption processes is decreased by higher infiltration rates. It is therefore beneficial to go beyond simply assuring unsaturated flow. Employing pressure distribution can ensure that a field is loaded more uniformly, which -- assuming the field is appropriately sized -- tends to enforce lower flow rates through all areas of the soil system.

A further benefit of pressure dosing generally is that the dose/rest loading cycle provides the opportunity for the soil interface to aerate between doses. A dose is typically completely diffused into the soil before the next one is applied, allowing air into soil voids. As noted, this "resting" of the infiltrative surface minimizes the potential for severe clogging of the trench.

The next step is to redesign the trench and the distribution system to take maximum advantage of these benefits of pressure dosing. The current Wisconsin code details design specifications for a small diameter pipe pressure distribution system, but it does not recognize the benefits of modifying trench design. The code requires pressure dosed trenches to be constructed just like conventional trenches. [1]

Shown in Figure 4 is the shallow, narrow trench design favored for use in the low pressure dosed (LPD) system. [13] A small diameter lateral pipe with drilled holes -- typically 1/8" to 1/4" in size -- distributes effluent along the trench. Taking system hydraulics and relative trench elevations into account, the number and/or size of holes in each lateral can be varied to provide a roughly equal flow volume into each trench. These lateral pipes are pressurized so that a minimum head -- typically about 2.5 feet -- is obtained at the distal end, assuring fairly uniform flow out all the holes.

Figure 4 Low Pressure Dosing Trench Installation Detail
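
As a rough check on the hydraulics just described, the per-orifice discharge implied by the quoted hole sizes and the roughly 2.5 foot distal head can be estimated with the standard orifice equation, Q = Cd * A * sqrt(2gh). The sketch below assumes a discharge coefficient of about 0.6; it illustrates the calculation only and is not a design value from any particular code or project.

    import math

    def orifice_flow_gpm(hole_diameter_in: float, head_ft: float, cd: float = 0.6) -> float:
        """Estimate flow (gal/min) through one drilled orifice: Q = Cd * A * sqrt(2*g*h)."""
        g = 32.2                                            # ft/s^2
        area_ft2 = math.pi * (hole_diameter_in / 12.0) ** 2 / 4.0
        q_cfs = cd * area_ft2 * math.sqrt(2.0 * g * head_ft)
        return q_cfs * 448.83                               # ft^3/s -> gal/min

    # Per-hole discharge at the ~2.5 ft distal head cited above,
    # for the 1/8 inch and 1/4 inch hole sizes typically drilled in LPD laterals.
    for diameter in (0.125, 0.25):
        print(f"{diameter} in hole at 2.5 ft head: {orifice_flow_gpm(diameter, 2.5):.2f} gal/min")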

The field design generally employs a long total length of these shallow, narrow trenches, which maximizes the ratio of sidewall to bottom area. System design assures that the dose is of sufficient size and the instantaneous flow rate into the trench is sufficiently high that the trench is mostly filled by each dose. This enforces maximum utilization of sidewall absorption. [13,14] Further, using shallow trenches offers the obvious advantage of maximizing the treatment effectiveness of whatever depth of soil above a limiting condition is available.

In addition, shallow placement enhances ET losses when conditions are favorable. The field is typically designed as an array of parallel trenches, as Figure 4 illustrates. When sufficient matric potential exists -- that is, when soil moisture level is sufficiently below field capacity -- water is "wicked" into the intertrench spaces. If the infiltrative surface is closer to the ground surface, the soil around it will become drier more quickly and will tend to stay drier during periods of little precipitation and significant ET potential. This occurs at least throughout a considerable portion of the growing season in the climate of Wisconsin. Effluent water would be held in this near-surface soil horizon by matric potential, allowing pollutant removal and assimilation mechanisms to work as well as possible.

The effluent loadings themselves make a significant contribution to soil moisture levels, thus decreasing matric potential and the consequent "wicking" action. Therefore, as noted previously, it is beneficial to employ light areal loading rates; that is, to design the field using a lower flow per square foot of field area. This not only helps to maximize ET losses, it results in lower nitrogen loading rates so that a greater portion of the applied nitrogen is likely to be removed by plant uptake, leaving less to percolate to groundwater.

Because effluent would disperse laterally, calculation of field loading rate allows credit for the intertrench spaces. So, even though the field loading rate is low relative to the bottom loading rate of a conventional trench, the loading rate on the soil interface of the trench is typically somewhat higher than conventional trench bottom loading rates. [1,13] Even when no treatment beyond the septic tank is provided, it has been found that this higher "face" loading rate rarely causes trench clogging problems. Lack of clogging is credited to the dose/rest loading cycle, which allows the soil interface to aerate between loadings, retarding clogging mat formation. [4,14,15]
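
A minimal sketch of the distinction between the areal (field) loading rate and the trench face loading rate follows. The design flow, trench dimensions, spacing, and ponded depth used here are hypothetical illustrations, not values taken from the Town of Washington design.

    def loading_rates(flow_gpd: float, trench_length_ft: float, trench_width_ft: float,
                      ponded_depth_ft: float, trench_spacing_ft: float) -> tuple:
        """Return (areal field rate, trench face rate) in gal/ft2/day for an LPD field."""
        # The field area credits the intertrench spaces: each trench serves a strip
        # one trench-spacing wide along its full length.
        field_area_ft2 = trench_length_ft * trench_spacing_ft
        # The infiltrative "face" is the trench bottom plus the wetted sidewall on both sides.
        face_area_ft2 = trench_length_ft * (trench_width_ft + 2.0 * ponded_depth_ft)
        return flow_gpd / field_area_ft2, flow_gpd / face_area_ft2

    # Hypothetical example: 300 gal/day over 400 ft of 1 ft wide trenches on 5 ft centers,
    # with effluent ponded about 0.5 ft deep during a dose.
    areal, face = loading_rates(300.0, 400.0, 1.0, 0.5, 5.0)
    print(f"Areal (field) rate: {areal:.2f} gal/ft2/day")   # ~0.15
    print(f"Trench face rate:   {face:.2f} gal/ft2/day")    # ~0.38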

The obvious objection to installing and operating a field in a manner that minimizes clogging mat formation is that conventional system design theory holds that this clogging mat is necessary to assure adequate treatment of effluent in the soil system. While the clogging mat is itself an effective "retainer" of particulate pollution, its primary function is to restrict flow rate out of the trench into coarser-grained soils, ensuring that unsaturated flow would be maintained. In an LPD system, unsaturated flow is effectively maintained with a dose/rest loading cycle and uniform distribution, which prevent high localized loading rates common in gravity trenches. [4,14] Especially in coarse-grained soils, minimizing saturated flow with dose/rest loading rather than by relying on clogging mat formation offers more reliable treatment. Otis, et al. [8] point out the problems of forming and maintaining an adequate, but not overly restrictive, clogging mat in a conventional disposal trench or bed.

If enhanced pretreatment is provided, the higher effluent quality offers further assurance that clogging of the infiltrative surface would not be an operational problem. Research and field experience shows that pretreatment to higher quality results in higher acceptance rates by the soil. There is no general agreement on which causative agents play the most active role in clogging mat formation, but most investigators agree that BOD5, suspended solids and coliforms are primarily responsible. [15,16] As noted previously, a sand filter pretreatment system, for example, would drastically reduce all these constituents below the levels found in septic tank effluent.

This suggests that provision of better pretreatment is itself another logical step toward the development of a more environmentally sound management system for limited soil resources. When adequate pretreatment is provided, the clogging mat's filtration function is no longer important to the overall level of treatment achieved in the soil system. Also, the lower pollutant loadings should be more readily removed from soil water and assimilated by the soil system. As will be detailed, this is particularly so when applied at low areal loading rates uniformly over the field area with a dose/rest loading cycle.

These discussions indicate that modifications to disposal field design can enhance the treatment capability of whatever depth of soil is available. Before proceeding to discuss some possibilities for alternative disposal field designs, specific indications of how alternative concepts can enhance removal of various pollutants are reviewed.

 

5. POLLUTANT ATTENUATION MECHANISMS IN THE SOIL

Attention is now turned to the fate of several categories of pollutants in the soil system. Each is examined in regard to mechanisms by which it can be eliminated from soil water and to characteristics of the soil system that can enhance or retard those mechanisms. Findings will be applied toward determining how system design can promote removal and assimilation.

5.1 Organics and Solids

These pollutants are the primary determinants of the effluent's overall level of "dirtiness". Organics are commonly measured as BOD5 -- the 5-day biochemical oxygen demand -- which indicates the presence of both readily degradable soluble organics and particulate organic matter. The total suspended solids (TSS) test is a measure of the latter plus refractory solids.

It can be inferred from the performance of sand filters (e.g., ref. 17) that filtration through just a couple feet of unsaturated depth in almost any soil would serve to reduce these parameters to fairly low levels. The major importance of organics and solids to soil absorption systems is their impact on clogging, and the attendant possibility of surfacing effluent.

Note that unsaturated flow is required to obtain adequate treatment. Reneau, et al. [4] list conditions contributing to unsaturated flow as uniform effluent distribution, development of a clogging mat (for coarse-grained soils), well-drained soils, and moisture deficits. The latter highlights the benefits of designing the system to maximize ET losses, since this is what creates moisture deficits. As noted previously, a short-term dose/rest loading cycle can also help to maintain unsaturated flow. Especially in coarse-grained soils, control of saturated flow with dose/rest loading rather than by relying on clogging mat formation offers more reliable treatment.

The method of distribution also helps to prevent clogging due to organics and solids loadings. Better acceptance of effluent water by the soil is obtained in an LPD system. Hargett [18] found that, although subjected to "... extremely stressful soil and background moisture conditions, resulting in severe continuous ponding and surcharging ... none of [the systems under investigation] showed indications of prolonged severe clogging of the trench interface."

As noted previously, it is also known that pretreatment results in higher acceptance rates by the soil. Studies conducted in Oregon found that sand filter effluent was accepted at much higher rates than septic tank effluent, a circumstance attributed to reduced clogging. Effluent acceptance rates that ranged from 2.3 to 7.7 gal/ft2/day were observed. [15] Loudon & Birnie [19] studied trenches receiving sand filtered effluent and concluded they were accepting water at a rate eight times that of the design loading rate for septic tank effluent in the soil under observation. Siegrist [16] suggested that soils could be loaded with sand filter effluent at rates 7.5 times those allowed for septic tank effluent. He also reported other experiences where "[i]ntermittent sand filter effluents have been successfully applied to subsoils at rates as high as 20 cm/d [5 gal/ft2/day] without causing hydraulic failures, even in fine-textured soils and saprolites."
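
The bracketed conversion above (20 cm/d, roughly 5 gal/ft2/day) is a straight unit conversion; a minimal sketch:

    GAL_PER_FT3 = 7.4805

    def cm_per_day_to_gal_ft2_day(rate_cm_per_day: float) -> float:
        """Convert a hydraulic loading rate from cm/day to gal/ft2/day."""
        depth_ft_per_day = rate_cm_per_day / 30.48     # cm of water applied -> ft
        return depth_ft_per_day * GAL_PER_FT3          # over 1 ft2, that depth is this many gallons

    print(f"20 cm/d = {cm_per_day_to_gal_ft2_day(20.0):.1f} gal/ft2/day")   # ~4.9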

The aim of most such investigations has been to provide a basis for installing smaller disposal fields when enhanced pretreatment is employed. But, as will be detailed, increasing the areal hydraulic loading rate may compromise removals of other contaminants. Again, it must be borne in mind that the goal is not to simply make sure that the effluent goes into the ground, but to assure that it is treated adequately in the soil system. The higher acceptance rate for pretreated wastewater can be viewed as a "fail-safe" feature, drastically reducing -- if not circumventing completely -- potential for public health problems from surfacing effluent caused by excessive clogging of the leaching interface.

In summary, reducing effluent BOD5 and TSS concentrations in a pretreatment system enhances acceptance by the soil, regardless of the type of disposal field. Improved disposal field concepts would cope with organic and solids loadings better than conventional trenches, regardless of the degree of pretreatment provided. Because of their superior performance, these improved disposal fields could also be safely sited in less pervious soils, especially if the applied effluent was treated to a fairly high quality.

5.2 Nitrogen

Of all pollutants in domestic wastewater, perhaps the most problematic is nitrogen. Typically, the majority of the nitrogen in septic tank effluent is in the ammonium form. In properly functioning disposal systems, since unsaturated (aerobic) conditions should predominate beneath the trench, much of this ammonium will be nitrified (converted to the nitrate form) in the soil surrounding the trench. Unless conditions are favorable for denitrification, nitrate is not readily removed from the percolating water, resulting in the potential for nitrate pollution of receiving waters. The fate of nitrogen in the soil system is illustrated in Figure 5, showing that removal could be through the pathways of adsorption, fixation, volatilization, biological uptake, and denitrification. Several works describe these mechanisms. [3,20,21,22,23,24]

Figure 5 Fate of Nitrogen in the Soil System

Fixation occurs when ammonium ions become trapped in the intermicellar layers of clay minerals, and fixation by organic components of the soil may also occur. [3,20,22] In most soils, the potential for ammonium fixation is limited, and this pathway is not likely to be important in the long-term nitrogen budget of the soil system. [3,23]

Nitrogen loss through non-biological volatilization of ammonium to ammonia gas is even less of a factor. Significant volatilization losses would require considerable air-water contact, and this is not afforded in subsurface disposal systems. [3,20] In any case, the equilibrium between ammonium and ammonia gas is pH dependent. Significant volatilization should only be expected at a fairly high pH, which is usually not maintained in the soil system. [20]

Adsorption can theoretically remove large quantities of ammonium from solution, assuming that the cation exchange capacity of the soil is sufficiently high. [3,20,23] However, ammonium persists long enough for significant adsorption to occur only under anaerobic conditions, since it is readily nitrified under aerobic conditions. [3] Further, adsorbed ammonium generally remains available to nitrifiers, so ammonium immobilized during temporary anaerobic conditions would be readily nitrified if aerobic conditions are re-established. Only ammonium adsorbed in a zone that remains anaerobic would be stable. [3,20,22] Since a zone of unsaturated flow between the trench and a limiting condition would exist in any properly functioning soil disposal system, long-term removal of nitrogen by adsorption will be limited, regardless of the depth to a limiting condition.

When enhanced pretreatment is employed, all these mechanisms of ammonium removal in the soil system may be relatively insignificant to the nitrogen budget. A sand filter pretreatment system, for example, generally produces a highly nitrified effluent. Further, as just noted, any ammonium remaining in the effluent is unlikely to persist after injection into the soil. In soils to which wastewater is regularly applied, a population of nitrifying bacteria builds up very quickly, and nitrification is usually complete after flow through just a few inches of soil, unless temperatures are very low or oxygen diffusion is limited. [20,22,23] Studies indicate that a high degree of nitrification is to be expected, especially in coarse-textured soils. [7,25] Even where conditions immediately below the trench are not conducive to sufficient oxygen diffusion, nitrification occurs in adjacent soils in the direction of effluent flow. [4]

Biological uptake and denitrification are the only significant pathways for effluent nitrogen removal from the soil system. Since the denitrifying sand filter treatment system can produce an effluent with a total nitrogen concentration of less than 20 mg/L, the NR 140 enforcement standard would be met in the percolate if a further 50% reduction were achieved by these mechanisms.

Biological denitrification is the reduction of nitrate to nitrogen gas by bacteria that, in the absence of oxygen, use nitrate as an alternate electron acceptor. Lance [20] notes that three conditions are necessary for denitrification to proceed: (1) oxidation of ammonium to nitrate; (2) passage through an anaerobic zone after nitrification has occurred; and (3) provision of an adequate source of energy for the denitrifying bacteria in the anaerobic zone.

When septic tank effluent is routed to a disposal field, wastewater percolating out of the trench is undergoing purification in regard to organic matter content as well as undergoing nitrification. Therefore, at a point where denitrification reactions could remove significant quantities of nitrogen, little carbon (energy source) remains in the water to feed the denitrifying bacteria. When a pretreatment system is used, this circumstance may be exacerbated. A sand filter system, for example, would deliver an effluent to the field which is very low in organic carbon. Fortunately however, maintaining a high denitrification potential appears to depend more on the organic content of the soil than on organic content in the wastewater. [23,24,26]

In a conventional system, where the infiltrative surface is fairly deep, effluent flows into the subsoil, where there is typically little soil organic matter present. Because of this, near-surface disposal would enhance denitrification losses. The A and B soil horizons generally contain the vast majority of soil organic matter. Where a plant cover is maintained, this organic fraction is continually renewed. Lance [20] and Broadbent & Reisenauer [23] observed that, assuming the presence of sufficient organic matter, nitrification and denitrification can occur in the same soil profile, since pockets of anoxia are present even in soils considered to be well aerated. Denitrification potential is greater in finer-grained soils, since smaller pores would be more prone to anoxia. Though soils in many areas of Wisconsin are generally somewhat coarse-textured overall, these soils are typically finer-grained in the shallower horizons.

Estimates of potential reduction by denitrification of total nitrogen in applied effluent range from 10-15% [20] to 15-25% [24] in conventional land application systems. In an LPD system, effluent is intermittently dosed, creating alternating cycles of higher saturation -- and more pockets of anoxia -- and lower saturation, when aeration would maximize nitrification. This indicates that a higher denitrification rate may be achieved in shallow LPD systems.

Nitrification in a pretreatment process may further optimize denitrification losses. Most of the nitrogen in sand filter effluent is typically in the nitrate form. Introduction of this effluent to the soil in shallow, pressure-dosed trenches should create conditions favorable to denitrification. Broadbent and Reisenauer [23] note that temporary anoxia can be created in the soil by a passing front of saturation. This suggests that using a dose/rest loading cycle in shallow trenches could enhance denitrification. As a dose is absorbed into the soil surrounding the trench, increasing the degree of saturation, the potential for anoxia in the soil pores would increase. Therefore, nitrate in the effluent would be immediately introduced to an environment favorable to denitrification. Hoover, et al. [27] speculated that a high denitrification rate was the reason that little nitrogen was detected downslope of an LPD trench system receiving sand filter effluent.

Denitrification potential may be amplified even more in seasons during which sufficient matric potential exists to preclude significant leaching losses (assuming areal loading rates are low enough). As outlined previously, ammonium immobilized during one dosing event may be nitrified during the inter-dose period, and so would be immediately available to denitrifiers when the next dose again creates temporarily anoxic conditions. In this case, both nitrate and ammonium in the effluent would be subject to elimination by denitrification in the immediate vicinity of the disposal trench.

In a soil with a shallow and seasonally fluctuating water table, anoxia induced by saturation when the water table rises would also enhance denitrification potential. Stewart and Reneau [28] observed that most of the effluent nitrogen disappeared down-gradient of a shallow LPD system, even though nitrate concentrations increased greatly directly beneath the disposal system when the water table rose. This increase was attributed to nitrate accumulation when the water table was low and unsaturated flow was dominant. This nitrate was "flushed out" when the water table rose, but apparently was effectively removed by denitrification -- enhanced and augmented by the anoxia induced by saturation -- within a few meters of the disposal field. Many areas in Wisconsin have soils with a shallow, seasonally fluctuating water table.

Biological uptake from a conventional system is expected to be limited. Tyler, et al. [3] observe: "Extensive root systems are not present in seepage fields and therefore only limited amounts of N would be taken up." In commenting on the general uptake potential of plants, Laak [22] notes: "Wastewater from subsurface seepage field [sic], however, is not as available to plants, although some nitrogen can be removed if roots are deep."

Significant nitrogen uptake can be achieved by near-surface disposal. Using shallow, lightly loaded LPD trenches for disposal would make nitrogen in the effluent readily available to plants. Petrovic [21] reported 55-75% removal by uptake of fertilizer nitrogen in lawns on Long Island. EPA [24] lists annual uptake for Kentucky bluegrass -- a typical disposal field cover crop in the Wisconsin climate -- at 180-240 lb/acre. With an effluent total nitrogen concentration of 20 mg/L, a disposal field of the design proposed by the Town of Washington, for example, would be loaded at a rate of about 400 lb/acre/year. Not all of the nitrogen applied to the soil can be removed by uptake, however. The uptake efficiency is generally expected to be 50% or less, with perennial grasses being among the more efficient plants. [23] This implies a maximum uptake potential of about 200 lb/acre/year, which is within the range quoted for Kentucky bluegrass.
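
The annual nitrogen mass loading quoted above follows directly from the effluent concentration and the hydraulic loading rate. The sketch below reproduces the approximately 400 lb/acre/year figure using 20 mg/L total nitrogen applied at 0.15 gal/ft2/day, the low end of the design loading range proposed later in this report; these inputs simply restate values from the text.

    L_PER_GAL = 3.785
    MG_PER_LB = 453_592
    FT2_PER_ACRE = 43_560

    def annual_n_load_lb_per_acre(total_n_mg_per_L: float, rate_gal_ft2_day: float) -> float:
        """Annual nitrogen mass loading (lb/acre/yr) from concentration and hydraulic loading rate."""
        liters_applied_per_acre_yr = rate_gal_ft2_day * FT2_PER_ACRE * 365.0 * L_PER_GAL
        return total_n_mg_per_L * liters_applied_per_acre_yr / MG_PER_LB

    # 20 mg/L total N applied at 0.15 gal/ft2/day
    print(f"{annual_n_load_lb_per_acre(20.0, 0.15):.0f} lb/acre/yr")   # ~400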

Since maximizing losses through either denitrification or plant uptake requires near-surface disposal of the effluent, relatively high removal rates could be accomplished in shallow LPD disposal fields. With these processes largely confined to the near-surface soil horizons, any further depth of soil should not be expected to have any significant impact on the amount of nitrogen leaching to groundwater. It can be concluded that, given sufficient soil matrix to support these processes, reduction in total nitrogen content of percolating waters by denitrification and plant uptake would be obtained regardless of total depth to a limiting condition.

An indication of the potential nitrogen content of water which percolates out of the root zone -- and thus presumably into the groundwater with little more attenuation of nitrogen content -- can be derived by application of an equation presented by Loehr, et al. [29]. The equation relates wastewater loading rate to relevant parameters as follows:

W = [4.43C + a(P - ET) - cP] / [y - a - y(d + v)]

where:  W = wastewater application rate (in/yr),
        C = nitrogen uptake by plants (lb/acre/yr),
        a = allowable N concentration in percolate (mg/L),
        P = precipitation (in/yr),
        ET = evapotranspiration losses (in/yr),
        c = N concentration in precipitation (mg/L),
        y = N concentration in applied wastewater (mg/L),
        d = fraction of N lost via denitrification, and
        v = fraction of N lost via ammonia volatilization.

The analysis is conducted here using conditions on Washington Island. For this analysis, a is set at the enforcement standard of 10 mg/L. The value for y is set at 20 mg/L, the maximum total nitrogen concentration typically expected in treatment system effluent. The average value for P is reported to be 28.3 inches on Washington Island. For P - ET, reference is made to Figure 3-2 in EPA's land application manual [24]. All of Wisconsin lies in a broad "bowl" between 0 and -5 inches, with Washington Island's location lying closer to the -5 line. The figure shows ET - P, so the sign is changed for this analysis. Since a lower value of P - ET yields a more conservative (higher) uptake requirement, a value of 2 inches is used here. For c, a value of 0.5 mg/L is very conservative, based on observations by the National Atmospheric Deposition Program. The value for v is set at zero, since little volatilization is expected from a subsurface disposal field. For d, a value of 0.25 should be conservative, as the arguments set forth above indicate.

Plugging these values into the equation and rearranging to solve for C yields:

C = (5W - 5.85) / 4.43

It is seen that the annual nitrogen uptake requirement C is dependent upon the wastewater loading rate W. A nominal design loading rate of 0.15-0.25 gal/ft2/day is proposed for the disposal fields to be used in the Town of Washington management system. Using the middle of this range, 0.2 gal/ft2/day, W is equivalent to 117.1 inches/year. From the above, this yields a required uptake of 131 lb/acre/yr. This value is below the potential annual nitrogen uptake by plants typical of those planted over disposal fields in this climate.
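
The arithmetic above can be checked by rearranging the Loehr et al. equation to solve for the required uptake C at a given application rate W. The following minimal Python sketch uses the parameter values stated in the text and reproduces both the 117.1 in/yr conversion and the 131 lb/acre/yr result.

    def required_uptake_lb_acre_yr(W, a, y, P, P_minus_ET, c, d, v):
        """
        Rearrange W = [4.43C + a(P - ET) - cP] / [y - a - y(d + v)]
        to solve for the plant uptake C (lb/acre/yr) required at a given
        wastewater application rate W (in/yr).
        """
        return (W * (y - a - y * (d + v)) - a * P_minus_ET + c * P) / 4.43

    # 0.2 gal/ft2/day expressed as inches per year (231 in3 per gallon over 144 in2).
    W_in_per_yr = 0.2 * 365.0 * 231.0 / 144.0              # ~117.1 in/yr

    # Parameter values used in the text for Washington Island.
    C = required_uptake_lb_acre_yr(W=W_in_per_yr, a=10.0, y=20.0, P=28.3,
                                   P_minus_ET=2.0, c=0.5, d=0.25, v=0.0)
    print(f"W = {W_in_per_yr:.1f} in/yr, required uptake C = {C:.0f} lb/acre/yr")   # ~131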

This analysis indicates that average concentration over the annual cycle of total nitrogen in water that percolates out of the root zone would be below 10 mg/L, the NR 140 enforcement standard, when a denitrifying pretreatment system and a near surface disposal system are used. However, there would be excursions from this average through the annual cycle. During the growing season, much of the nitrogen would be eliminated by plant uptake. Also during this portion of the year, ET losses would often create sufficient matric potential, so that much of the effluent would be retained in the near-surface soil horizons, which should increase the portion of the nitrogen eliminated by denitrification. It follows that the instantaneous nitrogen concentration of the percolate is expected to be far below 10 mg/L during much of the growing season. Through the winter, however, less of the nitrogen in the effluent would be eliminated. Instantaneous concentration in the percolate during this period would be higher, at times perhaps even approaching that in the applied effluent. Though annual average concentration would be below 10 mg/L, it is unlikely that the instantaneous concentration in percolate would be less than 10 mg/L at all times throughout the year.

The limits of NR 140 are stated in terms of concentrations, implying that the concentration standard must be met at all times, regardless of the degree to which mass loadings were attenuated. While it is again emphasized that the vadose zone beneath the disposal field is not a point of standards application for NR 140, it is obvious that if nitrogen concentration in the percolate could be held below the enforcement standard at all times, this percolate could not possibly ever cause the concentration in groundwater to exceed this standard. Because -- due to the expected temporal fluctuation in nitrogen removal by plant uptake and in-soil denitrification -- it cannot be guaranteed that percolate concentration would remain below the limit, the potential impact of this percolate on local groundwater quality must be evaluated.

Dilution becomes a consideration at this point. Recharge of the aquifer -- and of shallow groundwater as well -- would derive from rainfall and snowmelt percolating over the entire area, not just from water percolating in wastewater system disposal fields. In the Town of Washington, for example, something like 400 sand filter systems might be installed over the next 20 years. Assuming an average field size of 2,000 ft2, this implies a total disposal field area of 800,000 ft2, which equates to 18.3 acres or 0.029 square miles. This area is about one-tenth of one percent of the island's 30 square mile area.
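
The area arithmetic in the preceding paragraph is reproduced below as a minimal sketch; the 400-system count and 2,000 ft2 average field size are the assumptions stated in the text.

    FT2_PER_ACRE = 43_560
    ACRES_PER_SQ_MI = 640

    systems = 400
    avg_field_ft2 = 2_000

    total_ft2 = systems * avg_field_ft2            # 800,000 ft2
    acres = total_ft2 / FT2_PER_ACRE               # ~18 acres
    sq_mi = acres / ACRES_PER_SQ_MI                # ~0.029 sq mi
    island_fraction = sq_mi / 30.0                 # share of the island's ~30 sq mi

    print(f"{total_ft2:,} ft2 = {acres:.2f} acres = {sq_mi:.3f} sq mi "
          f"({100.0 * island_fraction:.2f}% of the island)")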

Dilution effects dictate that percolate from such a limited area -- even if it contains an average concentration of 20 mg/L at all times, implying no nitrogen removal in the soil at any time -- would have negligible effect on nitrogen concentration in groundwater generally. Because mass loadings are expected to be attenuated to the point that annual average nitrogen concentration in disposal field percolate would be less than 10 mg/L, violation of the groundwater nitrogen concentration standard at the micro scale should be precluded as well, even in an area with a high concentration of sand filter/LPD systems. The actual impact at any point would depend upon density of these systems, soil type, depth at which effluent is injected into the soil, temporal variations in climate, type and condition of the plant cover, microclimate of the site, groundwater dynamics, and the actual nitrogen loadings from wastewater system effluent. In any case, it is extremely unlikely that nitrogen contributions from these wastewater systems would cause violations of the enforcement standards.

5.3 Phosphorus

There are no standards for phosphorus in NR 140. However, when the receiving groundwater connects with surface water -- as it probably does in many shoreline areas -- phosphorus removal in the soil system may be critical to maintaining surface water quality. Reneau, et al. [4] list the mechanisms for removing phosphorus from effluent water in the soil system as plant uptake, biological immobilization, and adsorption or precipitation processes. The latter include physical adsorption, chemisorption, anion exchange, surface precipitation, and precipitation as separate solid phases. In conventional soil absorption disposal systems, these are by far the dominant modes of removal. These mechanisms are complex and not completely understood, and it is often uncertain whether phosphorus is being adsorbed or precipitated in any given instance, leading to use of the term "sorption" to cover these processes. [24]

Removal and immobilization is dependent upon the availability of "sorption sites" to bind the phosphorus. While it is possible to exhaust these sites, most soils have a very high capacity for sorption of phosphorus. Sorption sites are provided by clay and organic fractions of the soil. Therefore, sandy soils typically have lower capacities than clayey soils. The removal process typically starts with a "fast" sorption reaction, followed by a slower immobilization due to the formation of low solubility precipitates. Soils generally have a greater capacity for retention of phosphorus than is predicted by adsorption theories, since sites "regenerate" as precipitation proceeds. [3,4,11,23,24,26,30]

Sawhney & Hill [30] note the varying sorption capacities of different soils and also that, because sorption capacity is finite, the geometry of the disposal system affects the length of time required to "saturate" the soil to any given depth. They point out, for example, that the geometry of a leaching pit provides less soil volume for sorption within a given distance from the pit than that of a trench. The types of improved disposal fields discussed in this report spread effluent over a much larger volume of soil than would a conventional trench. Sawhney & Hill also state that deeper soil layers generally have a lower phosphorus sorption capacity, so phosphorus removal would be enhanced in shallow LPD systems.

As noted, in the absence of a significant clay fraction, organic matter can provide phosphorus sorption sites. So another advantage of shallow disposal systems in sandy soils is the higher organic content of near-surface soil horizons. Reneau, et al. [4] report that effective phosphorus removal is typically achieved in all but coarse sandy soils. They also report that phosphorus is effectively sorbed under saturated conditions. Therefore, for sites where phosphorus removal may be significant to surface water quality, sorption would continue after effluent water exited the vadose zone. Shoreline setback requirements dictate that this effluent would flow with groundwater for some considerable distance before joining surface waters, greatly increasing the amount of soil available to sorb the phosphorus before it reaches surface waters.

For the same reason that nitrogen uptake would be enhanced, shallow disposal fields would also maximize phosphorus removal by plant uptake. Though typical mass loadings of phosphorus in any reasonably sized disposal field would be well in excess of typical uptake rates, this mechanism could still provide significant removal. For example, if effluent contains 10 mg/L of phosphorus and is loaded at a rate of 0.15 gal/ft2/day, the annual mass loading rate would be about 200 lb/acre/year. EPA [24] lists annual phosphorus uptake by Kentucky bluegrass as 40 lb/acre/year, or about 20% of the annual load.
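
The phosphorus loading figure above follows from the same mass loading arithmetic used for nitrogen; a minimal, self-contained check:

    L_PER_GAL = 3.785
    MG_PER_LB = 453_592
    FT2_PER_ACRE = 43_560

    # 10 mg/L phosphorus applied at 0.15 gal/ft2/day.
    p_load_lb_acre_yr = 10.0 * 0.15 * FT2_PER_ACRE * 365.0 * L_PER_GAL / MG_PER_LB
    print(f"P loading: {p_load_lb_acre_yr:.0f} lb/acre/yr")   # ~200; bluegrass uptake ~40 lb/acre/yr (~20%)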

5.4 Bacterial Pathogens

Septic tank effluent can contain a significant number of pathogens. Gerba & Goyal [12] report that septic tanks remove 50-90% of bacteria, none of the protozoan cysts, and 50-90% of helminth eggs from domestic wastewater. A number of sources report the level of fecal coliforms -- the standard "indicator" bacteria -- in septic tank effluent at 10^6-10^8 MPN/100 ml. [11,31,32] Once discharged to the soil system, pathogens can be removed by filtration, sedimentation and adsorption. [4,12,33,34] Being relatively large, cysts and eggs are more readily filtered out in the soil system than bacteria, so if conditions for removal of the latter are favorable, the former should be removed as well.

Physical straining (filtration) is the main limit to travel of bacteria, so bacterial removal efficiency is typically inversely proportional to soil particle size. [11,33] Hagedorn, et al. [33] report on studies which showed reduction of bacterial levels in septic tank effluent to those obtained from "control" soil samples within 61 cm (2 feet) of trench bottoms. They concluded, based on this and similar observations, that "... approximately 30-90 cm [1-3 feet] of soil beneath the base of the drainfield trench was adequate for complete bacterial removal of [sic] septic effluents provided the soil has both a layer permeable to effluent flow [to assure unsaturated flow] and another region adequately restrictive to form a clogged zone." [emphasis added]

Similarly, Tyler, et al. [3] state: "At a distance of 1 foot into the soil surrounding the trench there was a 3 Log reduction in bacterial numbers and within the second foot counts are to the acceptable range for a fully treated wastewater." But again this degree of removal assumes the presence of a sufficiently fine-grained soil and/or sufficient crusting in the trench to assure unsaturated flow in coarser textured soils. Underscoring this point, Converse, et al. [35] reported incomplete bacteria removal at 3 feet below the trench in a silt loam soil, which they attributed to saturated flow conditions created by uneven distribution and consequent localized overloading.

These observations highlight the vulnerability of conventional soil disposal systems, especially in coarser-grained soils, which cover many of the sensitive watersheds in Wisconsin. Note in particular the critical function of the clogging mat in obtaining high bacteria removal. The manner in which this biomat is formed and maintained makes operation of a conventional system a delicate balancing act between good filtration and too much clogging, which would result in "hydraulic" failure of the field. [11,12,26,33] Thus, maintenance of unsaturated flow at all points in the field is problematic.

Once retained in the soil, pathogenic bacteria would eventually die off. Factors affecting their survival in soil are listed by Canter & Knox [11] and by Frankenberger [34] to include moisture content, moisture holding capacity, temperature, pH, presence of organic matter, and antagonism from soil microflora. Survival increases with soil moisture, indicating that injection of wastewater nearer the surface -- where ET losses would lead to lower moisture levels over much of the year -- would be detrimental to bacterial survival rates. Intermittent dosing, with its alternating wetting and drying cycles, should also decrease survival. This effect may be minimal when employing a short-term dose/rest loading cycle, but it would be most accentuated in coarse-textured soils with low moisture holding capacity. Poor distribution in conventional gravity-dosed systems results in constant high wetness in those areas receiving the loadings, a factor that would be eliminated in a pressure-dosed system. Antagonistic microflora are likely to be more abundant in near-surface horizons, again favoring shallow placement.

Adsorption can also play a significant role in bacterial removal. Canter & Knox [11] state that this process "... appears to be significant in soils having pore openings several times larger than typical sizes of bacteria"; that is, in coarse-grained soils. Adsorption becomes increasingly effective with increasing clay content and organic fraction. [11,33] In many coarse-grained soil profiles, surface soils tend to have a higher clay content than lower horizons. As has been noted, the organic fraction of a soil profile is largely contained in the upper horizons. This implies that removal through adsorption would be more effective with near-surface disposal methods.

Reports by Reneau, et al. [4] and by Hargett [18] indicate the benefits of employing improved disposal methods. Several studies of septic tank effluent disposal in shallow LPD systems led to the conclusion that a separation of two feet or less between the trench and a limiting condition would achieve practically complete elimination of indicator bacteria. Mote and Buchanan [36], using a modified field design in which measures were taken to preclude effluent transport through large soil pores, observed practically complete removal of bacteria from septic tank effluent at a depth of 18 inches below the point of injection. Duncan, et al. [37] also observed complete elimination of fecal coliforms from septic tank effluent at the 18 inch depth in columns loaded in a manner which simulated an LPD system. A study of pressure-dosed mound systems also found that seepage at the toe of the mound, implying saturated conditions very near the sand/soil interface, contained very low indicator bacteria counts. [38]

This latter finding points out the benefits of pretreatment before disposal into the soil system. The mound is essentially a buried sand filter. It was stated by Anderson, et al. [39] that sand filters "... are capable of reducing total and fecal coliforms by 2 to 4 logs, producing effluent values ranging from 1,000 to 100,000 [for total coliforms] and 100 to 3,000 per 100 ml [for fecal coliforms]."
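
As a rough illustration of what a 2 to 4 log reduction implies for the raw counts reported earlier, the short sketch below (written in Python purely for illustration, and not drawn from any cited study) applies such a reduction to septic tank effluent counts in the 10^6-10^8 range:

    def apply_log_reduction(influent_count, log_reduction):
        """Return the count remaining after a given base-10 log reduction."""
        return influent_count / (10 ** log_reduction)

    for influent in (1e6, 1e8):      # septic tank effluent range cited above
        for logs in (2, 4):          # reduction credited to sand filter pretreatment
            remaining = apply_log_reduction(influent, logs)
            print(f"influent {influent:.0e}, {logs}-log reduction -> about {remaining:.0e}")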

Achieving this level of reduction in the pretreatment system should allow a further reduction in depth to a limiting horizon in the soil system. Bouma [40] observes that "[c]olumn studies have indicated that two feet of sandfill and one foot of natural topsoil can remove pathogenic bacteria and viruses." This finding was the basis of the mound system, which is employed on sites with 1-2 feet of natural soil depth to a limiting condition. Also, Converse, et al. [35] found essentially complete removal of fecal coliforms at a depth of 12 inches in natural soil below a code standard pressure-dosed field receiving aerobically treated effluent containing fecal coliform counts on the order of 10^1-10^4 MPN/100 ml.

Duncan, et al. [37] investigated the impact of effluent quality on soil depth required to remove indicator bacteria from percolating effluent. They employed undisturbed columns of a loamy soil which were loaded in a manner that imitated an LPD system. A simulated trench bottom loading rate of about 0.5 gal/ft2/day was employed in the study. For those columns receiving sand filtered effluent containing a mean fecal coliform count of 170/100 ml, they found no coliforms remaining at only six inches below the simulated trench bottom. It is notable that this finding was made using an experimental setup which forced all effluent to percolate downward. The physical configuration precluded any lateral diffusion of effluent. As discussed previously, matric potential would effect significant dispersal from shallow, narrow LPD system trenches into the intertrench area.

To summarize, using near-surface disposal, light areal loading rates and uniform distribution methods should improve the capability of any soil type to remove bacteria from percolating water. Field design modifications to minimize flow through large soil pores would further decrease the potential for movement of bacteria with the percolating water. Also, pretreatment can significantly reduce the total amount of bacteria introduced into the soil system, decreasing chances for transport to groundwater regardless of the type of disposal system employed. Studies to date indicate that, if sand filter or equivalent pretreatment and proper disposal field design were employed, something like 12 inches of soil would suffice to essentially eliminate indicator bacteria from the percolate. Again noting the impact of dilution, it is extremely unlikely that wastewater systems designed in this manner would cause violations of NR 140 standards.

5.5 Viral Pathogens

There is no enforcement standard for viruses in NR 140. But viruses constitute a significant threat to public health. So it should be reasonably assured that wastewater management systems regulated by DILHR would not result in an increased risk of introducing viruses into groundwater.

Virus removal appears to be problematic, regardless of the type of system or the depth to a limiting condition. Most studies of virus removal have been conducted in "artificial" environments, and their applicability to septic tank drainfields is somewhat questionable. [33] Factors impacting upon the removal of viruses in the soil system are noted in several works. [11,12,26,34,41] They include flow rate, soil characteristics, pH, cations present, characteristics of the virus, and the content of soluble organic matter in the water.

Since viruses are extremely small, the primary removal mechanism in any type of soil is adsorption. [4,12,33,34] In fact, Frankenberger [34] states: "Virus removal from percolating wastewater is almost totally dependent on adsorption to various soil components." Good removal therefore depends on assuring that the factors noted above are favorable to adsorption and retention of virus.

Lower flow rates favor virus removal. Hagedorn, et al. [33] state that "... the most important characteristic in virus retention is effluent flow velocity." As with bacterial removal, maintaining unsaturated flow appears to be critical to good removal. This does not simply decrease local flow velocities, but also creates better contact between percolating water and potential adsorption sites. The further below field capacity that soil moisture level is maintained, the more intimate that contact becomes, enhancing the prospects for adsorption. Reports that adsorption increases with decreasing soil moisture confirm this point. [41]

Soil characteristics influencing virus movement include clay content, cation exchange capacity, chemical composition, and organic matter content. Higher clay content enhances virus retention due to the high cation exchange capacity and large surface area per volume of clays. It has also been observed that metal complexes readily sorb viruses to their surfaces, so the presence of iron oxides and the exchangeable aluminum content of soils increases adsorption. [11,34] Organic matter in soil also offers a large number of active sites for adsorption. Lacking the adsorption capacity provided by clays, organic matter is particularly important in sandy and silty soils. [26,34]

While soil organic matter enhances adsorption, organic matter in the effluent can retard adsorption. Green & Cliver [42] observed a lower retention in a sand that had been "conditioned" with septic tank effluent than in a clean sand. Lower adsorption was surmised to be due to the presence of soluble organic matter in the water, which competed with viruses for adsorption sites on the soil particles. This circumstance is reported to not only hinder adsorption of viruses, but also to result in elution of those already adsorbed. [11,12,34] This points out the benefit of pretreatment to reduce organic content of the applied effluent.

Ionic strength of the water is also a factor in virus retention. It appears that cations neutralize or reduce the repulsive electrostatic potential between viruses and soil components, thus favoring adsorption. [11,12,26,34] Conversely, low ionic strength hinders adsorption and can even cause desorption. As a result, viruses that were previously adsorbed can be eluted by low ionic strength rainwater during heavy rainfalls and transported with the percolating rainwater. [4,26,34]

Lower pH favors virus adsorption. This is explained by Gerba & Goyal [12] as being due to the impact of pH on the net charge of viruses and soil components. Soils tend to be electronegative at neutral pH, so adsorption is not favored, since viruses are also negatively charged particles in this pH range. However, they state that, if pH of the water is lowered, "... protonation causes decreased ionization of virion carboxyl groups and increased ionization of amino groups. As a result, viruses become less electronegative or even electropositive at lower pH levels." Gerba & Goyal also observe, however, that pH dependence is not clear cut, because various soil components display different isoelectric points, and there is a lack of information on the isoelectric points of many pathogenic viruses. For present purposes, it is worth noting that nitrification of ammonium would tend to depress pH of soil water, perhaps favoring increased adsorption of at least some viruses.

Available studies generally indicate that good virus removal is typically achieved after a fairly short length of travel. Results of investigations reported by Frankenberger [34] showed high degrees of removal by filtration through 2 feet or less of a variety of soil types. Green & Cliver [42] found that, subject to maintenance of favorable conditions, two feet of sand column were sufficient to achieve high degrees of seeded virus removal. Also, Hargett [18] reports that studies of LPD systems indicated that two feet of soil was adequate to remove viruses. This is not guaranteed, however. A study in Florida reported by Anderson, et al. [43] indicated that, although there was probably a low frequency of occurrence, viruses were recovered from groundwater after filtration of septic tank effluent through four to five feet of sandy soil.

A "wild card" in any prediction of virus removal capability is the variability among viruses. A lack of uniformity in removal rates of different viruses is noted by several works [11,12,34,41]. Gerba & Goyal [12] report studies where, under similar conditions, virus adsorption efficiency ranged from 0 to 99.9% for various types of viruses.

Review of all factors impacting upon virus removal suggests that alternative practices may enhance removal. Pretreatment to reduce BOD5 to low levels before disposal would enhance adsorption. Near-surface disposal is favored because clay content of coarse-grained soils is typically highest in the surface horizons, and because soil organic matter content would be higher there in all soils. Pressure distribution would enhance removal by maintaining low flow velocities at all points in the field, as opposed to the localized overloading which typically occurs in a conventional system. Intermittent loading of wastewater also tends to enhance virus immobilization, an effect which is probably related to the varying moisture states so created. [12,34]

The sand filter pretreatment system should also directly reduce the mass loading of viruses into the soil system. Studies indicate that viruses are effectively removed by filtration through fine sands. [44,45] Because of practical operational considerations, coarser filter media would be employed in sand filter systems such as those proposed by the Town of Washington. However, the same processes acting in the soil would be effective to some degree within the sand filter bed, so some removal capability should still be expected.

Due to variability in viruses and to the possibility for elution subsequent to adsorption, complete removal would always be an uncertain proposition in any type of system and with any depth of unsaturated flow to a limiting condition. But it can be concluded that alternative treatment and disposal systems like those reviewed in this report would generally maximize the prospects for virus removal. Because the factors impacting on removal efficiency are related to the character of the soil environment and are only obliquely dependent on the length of the flow path, this conclusion is relevant where the depth to a limiting condition in these alternative systems is less than the code minimum of three feet for on-site systems in Wisconsin.

5.6 Trace Organics

As used here, the term "trace organics" refers to potentially toxic synthetic organic chemicals. These substances can be introduced into domestic wastewater streams by household cleansers, by septic tank cleaning agents, by washing out pesticide or herbicide containers in the sink, etc. Concern about these chemicals has increased as improved analytical methods have been able to detect their presence in waters, down to levels on the order of one part per billion. NR 140 sets limits on trace organics which range as low as 0.00007 micrograms/liter. In the soil system, trace organics may be removed from effluent water by sorption, volatilization, and biological degradation. [26,46,47]

Based on several observations, biological degradation appears to be the most important removal mechanism. Nellor, et al. [46] concluded this from observing rapid infiltration systems. Crites [26] reported studies where 10-90% of various compounds were degraded in the natural soil system, but when the soil was autoclaved, there was no degradation. Wilson, et al. [48] observed that organics which were removed when infiltrated with wastewater moved readily through the soil when infiltrated with tap water. Likewise, Sauer & Tyler [49] found that the concentration of volatile organic chemicals was lower in leachate when it was added to their system with septic tank effluent than when added with tap water. Bicki & Lang [50] found almost complete removal of atrazine and alachlor by conventional septic system trenches in silt loam. They stated that removal appeared to be due mainly to retention and subsequent degradation in the clogging mat.

These results also indicate that adsorption may aid in biological degradation by immobilizing the chemicals and thus providing time for degradation to be more complete. As outlined previously in regard to other pollutants, near-surface layers would contain more adsorption sites, especially in the coarser-grained soils. The biological degradation process itself should also be enhanced by injection of effluent into the biologically active near-surface horizons.

Volatilization may also play a significant role. While a great deal of volatilization within the soil may not be possible, due to restricted air/water contact, even a small amount could be significant to transport of elements present in trace quantities. Chang & Page [47] note that mass transfer through volatilization depends on not only the partition between water and air, but also on movement of the chemical to the soil surface after it is volatilized. Again, near-surface disposal in horizons which are typically more aerated than deeper soils should enhance whatever potential there is for volatilization losses.

Biological treatment processes can remove trace organics from wastewater. [26,47] Removal is accomplished through the same mechanisms which operate in the soil system: volatilization, adsorption onto organic particles, and biological degradation. One study has reported complete removal over a three year period of toxic organics found in septic tank effluent by passage through a sand filter treatment system. [51]

5.7 Trace Inorganics

The trace inorganic substances of most concern are heavy metals. Tables 1 and 2 in NR 140 set limits on several metals of concern to public health and welfare. While their occurrence appears to be sporadic, studies have found the presence of some metals in septic tank effluent. [51] Typically, they are not present in appreciable quantities in the domestic waste stream, but certain circumstances -- e.g., aging interior plumbing systems -- may cause some of them to be present in concentrations well in excess of the NR 140 limits.

Whatever amount is present is expected to be removed in the soil system. [52,53] Long-term accumulation to problematic levels in the soil is also not likely, due to low input concentrations and relatively low hydraulic loading rates. Crites [26] and Page & Chang [53] state that sorption processes are the principal removal mechanism, and these processes are likely to be more effective in the near-surface horizons. Other sources, however, indicate that trace metals would be present as finely divided solids, so that filtration would be the dominant mechanism. [52] If filtration were the dominant removal mechanism, a longer travel path would enhance removal. However, near-surface disposal may be more effective than percolation through a greater depth of coarse-textured soil, since, as noted previously, coarse-grained soils tend to be finer-grained in the shallower horizons.

5.8 Summary

The foregoing discussions provide strong support for alternative wastewater management practices, such as sand filter treatment and LPD disposal systems, as a means of coping with limited soil resources. For all categories of pollutants, it was observed that high quality pretreatment and improved disposal methods would maximize the pollutant attenuation and removal capabilities of the soil system. In each case, these capabilities would be enhanced by employing disposal systems with three key features:

(1) Placement of effluent in the near-surface horizon of the existing soil profile;

(2) Low areal loading rates; and

(3) Uniform distribution with a dosing and resting cycle.

For all categories of pollutants except bacterial and viral pathogens, additional soil depth beyond the existing near-surface horizon does not appear to appreciably enhance removal. For bacterial pathogens, it appears that 12 inches of suitable soil below the point of effluent injection would be sufficient when sand filter pretreatment is employed. For viruses, though complete removal is problematic under any circumstances, this same 12-inch depth should provide highly reliable removal when factors favoring it are optimized.

 

6. TOWN OF WASHINGTON PROJECT DISPOSAL FIELD PERFORMANCE

As a part of the project demonstrating the performance of sand filter/LPD systems, the Town of Washington attempted to monitor percolate quality at about one foot below the infiltrative surface. Because of difficulties encountered in attempting to collect vadose zone samples, only one of the demonstration project disposal fields had a monitoring device that was likely to have captured effluent which had percolated through fairly undisturbed soil. (Another effort is under way, being directed by Wiersma and Stieglitz at the University of Wisconsin-Green Bay, which is expected to provide more reliable data.) A look at the results from that system provides an indication of the effectiveness of that approach. It also offers clues as to how disposal field function can be improved.

The disposal field for this system was installed in sandy soil. In this soil, it was possible to dig a pit beside the disposal field and to drive a catchment trough through the wall of the pit under a portion of one of the field trenches. The trough was fitted with a drain pipe flowing into a monitoring port. Figure 6 illustrates the monitoring device. Theoretically, the trough would intercept effluent -- and percolating rainfall or snowmelt -- flowing from about a one-foot length of the trench during periods when matric potential was inadequate to keep this water from percolating out of the near-surface soil horizon.

Figure 6 Richter Field Monitoring System (36K)

This disposal field was constructed essentially as illustrated in Figure 4. The trench bottoms were about one foot below the natural ground surface. The entire field area was covered with about eight inches of imported sandy loam to provide increased cover over the pipes for freeze protection. Two monitoring devices were installed, one in the middle of one of the outer trenches and one at the end of one of the inner trenches. The layout of the field and location of monitoring devices is shown on Figure 7.

Figure 7 LPD Disposal Field Layout (24K)

It was intended to place the monitoring trough 12 inches below the trench bottom. However, it was very difficult to keep the trough in the desired position while driving it into the pit wall. Because the trough needed to be sloped toward the pit to allow intercepted percolate to drain into the monitoring port, it had to be driven into the soil at a slight upward angle. As a result, the troughs ended up being less than 12 inches below the trench bottom. The exact depth for each of the troughs was not determined at the time of installation.

Insertion of the troughs caused the soil to "heave", which opened up soil macropores along structural lines or created new cracks. Therefore, the soil between the trough and trench bottom was not truly undisturbed. This problem was more severe for monitoring port no. 1, for which the trough was driven in perpendicular to the trench line.

From the soil reports it can be seen that trench bottoms in this field were at or below the depth of significant root penetration, which indicates that the organic matter content of the soil below the trench is quite low relative to that of soil nearer the original surface. In addition, based upon available water capacity and typical sieve analysis listed in the Door County soil survey [54] for soils on this site, soil at the depth of the trench bottoms is coarser-grained than that nearer the original surface.

With this background on monitoring conditions, the results of tests conducted on samples from the monitoring ports are reviewed. For nitrates, 8 samples were obtained from Port No. 1, and 10 samples were obtained from Port No. 2. The average total nitrogen concentration in system effluent (all of it likely to be converted to nitrate in the soil system) over the monitoring period was 17.4 mg/L. Excluding the system startup period prior to November of 1992, the average drops to 11.2 mg/L. Average nitrate concentration in the samples from Port No. 1 was 5.4 mg/L. One sample exceeded the NR 140 enforcement standard. For Port No. 2, the average nitrate concentration among the samples was 0.7 mg/L. No sample from Port No. 2 exceeded even the preventative action limit.

This disparity in nitrate concentrations between samples from the two ports indicates the possibility that some of the effluent collecting in Port No. 1 was "mainlining" through soil macropores. Note that the "heaving" problem during installation appeared to be more severe for Port No. 1. It is also interesting to note that the four samples collected from this port in the fall of 1993 and the spring of 1994 had nitrate concentrations at or below the preventative action limit, while the four samples collected in the fall of 1992 and the spring of 1993 had concentrations approaching or exceeding the enforcement standard. This circumstance may indicate that the disturbed soil had "settled" and the macropores were no longer such effective conduits.

In any case, the low frequency of sample availability in these ports indicates that the basic design theory of shallow LPD systems is being realized in practice. This can be shown by the following analysis. The average daily flow from this system over the entire monitoring period was 134 gpd, which yields an average loading rate onto the disposal field of 0.11 gal/ft2/day. Since the field design procedure assumed that each trench "feeds" a 4-foot wide strip of soil, a one-foot long section of trench would be loaded with about 0.44 gallons per day at the observed field loading rate. If sampling were conducted once per week, about 3 gallons should collect in the monitoring port if all of the applied effluent percolated directly downward.
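
The arithmetic behind this check can be restated explicitly. The Python sketch below simply recomputes the figures given in the preceding paragraph; the variable names are illustrative, and the one-foot trough length and weekly sampling interval are those described for the monitoring setup above.

    field_loading_gpsfd = 0.11        # observed areal loading rate, gal/ft2/day
    credited_strip_width_ft = 4.0     # each trench is credited with a 4-ft wide strip
    trough_length_ft = 1.0            # the catchment trough spans about 1 ft of trench
    sampling_interval_days = 7.0      # sampling assumed to occur once per week

    # Daily loading on the one-foot trench section feeding the trough:
    section_load_gpd = field_loading_gpsfd * credited_strip_width_ft * trough_length_ft

    # Volume expected in the port per sampling interval if ALL applied effluent
    # percolated straight down (no lateral dispersal, no evapotranspiration):
    expected_volume_gal = section_load_gpd * sampling_interval_days

    print(f"Load on 1-ft trench section: {section_load_gpd:.2f} gal/day")    # about 0.44
    print(f"Expected weekly volume in port: {expected_volume_gal:.1f} gal")  # about 3.1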

Clearly then, much of the time the effluent is being "diffused" throughout the intertrench spaces and is being held in the soil by matric potential. That no water collected in the ports most of the time indicates the effectiveness of this field design at dispersing the effluent and at eliminating a great deal of it through evapotranspiration. This would result in vastly lower mass loadings of nitrates percolating to groundwater over the annual cycle even if that which did percolate were to do so without any removal in the soil system. The monitoring results indicate that concentrations in percolate typically are greatly reduced by some combination of removal and dilution.
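
To put the mass loading argument in concrete terms, the sketch below compares the annual nitrogen mass delivered by deep percolation under a worst case (all effluent percolates with no removal) against a case where dispersal and evapotranspiration retain a large share of the water in the root zone. The 30 percent deep percolation fraction is purely hypothetical, chosen only to illustrate the calculation; the flow and effluent nitrogen concentration are the monitored values reported above.

    GAL_TO_L = 3.785

    avg_daily_flow_gpd = 134.0           # from the monitoring record
    effluent_n_mg_per_l = 11.2           # post-startup average total N, mg/L
    deep_percolation_fraction = 0.30     # HYPOTHETICAL: assumes ET/dispersal retains 70%

    annual_flow_l = avg_daily_flow_gpd * GAL_TO_L * 365.0

    # Nitrogen mass reaching deep percolation if every gallon percolated, no removal:
    worst_case_g = annual_flow_l * effluent_n_mg_per_l / 1000.0

    # Nitrogen mass if only the assumed fraction percolates (still with no removal):
    with_et_g = worst_case_g * deep_percolation_fraction

    print(f"All effluent percolates: about {worst_case_g:.0f} g N per year")
    print(f"With {deep_percolation_fraction:.0%} deep percolation: about {with_et_g:.0f} g N per year")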

In the case of coliform indicator bacteria, 20 samples were obtained from Port No. 1, and 13 samples were drawn from Port No. 2. Treatment system effluent coliform counts were generally in the range of 10^2-10^5 CFU/100 mL, with a majority of counts below 10^4. Out of the samples from Port No. 1, 11 were non-blank, with 8 of them having counts on the order of 10^2 or greater. Of the samples from Port No. 2, only three were non-blank, and only one of these had a significant, quantifiable count (28 CFU/100 mL). Again this disparity between samples from the two ports indicates the possibility of significant macropore flow above the trough feeding Port No. 1. Here also, counts in Port No. 1 samples were typically much lower after the spring of 1993, indicating the possibility that soil consolidation had reduced macropore flow from that point on.

It is also noted that sampling from these ports was not executed in a manner that would minimize the potential for errors due to sample contamination. Because the presence of a sufficient volume of water in the port to provide a sample was a "hit or miss" proposition, the port was not evacuated and disinfected prior to each sampling event. Further, no tests were conducted to determine "background" coliform level. Therefore it may be incorrect to presume that the results observed indicate inability of the system to achieve essentially complete removal of coliform indicator bacteria from treatment system effluent.

Whatever the true efficiency of this disposal field at eliminating nitrogen and coliforms from system effluent, these results were observed from a field design that can be significantly improved upon. Especially in this coarse-textured soil, introducing effluent higher up in the A horizon is expected to be critical to obtaining optimum performance, as the foregoing discussions have pointed out. Further, those discussions indicate that taking measures to minimize macropore flow -- even when that flow is not "enhanced" by the insertion of a monitoring device into the soil -- is certain to improve removal efficiencies. Such design improvements are discussed in the next section.

 

7. MODIFIED AT-GRADE LPD DISPOSAL FIELD DESIGN

Based upon available information about soil attenuation and removal mechanisms, it appears possible to derive a disposal field design which would maximize those mechanisms for situations where soil resources are limited. As noted previously, in addition to the advantages gained by pretreatment to a higher quality before disposal, maximum performance should be obtained by employing three key features: shallow disposal, low areal loading rate, and uniform distribution with a dose/rest loading cycle. The latter is a straightforward hydraulic design problem, already addressed in the current Wisconsin code. Achieving a low areal loading rate is a simple matter of properly sizing the field. So shallow disposal is the major feature remaining subject to design innovations. Along with mechanisms for "manipulating" soil hydraulics, this principle is employed to generate new designs for LPD disposal fields. Those designs are illustrated in Figures 8, 9 and 10.

The basic design concept is predicated largely upon the work of Mote and Buchanan [36], which they styled as a comparison of the Wisconsin at-grade concept with a below-grade concept similar to that shown in Figure 4. Mote and Buchanan evaluated the concept in a situation where there was only 18 inches of soil above a limiting condition (bedrock interface in that particular circumstance). Keeping in mind that the applied wastewater was septic tank effluent -- rather than highly treated sand filter effluent -- they observed a high degree of indicator bacteria removal at that depth. (Numerical results were not reported. This is based upon a graphical representation of the data.) Previous discussions have already noted that, when high quality sand filter effluent is applied to the field, 12 inches of soil is expected to be adequate to achieve effective removal of bacterial pathogens, provided that the available soil is optimally employed. The modified at-grade designs discussed here are variations on the theme presented by Mote and Buchanan which promote optimal use of the available soil.

Figure 8 shows the basic design for sites where at least 12 inches of suitable soil above a limiting condition is available. Rather than a "trench" as it is normally conceived, the lateral lines are placed on a gravel mound which creates a "loading envelope." This is similar to the construction method for the Wisconsin at-grade system. With the effluent being clarified by high quality pretreatment -- so clogging of the infiltrative surface would not be a problem -- a smaller envelope using fine gravel media is employed. Installation is easier with the finer gravel, since it can be placed in one step. The pipe can be laid on the gravel, and after system hydraulics are set up and tested, it can be "worked" into the top of the gravel mound. This avoids a second distribution of gravel to cover the pipe after hydraulic testing.

Figure 8 Modified At-Grade Concept Disposal Field (48K)

Note the stipulation in Figure 8 that the existing surface be tilled. Mote and Buchanan assert that this destroys large pores throughout the tilled depth, which would force flow out of the trench to be in more intimate contact with soil particles. Once so "distributed" through the micropores, the water is unlikely to "regroup" into macropores if it percolated beyond the tilled depth since even small matric potentials would strongly retard this.

Mote and Buchanan also claim advantages for the broader distribution of effluent over a horizontal infiltration surface which their at-grade system afforded. Their arguments are rooted, however, in the assumption that water moves only downward under the influence of gravity. (With septic tank effluent being applied, their strategy would also lower the "face" loading rate, but this would be of minimal concern with sand filter effluent.) As has been demonstrated in the Town of Washington project, matric potential can be relied upon to "disperse" the effluent laterally through the root zone from a narrower loading envelope, even to the extent that deep percolation losses are greatly attenuated over a large part of the year. By assuring that effluent is "absorbed" directly into micropores rather than exiting the loading envelope through macropores, the effects of whatever matric potential is available would be maximized.

This at-grade field design spreads the effluent onto existing surface soils. As the foregoing discussions detailed, this takes maximum advantage of the organic matter and the generally more fine-grained texture of the surface horizon, which is expected to enhance several removal/assimilation mechanisms. This strategy also utilizes the full depth of the existing natural soil for treatment.

After the effluent distribution system is in place, the field area is covered with fill soil. To maximize matric potentials at the point of effluent injection, this cover layer should be as thin as practical. The major limitation on depth of cover is freeze protection. The 12-inch minimum specified in Figure 8 has been observed to be sufficient to preclude lateral line freeze-up in the Town of Washington demonstration systems. If the management system could assure that the field surface was mulched in the fall, or if the field served a seasonal user, the cover could be thinner. Thinner cover would offer the additional advantage of greater root penetration to the depth of effluent injection.

A loading rate (counting the space between laterals) in the range of 0.15-0.25 gallons/ft2/day on fields of this design is recommended to provide the low areal loading rate that previous discussions indicate is critical to good removal of various pollutants. The 4-foot lateral line spacing shown in Figure 8 is an economic compromise. Obviously, using additional laterals more closely spaced (or, more correctly stated, crediting a narrower strip of field area to each linear foot of lateral pipe, thus increasing total required lateral length) would dictate that loading per foot of lateral be lower. With this being the case, smaller matric potentials would be effective at dispersing effluent throughout the credited area. In the Town of Washington demonstration project, 4-foot spacings appeared to provide effective dispersal (and greatly limited deep percolation losses) even in a fairly coarse-textured soil.
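
A minimal sizing sketch using the loading rate range and 4-foot lateral spacing recommended above is given below. The 300 gpd design flow is a hypothetical example value, not a figure taken from the demonstration project.

    def size_lpd_field(design_flow_gpd, loading_rate_gpsfd, lateral_spacing_ft=4.0):
        """Return (field area in ft2, total lateral length in ft)."""
        field_area_sqft = design_flow_gpd / loading_rate_gpsfd
        # Each linear foot of lateral is credited with a strip of soil as wide
        # as the lateral spacing, so the required lateral length follows directly.
        lateral_length_ft = field_area_sqft / lateral_spacing_ft
        return field_area_sqft, lateral_length_ft

    for rate in (0.15, 0.25):            # recommended loading rate range, gal/ft2/day
        area, length = size_lpd_field(300.0, rate)
        print(f"{rate} gal/ft2/day: {area:.0f} ft2 of field, {length:.0f} ft of lateral")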

Figure 9 illustrates the same design concept, except that imported fill soil is substituted for some depth of natural soil. This may be employed on sites where less than 12 inches of existing soil above a limiting condition is available. It is suggested that fill soil be "traded out" for natural soil depth at a rate of 1.5 inches of fill for each 1-inch reduction of natural soil depth. This concept may also be employed where the existing soil, though at least 12 inches deep, is very rocky or sandy to provide additional depth to compensate for the less efficient removal/assimilation capabilities of the existing soil.

Figure 9 Partial Fill Concept Disposal Field (42K)
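
The suggested trade-out of fill soil for natural soil depth can be expressed as a simple rule, sketched below. The 8-inch natural soil depth used in the example is hypothetical.

    TARGET_NATURAL_DEPTH_IN = 12.0
    TRADE_RATIO = 1.5                    # inches of fill per inch of natural soil shortfall

    def required_fill_depth(natural_soil_depth_in):
        """Fill depth (inches) suggested to compensate for shallow natural soil."""
        shortfall = max(0.0, TARGET_NATURAL_DEPTH_IN - natural_soil_depth_in)
        return shortfall * TRADE_RATIO

    print(required_fill_depth(8.0))      # 4-inch shortfall -> 6.0 inches of fill
    print(required_fill_depth(12.0))     # no shortfall -> 0.0 inches of fill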

The rationale for relying on a fill layer to provide treatment goes back to Mote and Buchanan's thesis that tilling the existing surface soil destroys macropores and assures that effluent would be dispersed into and would percolate through micropores. The process of digging, hauling and placing the fill soil constitutes, in effect, a very thorough tilling of this layer. Thus, the fill layer, assuming it is of suitable character, may even provide better treatment than an equivalent depth of natural soil.

Others have offered evidence that a fill layer does indeed enhance treatment of percolating effluent. Mote and Buchanan, citing a work by Bouma, et al. [55], state, "They observed that a surface layer of sand permitted surface-applied water to move through both small and large pores, as opposed to preferentially moving in large pores which occurred without the sand." Converse, et al. [56] studied fecal coliform removal in mound systems and concluded that "... it appears that the sand fill plus the native structured soil provides a better treatment than native structured soil under pressure distribution conditions with minimal clogging mat development." Reneau, et al. [57] investigated septic tank effluent disposal in an LPD field composed entirely of fill, observing that "... 0.76 m [2.5 feet] of soil-fill was effective in reducing fecal coliforms to background levels." This is very similar to observations of coliform removal in natural soils.

A mound system employs sand as the fill material because influent wastewater is septic tank effluent. The mound is, in effect, a buried sand filter. With high quality sand filter effluent being applied to the field, absorption and transmission by a finer-textured fill layer would not be problematic, as it may be with septic tank effluent. Logic dictates that a loamy soil fill, being finer textured, would provide even better treatment of percolating water than would sandy fill. In addition, higher matric potential would be maintained in a finer-grained soil, so that better effluent dispersal would be provided and water retention would be enhanced. Thus, not only would filtration be more effective, but all the removal/assimilation mechanisms which are enhanced by retention in the biologically active near-surface soil horizons would be maximized as well.

If the fill soil were sterile subsoil material mined from a pit, organic matter should be added to it. This could be accomplished by incorporating compost into the fill material. The importance of soil organic matter to some removal/assimilation mechanisms -- especially in soils with low clay content, which is often the case with pit run soil -- was noted in previous discussions. In addition, percolating effluent would be exposed to another zone of high organic matter content when it flowed into the native soil layer.

In Figure 10, an alternative concept for the effluent distribution system is illustrated. A highly porous synthetic material (Enkamat as shown in Figure 10 is one possibility) would replace gravel to provide the storage zone for effluent when it is dosed onto the field. A detail of the lateral pipe/Enkamat assembly is displayed in Figure 11. Having a much higher void volume ratio than gravel, this storage zone would be more compact, which would further increase the "face" loading rate on the infiltrative surface. But again, given the low areal loading rates being proposed, absorption into the soil should not be a problem with high quality sand filter effluent.

Figure 10 Modified At-Grade Concept Disposal Field with Drainline Laterals (42K)

Figure 11 Drainline Lateral Pipe System (42K)

This method of construction should enhance dispersal of effluent laterally through the root zone. The maximum angle of repose of trench gravel dictates that the gravel envelope must have a broad base. The effects of gravity in turn dictate that this basal face would be the major infiltrative surface. Assuming uniform distribution across this face, there would be little moisture gradient in the soil immediately below the gravel over the face width. This concept is illustrated in Figure 12. Because of this, matric potential would be relatively ineffective at drawing moisture laterally -- so dispersing this flow -- except near the edges of the gravel envelope. The concept shown in Figure 10 would create a narrower basal face and would therefore maximize lateral dispersal.

Figure 12 Flow Out of Loading Envelope (30K)

A further benefit of this distribution system is that the height of the pipe above the existing soil surface would be reduced because of the smaller storage zone. Sitting on top of the gravel mound, the lateral pipe in the concept shown in Figure 8 would be some distance above the existing soil surface. Therefore, a higher total depth of fill soil cover would be required to obtain any given depth of pipe cover than would be required for the concept shown in Figure 10. Again, this reduced depth of cover over the existing soil is expected to enhance the positive impacts of whatever matric potential is available.

A practical advantage of this concept is ease of construction. The lateral pipe/Enkamat assemblies could be made up in a shop and easily connected in the field. The most labor intensive part of field construction would be placing the gravel. Because the gravel needs to be placed in a narrow line, this process is typically executed as a "bucket brigade" operation. The advantages of eliminating that process are obvious. (Unfortunately, however, the cost of Enkamat is so high that it more than offsets labor cost savings. A search for less expensive materials to serve its function is under way.)

It is relatively economical to provide a lower areal loading rate with the field design concepts proposed here. Once all the equipment and materials are marshaled, adding a lateral or two to the field imposes a relatively small additional effort. Employing the concept shown in Figure 10, this additional effort is minimized further still. Also, while it would increase materials costs somewhat, using this alternative distribution system would allow closer lateral spacings -- should they be found to enhance system function -- to be installed without much additional effort.

 

8. DRIP IRRIGATION -- THE ULTIMATE DISPOSAL FIELD

The concepts shown in Figures 8, 9 and 10 emulate a drip irrigation system. In fact, when any shallow LPD disposal field is loaded very lightly, it operates like a "pseudo-drip" irrigation system. The trench acts as a continuous, low efficiency "emitter", holding the effluent until it can be absorbed through the trench wall and bottom into the soil. However, a large "slug" of water is made immediately available for absorption, so there is no control of the rate at which the soil is wetted. With little control on the degree of saturation, the likelihood of significant deep percolation losses increases.

If high quality pretreatment is employed to produce a "clean" effluent, use of true drip irrigation hardware would be practical. Drip emitters typically flow at rates in the vicinity of one gallon per hour, so they provide a very slow, controlled wetting of the soil, which minimizes the potential for losses to deep percolation. Drip irrigation fields are typically designed with emitters on relatively short spacings, so that good dispersal of effluent is readily obtained even when there is little matric potential. Further, it would be very cost efficient to provide additional field area in a drip irrigation system, affording lower areal loading rates with a relatively minimal increase in costs.

In short, using drip irrigation hardware maximizes all the benefits which the LPD designs just described aim to provide. Other advantages which are specific to drip irrigation hardware include:

* The irrigation function is very efficient, so beneficial reuse of the effluent water and the nutrients it contains can be maximized. Low areal loading rates based on irrigation demands are typically employed. This maximizes loss of effluent water by evapotranspiration at the expense of deep percolation, and it maximizes nutrient uptake.

* The water is spread evenly over a large area. The emitters are fairly closely spaced, and a small amount of water flows out of each emitter each time the field is dosed. That, plus the low instantaneous flow rate out of each emitter, ensures unsaturated flow of any water which does percolate.

* Drip irrigation lines are typically installed with little cover. As noted previously, shallow disposal helps to maximize pollutant attenuation mechanisms in the soil system.

Of course, in the winter climate of Wisconsin, freezing of water in the drip lines may be an operational problem. Also, as discussed below, some maintenance of drip systems may be required to combat both biological and chemical clogging of emitters. Still, especially for seasonal use and larger installations, drip fields may be a workable option. Two of the Town of Washington demonstration project systems which served seasonal users employed drip irrigation disposal fields. Management strategies like mulching the field in the fall may preclude significant freezing problems for year-round installations as well.

Drip irrigation disposal fields are typically much easier to install than LPD fields. The drip lines may be laid in a narrow, shallow trench or inserted directly into the soil with a vibrating plow. The latter is an especially useful feature when drip lines are being installed in an area with existing turf cover that the system owner wants to preserve. Another option is to lay drip lines on the existing surface and cover the field area with fill soil, just as in the LPD designs. Since the soil is wetted very slowly by a drip irrigation field, it is not critical that the existing surface be tilled to break up the macropores. Where little depth to a limiting condition is available, placing the drip lines on a tilled surface -- or upon a fill layer -- would enhance soil treatment capabilities in the same manner as described for the LPD systems.

Wastewater drip irrigation fields should employ pressure compensating turbulent flow emitters to minimize clogging problems and to provide a fairly even flow rate in lines that are at different elevations. The field typically operates at a pressure of at least 15 psi. Drip hose generally used in wastewater systems is manufactured with emitters built in on 2-foot spacings. A typical layout employs 2-foot spacings between hoses, so one emitter per 4 ft2 of field area is provided, on average.
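
A brief layout sketch using these hardware figures (one emitter per 4 ft2 on average, emitter flow near one gallon per hour as noted earlier) is given below. The design flow and areal loading rate in the example are hypothetical values chosen only for illustration.

    design_flow_gpd = 300.0              # hypothetical design flow
    areal_loading_gpsfd = 0.10           # hypothetical irrigation-based loading rate
    area_per_emitter_sqft = 4.0          # 2-ft emitter spacing x 2-ft hose spacing
    emitter_flow_gph = 1.0               # typical pressure compensating emitter

    field_area_sqft = design_flow_gpd / areal_loading_gpsfd
    emitter_count = field_area_sqft / area_per_emitter_sqft

    # Run time per day to deliver the design flow if all emitters dose at once,
    # ignoring any line-flushing losses:
    run_time_hr = design_flow_gpd / (emitter_count * emitter_flow_gph)

    print(f"Field area: {field_area_sqft:.0f} ft2, emitters: {emitter_count:.0f}")
    print(f"Daily run time: {run_time_hr:.2f} hr ({run_time_hr * 60:.0f} min)")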

Installation details of a drip irrigation reuse/disposal system are illustrated in Figure 13. Even though high quality pretreatment is provided, strainers are installed at the entries to the drip lines as "fail-safe" devices. Typically a field is designed with multiple entries. Dividing the total instantaneous flow rate among more than one strainer minimizes the head loss through each one. Also, plumbing the field in this manner provides parallel flow paths. The rest of the field could be loaded while one area is valved off to make repairs or modifications, allowing the system to continue to operate while the work is being done.

Figure 13 Typical Installation of Drip Irrigation Field (30K)

The downstream ends of the lines fed through each strainer are plumbed together in a header pipe leading to a flush valve. Employing these flush valves minimizes the potential for clogging of emitters. Any debris in the pipes -- such as biological growths on the pipe walls which are sheared off by the incoming dose of effluent -- washes down to the end of the line and flows out the valve, rather than being trapped in the hose and forced into the emitters. The flush valves seal when pressure builds up to about 5 psi, so only a very small amount of water is flushed out each time the system is pressurized. This is quickly absorbed by the soil underlying the valve box. The bottom of the box is lined with gravel to form a storage zone for this flow until it can be absorbed. In effect, this gravel bed is a "mini-drainfield" for flow out of the flush valve.

If drip fields are to be used during the winter in Wisconsin, the entry strainers and flush valves must be freeze-protected. Perhaps the best way to assure this is to contain them within the treatment system tanks. A small pipe from the end of the drip field could be run back to the tank -- in the same trench as the effluent feed line runs -- with the flush valve installed on the end of it inside the tank. This offers the additional advantage of having a long pipe run without any emitters at the end of the line. Any debris in the drip lines would be washed down the system into this pipe, further minimizing the potential for emitter clogging.

Although drip irrigation fields offer significant advantages over LPD fields in regard to both treatment effectiveness and practical installation, these systems do require some routine maintenance which is avoided by using LPD systems. Periodically, the strainer screens must be cleaned, and the hoses should also be thoroughly flushed from time to time. It may also be necessary to periodically dose the system with chlorine to clear out biological clogging and with acid to clear out chemical clogging. The latter occurs when water evaporates out of the emitters between doses, leaving behind the dissolved solids. Unless the source water is very low in dissolved solids, this is a generic problem in drip irrigation systems even when they are fed with tap water. Fortunately, however, all these maintenance activities are required quite infrequently, and the system can be designed so they are fairly easy to execute.

 

9. SUMMARY AND CONCLUSIONS

When the new performance-based DILHR code goes into effect, it will be important for system designers and regulators to have a greater understanding of how and why disposal fields perform, or fail to perform. Recognizing the mechanisms by which various categories of pollutants are removed in the soil system, professionals in this field will be better equipped to choose those management methods which are equal to the constraints of each site. The information in this report offers a detailed understanding of the benefits of various types of disposal field designs.

Of particular note are decentralized wastewater management strategies like that proposed by the Town of Washington. To implement these strategies, approval must be granted to install soil disposal systems on sites with limited soil resources. This report has outlined the conditions required to minimize the impact of these systems on groundwater quality and has proposed methods expected to accomplish that aim. It has been shown how pretreatment to a high quality -- including significant nitrogen removal -- not only reduces mass loadings of pollutants onto the disposal field but also enhances some removal/assimilation mechanisms in the soil. It has also been shown how alternative disposal methods can further improve the capability of the soil system to attenuate pollutants. Employing these methods would allow reduced depth of soil between the point of effluent injection and a limiting condition without compromising groundwater quality.

The basic standard suggested for wastewater systems employing these alternative methods is a minimum of 12 inches of suitable soil above a limiting condition. As detailed in this report, the only pollutant for which there persists any significant degree of uncertainty about the adequacy of this standard is pathogens. But a number of investigations have indicated that, employing sand filter pretreatment and an LPD disposal system, practically complete bacterial removal can be routinely expected at a depth of 12 inches or less in a suitable soil. Further, some of this minimum depth can be composed of fill, and a suitable fill material may actually improve the removal of pollutants.

An effort by UW-Green Bay is currently under way to obtain more reliable data on vadose zone water quality under disposal fields in the existing alternative treatment and disposal systems on Washington Island. One new system, employing a field designed as illustrated in Figure 8, may also be sampled. However, due to dilution and potential attenuation further along the flow path, data on vadose zone water quality would always be of questionable applicability toward determining compliance with NR 140. It would be advantageous in this regard to monitor a very shallow groundwater table below the field. It is suggested that systems of this type be installed on such sites and a monitoring program be executed. By sampling directly from the groundwater pool within or immediately adjacent to the disposal field area, it is much more likely that the true impact on groundwater -- and thus, the true degree of compliance with NR 140 -- would be observed.

 

CITATIONS

  1. Wisconsin Administrative Code, Chapter ILHR 83, "Private Sewage Systems."

  2. Wisconsin Administrative Code, Chapter NR 140, "Groundwater Quality."

  3. E. J. Tyler, R. Laak, E. McCoy and S. S. Sandhu, "The Soil as a Treatment System", Home Sewage Treatment, Proceedings of the Second National Home Sewage Treatment Symposium, ASAE Publication 5-77, 1977, pp. 22-37.
  4. R. B. Reneau, Jr., C. Hagedorn and M. J. Degen, "Fate and Transport of Biological and Inorganic Contaminants from On-Site Disposal of Domestic Wastewater", Journal of Environmental Quality, Vol. 18, No. 2, 1989, pp. 135-144.
  5. M. D. Harper, M. S. Hirsch, C. R. Mote, E. M. Rutledge, H. D. Scott and D. T. Mitchell, "Performance of Three Modified Septic Tank Filter Fields," On-Site Wastewater Treatment, Proceedings of the Third National Symposium on Individual and Small Community Sewage Systems, ASAE Publication 1-82, 1982, pp. 187-196.
  6. J. C. Converse, J. L. Anderson, W. A. Ziebell and J. Bouma, "Pressure Distribution to Improve Soil Absorption Systems," Home Sewage Treatment, Proceedings of the National Home Sewage Disposal Symposium, ASAE Publication PROC-175, 1975, pp. 104-115.
  7. M. E. Ver Hey and W. W. Woessner, "Documentation of the Degree of Waste Treatment Provided by Septic Systems, Vadose Zone and Aquifer in Intermontane Soils Underlain by Sand and Gravel," On-Site Wastewater Treatment, Proceedings of the Fifth National Symposium on Individual and Small Community Sewage Systems, ASAE Publication 10-87, 1987, pp. 77-86.
  8. R. J. Otis, G. D. Plews and D. H. Patterson, "Design of Conventional Soil Absorption Trenches and Beds", Home Sewage Treatment, Proceedings of the Second National Home Sewage Treatment Symposium, ASAE Publication 5-77, 1977, pp. 86-99.
  9. R. J. Otis, J. C. Converse, B. L. Carlile and J. E. Witty, "Effluent Distribution", Home Sewage Treatment, Proceedings of the Second National Home Sewage Treatment Symposium, ASAE Publication 5-77, 1977, pp. 61-85.
  10. C. G. Cogger and B. L. Carlile, "Field Performance of Conventional and Alternative Septic Systems in Wet Soils", Journal of Environmental Quality, Vol. 13, No. 1, 1984, pp. 137-142.
  11. L. W. Canter and R. C. Knox, Septic Tank System Effects on Ground Water Quality, Lewis Publishers, Inc., 1985.
  12. C. P. Gerba and S. M. Goyal, "Pathogen Removal from Wastewater during Groundwater Recharge", Chapter 9 in Artificial Recharge of Groundwater, Takashi Asano, ed., Butterworth Publishers, 1985, pp. 283-317.
  13. C. Cogger, B. L. Carlile, D. Osborne and E. Holland, "Design and Installation of Low-Pressure Pipe Waste Treatment Systems", UNC Sea Grant College Publication UNC-SG-82-03, 1982.
  14. B. L. Carlile, "Use of Shallow, Low-Pressure Injection Systems in Large and Small Installations", Individual Onsite Wastewater Systems, Proceedings of the NSF Sixth National Conference, N. I. McClelland, ed., Ann Arbor Press, 1979, pp. 371-385.
  15. M. P. Ronayne, R. C. Paeth and T. J. Osborne, "Intermittent Sand Filter Design and Performance -- an Update", paper produced by Oregon State Department of Environmental Quality (received from Ronayne in January 1991, publication containing this paper not determined).
  16. R. L. Siegrist, "Hydraulic Loading Rates for Soil Absorption Systems Based on Wastewater Quality", On-Site Wastewater Treatment, Proceedings of the Fifth National Symposium on Individual and Small Community Sewage Systems, ASAE Publication 10-87, 1987, pp. 232-241.
  17. D. Venhuizen, "Exploration of Treatment Technology and Disposal System Alternatives", paper written in support of Town of Washington (Wisconsin) Wastewater System Feasibility Study, March 1991.
  18. D. L. Hargett, "Performance Assessment of Low Pressure Pipe Wastewater Injection Systems", On-Site Wastewater Treatment, Proceedings of the Fourth National Symposium on Individual and Small Community Sewage Systems, ASAE Publication 7-85, 1985, pp. 131-143.
  19. T. L. Loudon and G. L. Birnie, Jr., "Performance of Trenches Receiving Sand Filter Effluent in Slowly Permeable Soil", On-Site Wastewater Treatment, Proceedings of the Sixth National Symposium on Individual and Small Community Sewage Systems, ASAE Publication 10-91, 1991, pp. 313-323.
  20. J. C. Lance, "Nitrogen removal by soil mechanisms", Journal of the Water Pollution Control Federation, Vol. 44, No. 7, July 1972, pp. 1352-1361.
  21. A. M. Petrovic, "The Fate of Nitrogenous Fertilizers Applied to Turfgrass", Journal of Environmental Quality, Vol. 19, No. 2, 1990, pp. 1-14.
  22. R. Laak, "On-Site Soil Systems, Nitrogen Removal", Alternative Wastewater Treatment, A. S. Eikum and R. W. Seabloom, eds., D. Reidel Publishing Co., 1982, pp. 129-143.
  23. F. E. Broadbent and H. M. Reisenauer, "Fate of Wastewater Constituents in Soil and Groundwater: Nitrogen and Phosphorus", Chapter 12 in Irrigation with Reclaimed Municipal Wastewater -- A Guidance Manual, G. S. Pettygrove and T. Asano, eds., Lewis Publishers, Inc., 1988.
  24. U. S. EPA, Process Design Manual for Land Treatment of Municipal Wastewater, Office of Water Program Operations, EPA 625/1-77-008, 1977.
  25. B. R. Whelan, "Disposal of Septic Tank Effluent in Calcareous Sands", Journal of Environmental Quality, Vol. 17, No. 2, 1988, pp. 272-277.
  26. R. W. Crites, "Micropollutant Removal in Rapid Infiltration", Chapter 20 in Artificial Recharge of Groundwater, Takashi Asano, ed., Butterworth Publishers, 1985, pp. 579-608.
  27. M. T. Hoover, A. Amoozegar and D. Weymann, "Performance Assessment of Sand Filter, Low Pressure Pipe Systems in Slowly Permeable Soils of a Triassic Basin", On-Site Wastewater Treatment, Proceedings of the Sixth National Symposium on Individual and Small Community Sewage Systems, ASAE Publication 10-91, 1991, pp. 324-337.
  28. L. W. Stewart and R. B. Reneau, Jr., "Shallowly Placed, Low Pressure Distribution System to Treat Domestic Wastewater in Soils with Fluctuating High Water Tables", Journal of Environmental Quality, Vol. 17, No. 3, 1988, pp. 499-504.
  29. J. D. Novak, ed., Land Application of Wastes, Van Nostrand Reinhold Co., 1979.
  30. B. L. Sawhney and D. E. Hill, "Phosphate Sorption Characteristics of Soils Treated with Domestic Waste Water", Journal of Environmental Quality, Vol. 4, No. 3, 1975, pp. 342-346.
  31. B. P. Scherer and D. T. Mitchell, "Individual Household Surface Disposal of Treated Wastewater without Chlorination", On-Site Wastewater Treatment, Proceedings of the Third National Symposium on Individual and Small Community Sewage Systems, ASAE Publication 1-82, 1982, pp. 207-214.
  32. D. K. Sauer and W. C. Boyle, "Intermittent Sand Filtration and Disinfection of Small Wastewater Flows", Home Sewage Treatment, Proceedings of the Second National Home Sewage Treatment Symposium, ASAE Publication 5-77, 1977, pp. 164-174.
  33. C. Hagedorn, E. L. McCoy and T. M. Rahe, "The Potential for Ground Water Contamination from Septic Effluents", Journal of Environmental Quality, Vol. 10, No. 1, 1981, pp. 1-8.
  34. W. T. Frankenberger, Jr., "Fate of Wastewater Constituents in Soil and Groundwater: Pathogens", Chapter 14 in Irrigation with Reclaimed Municipal Wastewater -- A Guidance Manual, G. S. Pettygrove and T. Asano, eds., Lewis Publishers, Inc., 1988.
  35. J. C. Converse, M. E. Kean, E. J. Tyler and J. O. Peterson, "Bacterial and Nutrient Removal in Wisconsin At-Grade On-Site Systems", On-Site Wastewater Treatment, Proceedings of the Sixth National Symposium on Individual and Small Community Sewage Systems, ASAE Publication 10-91, 1991, pp. 46-61.
  36. C. R. Mote and J. R. Buchanan, "System Design for Enhanced Wastewater Renovation in Shallow Soils", draft copy of paper submitted for presentation at the Seventh National Symposium on Individual and Small Community Sewage Systems, 1994.
  37. C. S. Duncan, R. B. Reneau, Jr., and C. Hagedorn, "Impact of Effluent Quality and Soil Depth on Renovation of Domestic Wastewater", draft copy of paper submitted for presentation at the Seventh National Symposium on Individual and Small Community Sewage Systems, 1994.
  38. J. Bouma, J. C. Converse, R. J. Otis, W. G. Walker and W. A. Ziebell, "A Mound System for Onsite Disposal of Septic Tank Effluent in Slowly Permeable Soils with Seasonally Perched Water Tables", Journal of Environmental Quality, Vol. 4, No. 3, 1975, pp. 382-388.
  39. D. L. Anderson, R. L. Siegrist and R. J. Otis, Technology Assessment of Intermittent Sand Filters, U. S. EPA, Municipal Environmental Research Laboratory, April 1985.
  40. J. Bouma, "Innovative On-Site Soil Disposal and Treatment Systems for Septic Tank Effluent", Home Sewage Disposal, Proceedings of the National Home Sewage Disposal Symposium, ASAE Publication PROC-175, 1975, pp. 152-162.
  41. M. Y. Corapcioglu and A. Haridas, "Transport and Fate of Microorganisms in Porous Media: A Theoretical Investigation", Journal of Hydrology, Vol. 72, 1984, pp. 149-169.
  42. K. M. Green and D. O. Cliver, "Removal of Virus from Septic Tank Effluent", Home Sewage Disposal, Proceedings of the National Home Sewage Disposal Symposium, ASAE Publication PROC-175, 1975, pp. 137-143.
  43. D. L. Anderson, A. L. Lewis and K. M. Sherman, "Human Enterovirus Monitoring at Onsite Sewage Disposal Systems in Florida", On-Site Wastewater Treatment, Proceedings of the Sixth National Symposium on Individual and Small Community Sewage Systems, ASAE Publication 10-91, 1991, pp. 94-104.
  44. M. Gross and D. Mitchell, "Biological Virus Removal from Household Septic Tank Effluent", On-Site Wastewater Treatment, Proceedings of the Fourth National Symposium on Individual and Small Community Sewage Systems, ASAE Publication 7-85, 1985, pp. 295-304.
  45. Ayres Associates, "Onsite Sewage Disposal System Research in Florida -- Progress Report", Florida Department of Health and Rehabilitative Services, 1989.
  46. M. H. Nellor, R. B. Baird and J. R. Smyth, "Health Aspects of Groundwater Recharge", Chapter 11 in Artificial Recharge of Groundwater, Takashi Asano, ed., Butterworth Publishers, 1985, pp. 283-317.
  47. A. C. Chang and A. L. Page, "Fate of Wastewater Constituents in Soil and Groundwater: Trace Organics", Chapter 15 in Irrigation with Reclaimed Municipal Wastewater -- A Guidance Manual, G. S. Pettygrove and T. Asano, eds., Lewis Publishers, Inc., 1988.
  48. J. T. Wilson, C. G. Enfield, W. J. Dunlap, R. L. Cosby, D. A. Foster and L. B. Baskin, "Transport and Fate of Selected Organic Pollutants in a Sandy Soil", Journal of Environmental Quality, Vol. 10, No. 4, 1981, pp. 501-506.
  49. P. A. Sauer and E. J. Tyler, "Volatile Organic Chemical (VOC) Attenuation in Unsaturated Soil Above and Below an Onsite Wastewater Infiltration System", On-Site Wastewater Treatment, Proceedings of the Sixth National Symposium on Individual and Small Community Sewage Systems, ASAE Publication 10-91, 1991, pp. 76-85.
  50. T. J. Bicki and J. M. Lang, "Fate of Pesticides Introduced Into On-Site Sewage Disposal Systems", On-Site Wastewater Treatment, Proceedings of the Sixth National Symposium on Individual and Small Community Sewage Systems, ASAE Publication 10-91, 1991, pp. 86-93.
  51. R. J. Otis, D. L. Anderson and R. A. Apfel, "Onsite Sewage Disposal System Research in Florida: An Evaluation of Current OSDS Practices in Florida", Florida Department of Health and Rehabilitative Services, 1993.
  52. A. C. Chang and A. L. Page, "Soil Deposition of Trace Metals during Groundwater Recharge Using Surface Spreading", Chapter 21 in Artificial Recharge of Groundwater, Takashi Asano, ed., Butterworth Publishers, 1985, pp. 609-626.
  53. A. L. Page and A. C. Chang, "Fate of Wastewater Constituents in Soil and Groundwater: Trace Elements", Chapter 13 in Irrigation with Reclaimed Municipal Wastewater -- A Guidance Manual, G. S. Pettygrove and T. Asano, eds., Lewis Publishers, Inc., 1988.
  54. Soil Conservation Service, "Soil Survey of Door County, Wisconsin", U. S. Department of Agriculture, 1978.
  55. J. Bouma, C. F. M. Belmans and L. W. Dekker, "Water Infiltration and Redistribution in a Silt Loam Subsoil with Vertical Worm Channels", Journal of the Soil Science Society of America, Vol. 46, 1982, pp. 917-921.
  56. J. C. Converse, E. J. Tyler and S. G. Littman, "Nitrogen and Fecal Coliform Removal in Wisconsin Mound System", draft copy of paper submitted for presentation at the Seventh National Symposium on Individual and Small Community Sewage Systems, 1994.
  57. R. B. Reneau, Jr., C. Hagedorn and W. L. Daniels, "On-Site Wastewater Treatment and Disposal Systems on Reclaimed Mined Lands", draft copy of paper provided to author by Reneau, undated.


IMAGES (.gif files in a ZIP archive, available for download)

State of Wisconsin Memo (45K)