CHLORINE DISINFECTION BYPRODUCT MANAGEMENT IN RHODE ISLAND SOURCE WATERS USING LED-BASED UV-C DISINFECTION

High loads of natural organic matter (NOM) in source waters increase levels of toxic disinfection byproducts (DBPs) during treatment, including trihalomethanes (THMs) and haloacetic acids (HAAs), which are formed when NOM is chlorinated. Rates of NOM loading and, by extension, DBP formation potential vary spatially and temporally and depend on land use within the watersheds. While non-chemical disinfection is typically based on mercury UV lamps, LED-based disinfection systems are being considered as an energy-efficient alternative, since they require a fraction of the energy used by mercury lamps. This study explores the efficacy of a novel water treatment process for bacterial removal and DBP management that combines conventional NOM removal processes with LED-based UV-C and chlorine as primary and secondary disinfectants, respectively. Samples were collected from urban, agricultural, and forested watersheds during the summer and fall of 2018. Results show that LED UV-C with secondary chlorination removed all bacteria while producing 25% of the THMs and HAAs formed through conventional treatment during summer sampling, regardless of land use. However, increased lignin-based plant matter from defoliation during the fall inhibited conventional NOM removal, increasing turbidity and reducing UV transmittance. Additionally, due to the high concentration of NOM, DBP formation exceeded MCLs during the fall season. Therefore, consideration needs to be given not only to alternative disinfection strategies, but also to more efficient NOM removal processes that will reduce byproduct formation during disinfection and increase UV transmittance.


INTRODUCTION
Global water resources are under severe stress from over-pumping and contamination, while climate change-induced extreme weather events and unpredictable weather patterns will further degrade the quality of existing water resources (Arnell, 1999). Although droughts and floods have received much attention recently, of equal concern is the quality of existing water resources and the means with which to treat them.
Moreover, many drinking water treatment plants were designed with the expectation of stationarity, or consistent water quality parameters, such as natural organic matter (NOM) loading and turbidity (Miley et al., 2015). High rates of NOM loading are particularly problematic due to their reaction with chlorine during disinfection, forming carcinogenic disinfection byproducts (DBPs), such as trihalomethanes (THMs) and haloacetic acids (HAAs).
The presence of DBPs is a major challenge in drinking water treatment; as a result, water managers often face the dual task of eliminating harmful pathogens while managing DBP levels. Since their discovery in 1974, DBPs have come under regulation: the United States Environmental Protection Agency (USEPA, 2012) currently regulates four trihalomethanes (THM4) and five haloacetic acids (HAA5) at maximum contaminant levels (MCLs) of 80 and 60 µg/l, respectively. Although many drinking water reservoirs maintain a relatively consistent level of total NOM, composition may change with land use or seasonally as a result of events such as defoliation, snowmelt, and low precipitation (Vaughn et al., 2017).
Consequently, treatment processes for NOM removal, such as flocculation, can be adversely affected by changes in NOM composition, such as an increase in lignin during the fall season or defoliation events caused by pests (Kelly et al., 2018).
Large municipal systems mitigate these risks through best management practices, such as forested buffer zones for source water protection. However, the majority of drinking water systems in the U.S. are classified as "small water systems" and usually do not have the resources to take the mitigative steps available to large municipalities; consequently, most EPA water quality violations occur in small systems. To offset these shortfalls, many end-users install point-of-entry/point-of-use (POE/POU) treatment systems in their households.
One such strategy is the use of non-chemical disinfectants such as ultraviolet (UV) light, which has few known toxic byproducts (Chowdhury et al., 2009). UV is effective in inactivating microorganisms, including bacteria such as E. coli and chlorine-resistant protozoa such as Cryptosporidium and Giardia, by disrupting their RNA or DNA, thereby eliminating their ability to reproduce (Bolton and Cotton, 2008). However, while UV is efficient in inactivating bacteria, there is a risk of bacterial reactivation in the distribution system, necessitating secondary chemical disinfection, such as lower doses of chlorine. The assumption is that UV irradiation before chlorination lowers the necessary chlorine concentration compared to what is used when chlorine is the primary disinfectant. Even in point-of-use/point-of-entry scenarios, in which water is held in storage for up to 24 hours after disinfection, bacteria have the potential to regrow in as little as 8 hours after chemical or UV disinfection (Lantagne et al., 2012; Faghihzadeh et al., 2018).
Despite the benefits of UV disinfection, most irradiation systems are mercury-based, which poses a number of challenges for water treatment. The fragile lamps are a health hazard if broken, which occurs frequently during transportation, operation, and replacement (Gray, 2015). They are also prone to biofouling as a result of the high temperatures they generate, requiring more maintenance (Würtele, 2011). This often requires highly trained staff and capital equipment, which are not always available to small water systems. Moreover, the common low-pressure mercury lamps are monochromatic, limited to a fixed wavelength of 254 nm, whereas maximum efficiency is reached at around 260 nm (Würtele, 2011). Between 260 and 270 nm, the inactivation of viruses and protozoa is slightly improved (Vilhunen, 2009; Würtele, 2011). UV-C LED systems are offered in a wide variety of wavelengths, including the desired 265 nm wavelength. UV-C LEDs also have significant advantages over mercury UV lamps, including easier disposal (absence of mercury), instant on-off operation that requires no warm-up time, lower power usage, longer life, efficient transfer of energy into light, and pulsing, which could potentially increase energy output (Vilhunen et al., 2009; Würtele et al., 2011). Moreover, UV-C LED efficiency can be further enhanced through improved irradiation system designs, allowing for a wider range of applications than is typically available for mercury lamps (Würtele et al., 2011).
However, many of these studies only examine the efficacy of UV-C LED lamps in stand-alone, static experiments, with few studies in the context of a full water treatment system. The goal of this study is to assess the efficacy of UV-C LED as a primary disinfectant for point-of-use treatment of effluent water from conventional small water systems. It considers seasonal and temporal stresses on small water systems and the ability of UV-C LED to manage DBP formation while eliminating harmful bacteria.

Introduction
Global water resources are under severe stress from over-pumping and contamination. Moreover, climate change-induced extreme weather events and unpredictable weather patterns will further deplete existing water resources [1]. While water shortages from droughts have received a lot of attention, such as Day Zero in South Africa, of equal concern is the quality of existing water resources and the means with which to treat them.
The issue becomes more challenging in developing countries, especially in rural, decentralized communities, which are not connected to larger, urban municipal water supply systems. As a result, more resilient and innovative treatment systems that can cost-effectively address a wide range of contaminants are needed, while requiring relatively minimal maintenance.
This study assessed the efficacy of an innovative treatment system that uses readily available materials for conventional treatment, such as sand and activated carbon for filtration, and an LED-based UV-C disinfection system to replace chlorination as a primary disinfectant. In comparison to traditional mercury lamps, UV-C LEDs have many unique features that improve inactivation efficiency, including multiple wavelengths and pulsed illumination [8]. Moreover, UV-C LED efficiency can be further enhanced through improved reactor designs, allowing for a wider range of applications than is typically available for mercury lamps [8]. Currently, most UV disinfection uses high- or low-pressure mercury lamps. These require special training, need to be replaced frequently, and pose a severe contamination risk if they break [8,9]. LED-based UV-C systems, on the other hand, require minimal maintenance and have a significantly longer lifespan, making them better suited for small, sustainable treatment systems. Moreover, the challenge of disposing of spent mercury lamps as hazardous waste is removed, since LED systems are mercury-free [8].

Study Site
A non-urban, forested watershed (Cork Brook) in the northern region of the state of Rhode Island was selected for this study. The Cork Brook is a significant tributary of the Scituate Reservoir, which supplies around 60% of the state's population with drinking water [10].

Bacterial Treatment
Experiments were conducted to determine the efficacy of the UV-C LED system for total coliform and E. coli inactivation. Natural water was collected from a local river and filtered through a dual anthracite/sand column to remove turbidity. The effluent was then pumped through the UV-C LED system at a flow rate of 12 ml/min. Total coliform and E. coli were analyzed in the source water prior to filtration, after filtration, and after irradiation using the IDEXX Colilert-18 method [11].

Bench Scale Experiment
A conventional treatment train modelled after a local water treatment utility formed the basis of our experiments and included coagulation, flocculation, and anthracite/sand filtration. Three benchtop experiments were conducted in parallel:
1. Conventional treatment with higher-dosed chlorine as the primary disinfectant (CPD) (2 ± 0.05 mg/l).
2. LED-based, continuous UV as the primary disinfectant with lower-dosed chlorine as a secondary disinfectant (UVPD) (0.5 ± 0.05 mg/l).
3. LED-based, continuous UV as the primary disinfectant with the addition of a granular activated carbon (GAC) filter and lower-dosed chlorine as a secondary disinfectant (GAC + UVPD) (0.5 ± 0.05 mg/l).
The primary differences between the three experiments were in the disinfection procedure, as well as the addition of an activated carbon filter in the third treatment train.
The ferric sulfate flocculent (75 mg/l) and the dual sand/anthracite filter media used in the experiment were sourced from a local water utility. 50 grams of utility-grade anthracite were packed on top of 25 grams of silica sand in a 16-inch acrylic column.
Washed gravel was used to contain the filter media. Through jar tests, the addition of 75 mg/l of ferric sulfate and a pH adjustment to 5.6 were determined to be the optimal conditions for removing natural organic matter. After pH adjustment and the addition of ferric sulfate, raw water samples were rapid-mixed at a velocity gradient of 750 s⁻¹ for 30 minutes. After 30 minutes of settling time, samples were flocculated at a velocity gradient of 90 s⁻¹ for an additional 30 minutes. The treated water was then pumped through the anthracite/sand filter at a flow rate of 12 ml/minute using Teflon tubing.
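The mixing intensities above are specified as velocity gradients (G, in s⁻¹). As a hedged sketch, G for a mechanically stirred jar can be estimated from power input using Camp's relation G = √(P/μV); the power inputs below are illustrative values chosen to reproduce the study's targets, not measurements from this work.

```python
import math

def velocity_gradient(power_w, viscosity_pa_s, volume_m3):
    """Camp's velocity gradient G = sqrt(P / (mu * V)), in s^-1."""
    return math.sqrt(power_w / (viscosity_pa_s * volume_m3))

# Illustrative values: a 2-liter jar of water at 20 C (mu ~ 1.0e-3 Pa*s).
mu = 1.0e-3       # dynamic viscosity of water at 20 C, Pa*s
volume = 2.0e-3   # 2-liter jar, in m^3

# Hypothetical stirrer power inputs that yield the study's target gradients.
g_rapid = velocity_gradient(1.125, mu, volume)   # rapid mix, ~750 s^-1
g_floc = velocity_gradient(0.0162, mu, volume)   # flocculation, ~90 s^-1
print(round(g_rapid), round(g_floc))
```

This also shows why flocculation uses far less power than rapid mixing: power scales with the square of G.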
Samples were transferred to 950 ml amber jars, in which they were chlorinated with a sodium hypochlorite solution. CPD was dosed at 2 mg/l, while both UVPD and GAC + UVPD were dosed at 0.5 mg/l. The samples were then incubated at a constant temperature of 20°C ± 1°C for 24 hours.

Bacterial Inactivation
Raw river water had initial concentrations of 35 CFU/100 ml and 5 CFU/100 ml for total coliforms and E. coli, respectively. After filtration, bacterial concentrations remained mostly unchanged, with total coliforms remaining at 35 CFU/100 ml and E. coli reduced to 4 CFU/100 ml. After both conventional treatment and irradiation by UV-C LED, total coliform and E. coli concentrations were reduced to <1 CFU/100 ml, meeting drinking water standards [14].
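The reductions above can be expressed as log-inactivation values, the standard disinfection metric. A minimal sketch, assuming the "<1 CFU/100 ml" results sit at a detection limit of 1 CFU/100 ml (so the computed value is a lower bound):

```python
import math

def log_reduction(influent_cfu, effluent_cfu):
    """Log reduction value (LRV): log10(N0 / N)."""
    return math.log10(influent_cfu / effluent_cfu)

# Total coliforms: 35 CFU/100 ml in, <1 CFU/100 ml out (detection limit = 1),
# so the achieved reduction is at least this value.
lrv_tc = log_reduction(35, 1)
print(f"total coliform LRV > {lrv_tc:.2f}")
```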

Trihalomethane Formation
There was a background concentration of 5.7 µg/l of total trihalomethanes (TTHMs) in the source water, with chloroform being the dominant species. Removal rates for TOC and reduction of UV254 absorbance were similar for the two conventional treatment trains.
TOC decreased by 70%, from an initial concentration of 5.7 mg/l in the raw source water to 1.7 mg/l for both the CPD and UVPD treatment trains, below the 2 mg/l EPA limit [13].
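The 70% figure follows directly from the influent and effluent concentrations; a quick sketch of the removal calculation:

```python
def percent_removal(influent_mg_l, effluent_mg_l):
    """Percent of a constituent removed across a treatment step."""
    return 100.0 * (influent_mg_l - effluent_mg_l) / influent_mg_l

# TOC: 5.7 mg/l in raw source water -> 1.7 mg/l after treatment
print(round(percent_removal(5.7, 1.7)))  # -> 70
```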
Afterwards, samples were transferred to 40 ml amber vials pretreated with sodium thiosulfate to neutralize chlorine and were sent to the Department of Civil and Environmental Engineering at the University of Massachusetts Amherst for trihalomethane analysis. A modified version of the EPA 551.1 method for analyzing trihalomethanes was used.
Additionally, effluent samples at every step of the treatment train were taken and analyzed for DBP precursors, including non-purgeable organic carbon (NPOC) and UV-254 absorbance, which were determined using the combustion catalytic oxidation method and EPA Method 415.3, respectively [10]. NPOC was used instead of total organic carbon/dissolved organic carbon since some samples had levels of inorganic carbon that would interfere with results [13]. SUVA was derived by dividing UV-254 by NPOC.
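The SUVA derivation above is a one-line calculation. SUVA254 is conventionally reported in L/mg·m, so the cm⁻¹ absorbance is multiplied by 100 (cm to m) before dividing by the organic carbon concentration; the input values below are illustrative, not measurements from this study.

```python
def suva254(uv254_per_cm, npoc_mg_l):
    """Specific UV absorbance (L/mg-m): UV254 (1/cm) x 100 / NPOC (mg/l)."""
    return uv254_per_cm * 100.0 / npoc_mg_l

# Illustrative values: absorbance 0.114 1/cm, NPOC 5.7 mg/l
print(round(suva254(0.114, 5.7), 2))  # -> 2.0 L/mg-m
```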


Removal of Total Coliform and E. coli

The addition of the GAC filter to the dual media sand/anthracite filter further reduced TOC levels by 93%, to 0.93 mg/l. Although TOC removal rates for the CPD and UVPD treatment trains were similar, TTHM production was significantly different. The higher chlorine dose (2 mg/l) in the CPD experiment increased TTHM production to 8.54 µg/l, a 33% increase from background levels. In contrast, the lower chlorine dose (0.5 mg/l) for UVPD increased TTHMs to just 6.13 µg/l, a 7% increase from background levels.
Though TOC removal for UVPD with the addition of a GAC filter was higher than for UVPD alone, TTHM production was similar at 6.05 µg/l, a 5.8% increase from background levels.
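The percent-increase figures above appear to be expressed relative to the final (treated-water) TTHM concentration rather than the 5.7 µg/l background; a short sketch that reproduces the reported values under that assumption:

```python
def percent_increase(background_ug_l, final_ug_l):
    """TTHM increase as a percentage of the final (treated) concentration,
    which reproduces the figures reported in the text."""
    return 100.0 * (final_ug_l - background_ug_l) / final_ug_l

background = 5.7  # background TTHM concentration in source water, ug/l
print(f"CPD:      {percent_increase(background, 8.54):.1f}%")  # ~33%
print(f"UVPD:     {percent_increase(background, 6.13):.1f}%")  # ~7%
print(f"GAC+UVPD: {percent_increase(background, 6.05):.1f}%")  # ~5.8%
```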

Conclusion
All treatment trains were effective at inactivating total coliform and E. coli.
Further studies are needed to investigate a broader spectrum of source water compositions. Although TTHM formation in this study was relatively low and near background levels for all three treatment trains, differences in removal efficiency may become more apparent if raw source water quality is less pristine than in this case study. For example, in water bodies near urban or agricultural areas, where organic carbon and other nutrient loading are significantly higher due to increased anthropogenic activity, DBP formation potential is likely to be significantly higher. Therefore, the significance of using UVPD and UVPD with a GAC filter may become more apparent in scenarios where concentrations of TOC and other DBP precursors are higher in the source water.

Introduction
Global water resources are under severe stress from over-pumping and contamination, while climate change-induced extreme weather events and unpredictable weather patterns will further degrade the quality of existing water resources (Arnell, 1999). Although droughts and floods have received much attention recently, of equal concern is the quality of existing water resources and the means with which to treat them.
Moreover, many drinking water treatment plants were designed with the expectation of stationarity, or consistent water quality parameters, such as natural organic matter (NOM) loading and turbidity (Miley et al., 2015). High rates of NOM loading are particularly problematic due to their reaction with chlorine during disinfection, forming carcinogenic disinfection byproducts (DBPs), such as trihalomethanes (THMs) and haloacetic acids (HAAs).
The presence of DBPs is a major challenge in drinking water treatment; as a result, water managers often face the dual task of eliminating harmful pathogens while managing DBP levels. Since their discovery in 1974, DBPs have come under regulation: the United States Environmental Protection Agency currently regulates four trihalomethanes (THM4) and five haloacetic acids (HAA5) at maximum contaminant levels (MCLs) of 80 and 60 µg/l, respectively (USEPA, 2018).
Although many drinking water reservoirs maintain a relatively consistent level of total NOM, composition may change with land use or seasonally as a result of events such as defoliation, snowmelt, and low precipitation (Vaughn et al., 2017). Consequently, treatment processes for NOM removal, such as flocculation, can be adversely affected by changes in NOM composition, such as an increase in lignin during the fall season or defoliation events caused by pests (Kelly et al., 2018).
Large municipal systems mitigate these risks through best management practices, such as forested buffer zones for source water protection. However, the majority of drinking water systems in the U.S. are classified as small water systems, defined by the EPA as water systems serving 10,000 or fewer customers (USEPA, 2016). These systems usually do not have the resources to take the mitigative steps available to large municipalities, especially during emergencies; consequently, most EPA water quality violations occur in small systems. The Federal Emergency Management Agency (FEMA) has detailed recommendations for disinfecting water during emergency situations, such as floods and hurricanes, using household bleach (FEMA, 2017). However, these come with risks, as improper handling and storage of chlorine can reduce its efficacy in treating bacteria. Moreover, there is also the risk of increased DBP formation if too much chlorine is added.
To offset these shortfalls, non-chemical disinfectant point-of-use (POU) treatment systems have been considered as an alternative, especially ultraviolet (UV) light, which has few known toxic byproducts (Chowdhury et al., 2009). UV is effective in inactivating microorganisms, including bacteria such as Escherichia coli (E. coli) and chlorine-resistant protozoa such as Cryptosporidium and Giardia, by disrupting their RNA or DNA, thereby eliminating their ability to reproduce (Bolton and Cotton, 2008). However, while UV is efficient in inactivating bacteria, there is a risk of bacterial regrowth in as little as eight hours, especially in POU scenarios where water is not immediately consumed and can be stored for up to 24 hours (Faghihzadeh et al., 2018). This requires secondary disinfection, often in the form of a smaller concentration of chlorine; the assumption is that UV irradiation before chlorination lowers the necessary chlorine concentration compared to what is used when chlorine is the primary disinfectant.
Despite the benefits of UV disinfection, most irradiation systems are mercury-based, which poses a number of challenges for water treatment. The fragile lamps are a health hazard if broken, which occurs frequently during transportation, operation, and replacement (Gray, 2015). They are also prone to biofouling as a result of the high temperatures they generate, requiring more maintenance (Würtele, 2011). Consequently, specialized training in lamp operation and replacement is sometimes required, which is not ideal in an emergency situation. Moreover, the common low-pressure mercury lamps are monochromatic, limited to a fixed wavelength of 254 nm, whereas maximum efficiency is reached at around 260 nm (Würtele, 2011). UV-C LED systems, by contrast, are offered in a variety of wavelengths and have been studied as an alternative (Vilhunen et al., 2009; Würtele et al., 2011). However, many of these studies only examine the efficacy of UV-C LED lamps in stand-alone, static experiments, with few studies in the context of a POU system. The goal of this study is to assess the efficacy of UV-C LED as a primary disinfectant for point-of-use treatment of effluent water from conventional small water systems during emergencies, when violations are likely to occur. It will consider seasonal and temporal stresses on small water systems and the ability of UV-C LED to manage DBP formation while eliminating harmful bacteria. It is hypothesized that using UV-C LED as a primary disinfectant with a smaller concentration of chlorine will be as effective for bacterial inactivation as chlorine as a primary disinfectant. It is also expected that the UV-C LED treatment train will produce fewer disinfection byproducts overall than chlorine as a primary disinfectant. To test these hypotheses, disinfection byproducts (THMs and HAAs) and bacteria in treated water from both disinfection methods will be compared.
Temporal and spatial variables will also be factored in to test these hypotheses under different scenarios.

Description of Study Sites
Seasonal effects and land use/land cover (LULC) play critical roles in the loading of natural organic matter, including the presence of humic and tannic acids and other DBP precursors, due to a combination of factors including snowmelt, rainstorms, and fall foliage (Vaughn et al., 2017). Samples were collected during high-flow events in the summer (June, July, August) and fall (October, November) to capture temporal variations in NOM characteristics. Samples were also collected during storm events to simulate emergency scenarios in which raw source water has elevated TOC levels, bacteria, turbidity, and other disinfection byproduct precursors. LULC and watershed management also affect how much stormwater runoff enters waterways, causing considerable variations in rates of loading and affecting source water quality and treatment processes (Singer et al., 2006). Samples were collected from Cork Brook, Bailey Brook, and the Maidford River (Figure 1). Cork Brook is an influent stream feeding into the Providence Water Supply's reservoir in Scituate, Rhode Island, characterized by a 13,000-acre forested buffer zone. Bailey Brook and the Maidford River, which are major influent streams of Newport Water Supply's main reservoir in Newport, RI, are characterized as coastal urban and agricultural watersheds, respectively (Zhou et al., 2010). Additionally, all three sites are characterized by post-glacial landscapes common in New England.
Samples were collected during storm events in early August as part of the summer sampling (precipitation > 0.5 inches) and in early November for the fall sampling (average precipitation > 1 inch) (NOAA National Centers for Environmental Information). Fall samples were collected during a period of major leaf-off following the late-October peak fall foliage in Rhode Island (Zielinski et al., 2005).

Experimental Method
Benchtop experiments were run to simulate an operational treatment plant. The dual media filter was allowed to ripen into a biofilter over the course of two months, which is known to be more effective than a non-biofilter in removing microorganisms. Washed gravel was used to contain the filter media. Water was pumped from the flocculator to the filter and UV-C LED system using PTFE tubing (1/16" diameter) and a peristaltic pump (Fisherbrand™ Variable-Flow Peristaltic Pump) at a flow rate of 25 ml/minute. Experiments were run at room temperature, 23 ± 2°C (Vilhunen et al., 2009). Every treatment experiment was repeated three times consecutively. Samples were kept in refrigerated storage at 4°C until processing.
All glassware, including that used for THM and HAA analysis and for chlorination, was soaked for 24 hours in a soap bath, rinsed three times with deionized water, and then placed in an acid bath (5% sulfuric acid) for an additional 24 hours. Glassware was then rinsed again three times with deionized water and heated for at least 12 hours at 300°C. Raw water samples were collected in 7-gallon jerry cans and stored in a refrigerator at 4°C. Samples were then transferred to three 2-liter flocculation jars, which were placed in a Lovibond® ET 750 six-station laboratory floc tester. Ferric sulfate (Fe2(SO4)3) was used as a flocculant due to its known effectiveness in separating organic matter (Aguilar, 2003). A total of 50 mg of ferric sulfate was added to each of the 2-liter jars, resulting in a concentration of 25 mg/l. The pH was adjusted to 5.6 for optimal use of the flocculant, using either 1 M reagent-grade sodium hydroxide or sulfuric acid, depending on the water's initial pH (Abdessemed, 2000). Samples were mixed rapidly at a velocity gradient of 750 s⁻¹ for 10 minutes and allowed to settle for 30 minutes. Water samples were then flocculated at a velocity gradient of 90 s⁻¹ for 30 minutes and settled for at least one hour before filtration. After settling, the water was pumped through a dual media sand/anthracite biofilter using a peristaltic pump at a flow rate of 25 ml/minute. The system was purged with one liter of deionized water after each sample run. Samples were not collected post-filtration until the biofilter was purged with an additional 500 ml of sample water that had been flocculated and coagulated.

Disinfection
After filtration, water for the CPD train was transferred directly to a 950 ml amber glass jar for chlorination. For the UVPD and UV-CPD treatment trains, water was pumped through the UV-C LED chamber (residence time of 8 minutes) before being transferred to the amber jars for disinfection. Concentrations of 2 mg/l of chlorine, typical for point-of-use disinfection in non-turbid waters, were added to the CPD and UV-CPD treatment trains. A concentration of 0.5 mg/l of chlorine was added to the UVPD treatment train; at that chlorine concentration, UV irradiation was found effective for inactivating bacteria while limiting DBP formation. A pH of approximately 6 was maintained during disinfection using 1 M sodium hydroxide or sulfuric acid as needed (Cowman and Singer, 1996). After the addition of chlorine, samples were incubated at a temperature of 20°C ± 1°C for 24 hours, a typical storage time in POU/POE scenarios. All samples were disinfected within three hours of each other. Samples were not collected from the UV-C LED chamber until it was purged with 500 ml of coagulated/flocculated and filtered sample water.
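The doses and residence time above imply straightforward sizing arithmetic. A hedged sketch: the 1000 mg/l working-stock strength is an assumption for illustration (the text does not report the hypochlorite stock concentration), while the 25 ml/minute flow rate is the pump rate reported for the bench apparatus.

```python
def stock_volume_ml(target_mg_l, sample_ml, stock_mg_l):
    """Volume of hypochlorite stock needed for a target dose (C1*V1 = C2*V2)."""
    return target_mg_l * sample_ml / stock_mg_l

def chamber_volume_ml(flow_ml_min, residence_min):
    """UV chamber volume implied by flow rate and hydraulic residence time."""
    return flow_ml_min * residence_min

# Assumed 1000 mg/l working stock; 950 ml sample jars.
print(stock_volume_ml(2.0, 950, 1000))  # CPD / UV-CPD dose -> 1.9 ml of stock
print(stock_volume_ml(0.5, 950, 1000))  # UVPD dose -> 0.475 ml of stock
print(chamber_volume_ml(25, 8))         # implied chamber volume -> 200 ml
```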
At the end of the 24 hours, samples were collected for E. coli, THM, and HAA analysis. THM and HAA samples were transferred to 40 ml amber volatile organic analysis (VOA) vials. The VOA vials were pretreated to quench residual chlorine, with 3 mg of sodium thiosulfate for THM analysis and 6 mg of ammonium chloride for HAA analysis.

Analytical Procedures
Natural organic matter concentrations, including total organic carbon and dissolved organic carbon, were determined using the combustion catalytic oxidation method (Shimadzu TOC-L Analyzer). Raw water samples were tested for dissolved organic carbon (DOC) and were filtered using a 0.45 µm filter prior to analysis. Due to instrument limitations, total organic carbon (TOC) was not measured for raw water samples. However, TOC measurements were taken for samples after pre-disinfection treatment (coagulation, flocculation, and filtration). Samples for UV254 absorbance were also filtered with a 0.45 µm filter, and absorbance was determined using EPA Method 415.3 (Shimadzu UV-2600 UV-Vis Spectrophotometer). Additionally, raw water samples were scanned from 600 nm to 190 nm to identify additional peaks and changes in intensity. Percent aromaticity was approximated from a linear model (Appendix E), where y = percent aromaticity and x = SUVA254. The model was determined using 13C NMR measurements of water samples collected from diverse surface water environments and correlated with their respective SUVA254 values; a strong correlation (R2 = 0.97) was found between high SUVA254 values and high 13C NMR aromaticity values. However, it must be noted that while the correlation is significant, percent aromaticity can only be approximated using this model. Nine standard HAA analytes containing chlorine and bromine were measured using the modified EPA Method 552.2. They were classified by the type of halogen substitution (bromine or chlorine) and the number of halogen substitutions (mono-, di-, or tri-haloacetic acids). These classifications are important for understanding the types of DBPs formed, as each class follows a distinct formation pathway when reacting with chlorine. Additionally, four standard THM analytes were measured using EPA Method 551.1 and were also classified as chlorinated or brominated.
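The aromaticity estimate described above is an ordinary least-squares regression of ¹³C NMR percent aromaticity on SUVA254. Since the fitted coefficients are not reproduced here, the sketch below runs the procedure on hypothetical paired data; the numbers are illustrative stand-ins, not the study's calibration data.

```python
# Hypothetical paired observations: SUVA254 (L/mg-m) vs. percent aromaticity.
suva = [1.0, 2.0, 3.0, 4.0, 5.0]
aromaticity = [11.0, 17.0, 24.0, 30.0, 37.0]

def linfit(xs, ys):
    """Ordinary least-squares fit y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    m = sxy / sxx
    return m, my - m * mx

def r_squared(xs, ys, m, b):
    """Coefficient of determination for the fitted line."""
    my = sum(ys) / len(ys)
    ss_res = sum((y - (m * x + b)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    return 1 - ss_res / ss_tot

m, b = linfit(suva, aromaticity)
r2 = r_squared(suva, aromaticity, m, b)
print(f"y = {m:.2f}x + {b:.2f}, R^2 = {r2:.3f}")
```

Applying the fitted line to a measured SUVA254 value (x) then yields the approximate percent aromaticity (y), with the caveat noted in the text that this is only an approximation.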
As summarized in Table 6, turbidity was also much higher in the fall than in the summer, with an average of 25.08 NTUs compared to 3.15 NTUs for the summer. E. coli levels increased from summer to fall by 177% for Cork Brook, 604% for the Maidford River, and 392% for Bailey Brook. Additionally, the UV-VIS spectra of raw water samples in the 600 nm to 190 nm range were similar for all sites and seasons (Appendix C).

Raw Water Characteristics
Moreover, the highest absorbance for each sample was in the 400 nm to 200 nm range, suggesting that the natural organic matter was primarily composed of humic acids. The similarity of the spectra alone does not confirm that NOM composition was the same across all samples. However, the varying areas under the spectra do suggest variable concentrations of NOM in each sample, which is consistent with the DOC measurements for each sample.

Treatment: TOC Removal
All samples were pretreated for TOC removal before disinfection using the same conventional techniques of flocculation/coagulation with ferric sulfate at pH 5.6 and filtration with dual anthracite and sand media. Due to instrument limitations, TOC was not measured for raw water samples. Since TOC is the sum of DOC and particulate organic carbon (POC), the true value of raw water TOC is expected to be higher than the reported DOC values; therefore, TOC reductions from raw water to post-filtered effluent are conservative. Treatment efficacy varied between the summer and fall seasons. Overall, treated summer samples achieved EPA drinking water standards for turbidity, TOC, and E. coli, while treated fall samples failed to achieve the required standards. A two-way ANOVA tested the significance of treatment efficacy across spatial and temporal variation. Spatial variation across the three watersheds (Maidford River, Cork Brook, and Bailey Brook) was not significant (p > 0.05), while temporal variation (summer and fall) was significantly related to treatment efficacy (p < 0.05). As a result, treatment data for all three sites were grouped together and categorized by season, which was found to be the significant variable.
For summer samples, influent TOC at all three sites ranged from 19.99 mg/l to 34.23 mg/l (Fig. 4). After conventional treatment (flocculation plus sand/anthracite filtration), TOC concentrations were reduced below the EPA limit of 2 mg/l required before disinfection (USEPA, 2017) for all three sites. The addition of UV treatment had no effect on either TOC reduction or turbidity removal. Influent turbidity for the three sites ranged from 2.02 NTUs to 4.17 NTUs, and final turbidities were below the EPA limit of 1 NTU (USEPA, 2017). Though large amounts of bacteria were removed, a full log reduction was not achieved for any of the summer or fall samples through the coagulation/flocculation and filtration process alone. Disinfection with UV or chlorine, or a combination of the two, was required to achieve the required log inactivation (Table 7).

Treatment: Bacteria Disinfection
During the summer sampling, UVC irradiation, even without the addition of 0.5 mg/l of chlorine, was sufficient to inactivate E. coli in post-filtration samples to the required EPA standard of <1 MPN/100 ml. A one-log reduction was achieved for Cork Brook and Maidford River samples, and a two-log reduction was achieved for Bailey Brook with UV-C LED treatment. The use of chlorine as a primary disinfectant was also sufficient for achieving the required reductions. During the fall, only the CPD and UV-CPD treatment trains, which used 2 mg/l of chlorine, achieved the required disinfection standards. UVPD, with a chlorine dose of 0.5 mg/l, achieved the <1 MPN/100 ml limit only for the Cork Brook sample, in which the influent bacteria concentration was an order of magnitude lower.
For fall E. coli results, UVC irradiation of post-filtration samples removed between 48% (Maidford River) and 77% (Bailey Brook) of bacteria. Unlike the summer samples, no log reductions were achieved, and residual E. coli levels remained far above required EPA standards, e.g. up to 487.2 MPN/100 ml for Maidford River (Table 7). Average HAA concentrations during the summer did not exceed the EPA MCL of 60 µg/l for any treatment train at any of the three sites: UVPD formed 13.6 µg/l, CPD formed 24.2 µg/l, and UV-CPD formed 21.7 µg/l. HAA formation during the fall was much higher for the CPD and UV-CPD treatment trains, with average HAA concentrations of 61.2 µg/l (a 153% increase) and 62.4 µg/l (a 187% increase), respectively. Fall UVPD HAA formation was similar to summer UVPD formation, increasing by only 1.7 µg/l to 15.3 µg/l.
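The seasonal increases quoted above can be reproduced with simple percent-change arithmetic. The sketch below uses the summer and fall HAA averages from the text and recovers the ~153% and ~187% figures to within rounding.

```python
def percent_increase(before: float, after: float) -> float:
    """Percent change from a baseline (summer) to a later (fall) average."""
    return 100.0 * (after - before) / before

# Summer -> fall average HAA concentrations (ug/l) quoted in the text.
cpd = percent_increase(24.2, 61.2)     # ~152.9% (reported as 153%)
uv_cpd = percent_increase(21.7, 62.4)  # ~187.6% (reported as 187%)
print(f"CPD: {cpd:.1f}%, UV-CPD: {uv_cpd:.1f}%")
```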

Average DBP Formation
There was a statistically significant relationship between average HAA formation and season (p < 0.05), but not site. This can be seen in Figure 1, which displays HAA concentrations with respect to SUVA254. The major differences in HAA formation are seasonal, represented by the two clusters of SUVA254 values (i.e. low SUVA254 values for summer and high SUVA254 values for fall). This is also reflected in the percent increases in HAA across the treatment trains, where a nearly three-fold increase from summer to fall was observed for the CPD and UV-CPD treatment trains. As a result, samples either reached or exceeded the HAA MCL set by the EPA at 60 µg/l. There was no statistically significant relationship between average THM formation and season or site (p > 0.05). This can be seen in Figure 2, which displays THM concentrations with respect to SUVA254. Similar to the HAA results, THM formation formed seasonal clusters characterized by SUVA254. However, unlike HAAs, summer THMs exceeded fall samples by no more than 10 µg/l and were far below the 80 µg/l EPA MCL.
Total HAAs and THMs were further categorized into brominated and chlorinated compounds and, in the case of HAAs, into di-haloacetic acids (DHAA) and tri-haloacetic acids (THAA). The distribution changed significantly during the fall, when non-brominated compounds, such as chloroform, were the most prevalent. Since samples were collected during a period of major leaf-off following the late October peak fall foliage in Rhode Island, plant detritus accumulated on the topsoil surface surrounding the rivers at the three sites (Zielinski et al., 2005). It is likely that overland flow at the time of sampling mobilized this accumulated detritus into the streams, leading to inputs of plant material. As a result, many of the compounds that constitute plant matter, including lignin, a complex, hydrophobic polymer and the second most abundant compound in terrestrial plants after cellulose, may have entered the waterways of the three sampling sites, affecting NOM composition. This is supported by the high raw water SUVA values, which indicate the presence of hydrophobic natural organic matter rich in the aromatic content common in lignin (Weishaar et al., 2003). These observations are consistent with previous studies on terrestrial DOC inputs during late fall, which show that runoff mobilizes aromatic, lignin-rich plant detritus that accumulated on the surface after leaf-off, contributing high terrestrial DOC inputs to rivers (Stepczuck et al., 1998). A study on the McKenzie River in Oregon, which has seasonal trends very similar to New England's, found that major fall storm events elevate raw water TOC and contribute significant DOM with high DBP precursor content to rivers. Therefore, while summer and fall DOC concentrations were similar, the significant difference in aromatic content (p < 0.05) may have been due to the input of terrestrial plant detritus from the fall leaf-off.
In contrast, the low SUVA254 values during the summer were consistent with the low aromatic content of soils common in the glacial till prevalent in the post-glacial landscape of Rhode Island. This suggests that plant detritus was not present in significant quantities on the surface and therefore did not contribute major DOC inputs as it did in the fall.
Furthermore, there was a significant relationship (p < 0.05) between season and the proxies used for NOM (i.e. DOC, SUVA254), while land use differences by watershed were not significant (p > 0.05) in determining DOC concentrations during either season. This is consistent with previous studies showing that overland flow during storms mobilizes soil surface layers, regardless of the type of watershed, and increases the input of terrestrial, lignin-rich aromatic substances into streams (Vidon et al., 2008; McKnight et al., 2001). It is likely that the observed seasonal changes in TOC composition were a contributing factor in the performance of the water treatment trains. Lignin, which was likely present in the water during the fall, is known to inhibit the treatment of even advanced industrial wastewater effluent from pulp mills and other agricultural sources. Inorganic salts, such as ferric sulfate, that are commonly used in drinking water treatment are particularly ineffective in removing turbidity, color, and other aromatic content, leaving effluent water lignin-rich with its characteristic brown color. Though analyzing the exact composition of the NOM and identifying the presence or absence of compounds such as lignin was beyond the scope of the study, ANOVA showed a significant relationship (p < 0.05) between treatment efficiency (i.e. TOC removal) and season. This is a likely explanation for the differing treatment efficiencies between summer and fall.
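SUVA254, the NOM aromaticity proxy used throughout this discussion, is conventionally computed as UV absorbance at 254 nm divided by DOC, normalized to a 1 m optical path (Weishaar et al., 2003). A minimal sketch with illustrative input values:

```python
def suva254(uv254_per_cm: float, doc_mg_per_l: float) -> float:
    """SUVA254 in L/(mg*m): UV absorbance at 254 nm (1/cm) divided by
    DOC (mg/l), scaled by 100 to convert the path length from cm to m."""
    return (uv254_per_cm / doc_mg_per_l) * 100.0

# Illustrative values only: hydrophobic, aromatic-rich NOM typically has
# SUVA254 > 4, while hydrophilic NOM is typically < 2.
print(suva254(0.90, 20.0))  # high-aromaticity, fall-like sample -> 4.5
print(suva254(0.30, 20.0))  # low-aromaticity, summer-like sample -> 1.5
```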
Samples collected during the summer were more efficiently processed with conventional treatment. While DOC levels were similar to fall samples, turbidity, UV254, and SUVA254 were much lower. As a result, key water treatment parameters were at or below EPA guidelines for the summer collection after treatment. Although the Cork Brook forested watershed had anomalously high DOC at 35 mg/l, double the 17 mg/l concentration detected during the fall, final TOC was reduced to below the 2 mg/l upper limit set by the EPA. Bacteria inactivation was also impacted by season, most likely through the turbidity and TOC removal efficiencies, which also varied by season. Suspended particles in unfiltered or poorly treated water, including dissolved organics and inorganics, affect UV disinfection in two ways: (i) physically shielding bacteria from UV light, and (ii) scattering, blocking, or absorbing UV light, interfering with its ability to be absorbed by microorganisms to achieve inactivation (Sommer et al., 1988; Christensen et al., 2003). For instance, humic acids tend to coat bacteria, reducing the sensitivity of cells to UV light (Vilhunen, 2009). As a result, the EPA Enhanced Surface Water Treatment Rule specifies that filter effluent from conventional treatment must not exceed 1 NTU (USEPA, 2006) prior to UV disinfection.

Bacterial Inactivation
That is, if turbidity is below the limit, interference with UV disinfection is minimal (Christensen et al., 2003). This helps explain why summer samples, which had turbidities below 0.3 NTUs pre-disinfection, were effectively irradiated, achieving the required bacterial count of less than 1 MPN/100 ml. The fall samples stand in contrast: the presence of particles and suspended solids in the fall treatment, reflected in the high turbidity and TOC concentrations, very likely inhibited UV disinfection.
The presence of suspended solids also explains why the delivered UV dose differed between seasons, even though power density and UV irradiance were identical in the summer and fall experiments. UV dose, as defined by Equation 2 (Appendix E), is a function of UV irradiance and exposure time:
UV dose (mW·s/cm² or mJ/cm²) = UV irradiance (mW/cm²) × time (s)
The presence of suspended solids likely increased the time it took for UV irradiance to reach the bacteria. Since both the power density and the residence time of water samples flowing through the UV-C LED irradiation chamber (8 minutes) were identical in the summer and fall, this resulted in a smaller effective dose for the fall samples relative to the summer samples. This is further supported by published UV dose extrapolations, which correlate UV dose with log inactivation (Equation 1) by quantifying the minimum dose required to achieve a target log inactivation under standard irradiation conditions. A UV dose of at least 4.4 mJ/cm², the minimum required for a one-log inactivation, was achieved for the summer Cork Brook and Maidford River samples. For the Bailey Brook summer samples, a dose of at least 6.2 mJ/cm², the minimum required for a three-log inactivation, was achieved. However, it is likely that the true dose was much closer to 60 mJ/cm², which is the UV-C LED unit's irradiation performance under low-turbidity conditions.
Only partial log inactivation was achieved for the fall samples with UV irradiation alone; the dose was therefore below the 4.4 mJ/cm² minimum required for a one-log inactivation.
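The dose arithmetic of Equation 2 can be sketched directly. The irradiance values below are hypothetical effective irradiances after attenuation by turbidity and NOM, chosen only to illustrate how identical residence times can yield doses on opposite sides of the 4.4 mJ/cm² one-log threshold; the 8-minute residence time matches the study's irradiation chamber, and the clear-water case recovers the ~60 mJ/cm² low-turbidity performance figure.

```python
def uv_dose_mj_per_cm2(irradiance_mw_per_cm2: float, exposure_s: float) -> float:
    """UV dose (mJ/cm^2) = UV irradiance (mW/cm^2) x exposure time (s)."""
    return irradiance_mw_per_cm2 * exposure_s

residence_s = 8 * 60  # 8-minute chamber residence time, per the study

# Hypothetical effective irradiances (mW/cm^2) after attenuation.
for label, irradiance in [("clear (summer-like)", 0.125),
                          ("turbid (fall-like)", 0.008)]:
    dose = uv_dose_mj_per_cm2(irradiance, residence_s)
    met = dose >= 4.4  # minimum dose for a one-log E. coli inactivation
    print(f"{label}: {dose:.1f} mJ/cm^2, one-log threshold met: {met}")
```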
The fall results are consistent with a study by Nelson et al. (2013), which found that only minimal reductions in bacteria concentration were achieved in wastewater treatment effluent with elevated turbidity (>20 NTUs), even after irradiation for 20 and 40 minutes.
The effect of turbidity and TOC on UVC irradiation can also be illustrated by comparing disinfection efficiencies by site. Bailey Brook, which had lower turbidity and TOC prior to disinfection in both seasons than Cork Brook and Maidford River, achieved a two-log reduction in the summer sample, compared to the one-log reduction for Cork Brook and Maidford River. Although no full log reduction was achieved for any site during the fall, treated Bailey Brook samples achieved a 77% E. coli reduction, compared to 69% for Cork Brook and 48% for Maidford River.
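The log-reduction and percent-removal metrics used in this comparison are related as follows. The MPN values in the sketch are illustrative, chosen to show that a 77% removal (the fall Bailey Brook figure) corresponds to well under a one-log (90%) reduction.

```python
import math

def log_reduction(influent_mpn: float, effluent_mpn: float) -> float:
    """Log10 reduction: log10(N0 / N)."""
    return math.log10(influent_mpn / effluent_mpn)

def percent_removal(influent_mpn: float, effluent_mpn: float) -> float:
    """Percent of organisms removed between influent and effluent."""
    return 100.0 * (1.0 - effluent_mpn / influent_mpn)

# Illustrative counts: 1000 MPN/100 ml influent, 230 MPN/100 ml effluent.
print(round(log_reduction(1000, 230), 2))    # ~0.64 log
print(round(percent_removal(1000, 230), 1))  # 77.0 %
```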
The results herein show that although significant improvements have been made in UV technologies, UV-C LEDs and UV lamps are largely ineffective in highly turbid waters, hence the need for a pretreatment filter. Inactivation depends on proper removal of TOC and other materials that scatter UV light. Granular activated carbon (GAC) filters are known to be effective in reducing turbidity and total organic carbon. Therefore, if small water systems violate turbidity guidelines during an emergency, GAC filtration may reduce turbidity enough for effective point-of-use (POU) UV disinfection.

DBP Formation
DBP formation, including both total concentrations and the relative distribution of individual compounds, was also significantly related to season (p < 0.05). As with the other treatment results (TOC, turbidity, and bacteria), it is likely that seasonal changes in NOM composition affected DBP formation. However, the pathways for THM and HAA formation are very complex and cannot be explained by NOM composition alone.
They depend on a wide range of factors, including chlorine dose, pH, temperature, reaction time with chlorine, and the presence of inorganic compounds such as bromide (Hong et al., 2007; Mukandan, 2014). The results of this study reflect these complexities, since a combination of these factors affected the presence and quantity of different THM and HAA compounds in the summer and fall. ANOVA showed that treatment method was significantly related to total DBPs (p < 0.05): lower DBP formation was strongly associated with lower chlorine dose, while higher chlorine doses resulted in higher DBPs. Since treatment was a function of chlorine disinfection and UV irradiation, further ANOVA analyses were performed on these methods separately with respect to DBP formation. Chlorine dose was significant (p < 0.05), while UV had no significant relationship with DBP formation (p > 0.05). This suggests that chlorine disinfection alone, not combined chlorination and UV treatment, was primarily responsible for DBP outputs.
The relationship between spatial and temporal variation and DBP formation was more complex. Total HAA and THM formation was not significantly related to land use and land cover differences. However, when taking into account the relative distribution of the compounds that constitute total HAAs and total THMs, spatial variation becomes significant (p < 0.05), likely due in part to site-specific differences in bromide availability. The spatial variation of brominated species may also have been related to treatment: the formation of brominated DBPs was affected by the type of disinfection method and the chlorine concentration. Low chlorine doses (0.5 mg/l to 1 mg/l) are known to be optimal for bromide oxidation and brominated DBP formation. However, as chlorine concentration increases, more chlorine is available for chlorinated DBP formation, decreasing brominated DBP production. This expected outcome was also observed in this study: the highest level of brominated DBPs was detected in the UVPD treatment train, which used the lowest chlorine concentration of 0.5 mg/l. In contrast, non-brominated and chloro-brominated DBPs were dominant during CPD and UV-CPD treatment, which used a higher chlorine concentration of 2 mg/l. These results are important, since brominated DBPs are more toxic than non-brominated species.
NOM composition by season may also explain the significant (p < 0.05) relationship between season and total THMs and HAAs, as well as the relative distribution of the individual compounds that constitute them. Empirical data have shown that hydrophobic NOM, characterized by higher SUVA254 values, is more important for HAA formation, while hydrophilic NOM is more important for THMs and DHAAs (Ozdemir, 2014). This is consistent with the data in this study. Chlorination of the treated fall samples, which contained hydrophobic, high-SUVA254 NOM, resulted in higher HAAs overall. In the summer, by contrast, which was characterized by hydrophilic, low-SUVA254 NOM, more THMs were formed than in the fall.
In addition to the seasonal and spatial effects on DBP formation, the experimental parameters of this study, especially pH and chlorine dose, had an effect on THM and HAA formation pathways. Previous work has found that low pH favors THAAs but suppresses THMs, while high pH favors THMs and DHAAs. The relatively low pH of 6 used in this experiment also explains the low concentrations of total THMs. While it is true that the hydrophobic NOM observed in the fall favors certain groups of HAAs, it is likely that the low pH also contributed to the low THM formation, even though more precursors were available in the fall in the form of high TOC. Two-way ANOVA analyses confirmed this, as there was no significant relationship between THM formation and site or season (p > 0.05). Therefore, it is very likely that changing a parameter such as pH would have major effects on the results. For example, if the pH were increased to 10, THM formation would increase while overall HAA formation would decrease.
Chlorine dose may also have affected DBP formation pathways, especially for the fall samples. Overall, DHAAs were dominant, averaging 55-67% of total HAAs across the three sites, while THAAs accounted for only 20-30%. THMs decreased from summer to fall. This is at odds with a study by Hua et al. (2015), which showed that the combination of low pH and hydrophobic matter in the fall should have resulted in THAA > DHAA > THM. However, the pathways described by Hua et al. (2015) and other studies were determined through DBP formation potential tests, which used a high chlorine dose (20 mg/l), in excess of a sample's chlorine demand, to determine maximum DBP formation potential. In conventional water treatment systems, chlorine doses below 10 mg/l are more typical.
Moreover, lower chlorine concentrations (< 3 mg/l) favor DHAA formation over THAA formation, since the NOM precursors that form DHAAs are more reactive at low concentrations, whereas more free chlorine is needed for the oxidation or substitution reactions through which precursors form THAAs.
Chlorine concentrations for this study did not exceed 2 mg/l, explaining why DHAAs were the dominant HAA species under conditions that would otherwise have favored THAAs.
Therefore, these results illustrate the complexities of disinfection and DBP formation. In the event that small water systems are unable to effectively treat water during emergencies, point-of-use chlorination may be a challenge for the typical end-user. DBP formation pathways are very complex, especially in the presence of abundant precursors, as was seen in the fall scenario. Changing the pH or chlorine concentration may not have the desired effect in these cases, since it may suppress one type of DBP while increasing the formation of others. Moreover, if too little chlorine is used in the presence of so many precursors, those precursors may compete for the chlorine demand, leaving less chlorine available to inactivate bacteria.

Conclusion
The results of this study suggest that there are advantages to using LED-based UV for point-of-use disinfection. The efficiency of such systems depends on the efficacy of water treatment by the distributor. The summer experiment showed that when water quality parameters, such as turbidity and TOC, are within EPA limits, UV-C LED treatment in combination with a low dose of chlorine is sufficient for bacterial inactivation for up to 24 hours, while producing fewer DBPs than chlorine used as a primary disinfectant.
However, there are also major disadvantages to relying on UV-C LEDs alone, since UV transmissivity and inactivation efficacy depend on effective turbidity and TOC removal. The results of this study show that conventional NOM treatment is vulnerable to failure, especially during seasonal changes in water chemistry. During the fall experiment, the conventional treatment process failed to remove TOC efficiently for all three watersheds, irrespective of land use and land cover (LULC) and best management practices such as forested buffer zones. This led to adverse consequences for the disinfection trains tested in this study.
UV disinfection was inhibited due to reduced UV transmittance from the turbid waters.
On the other hand, when chlorine was used as a primary disinfectant, HAA formation reached or exceeded the EPA MCL of 60 µg/l due to the availability of more TOC to react with chlorine. While the higher chlorine concentration in the conventional treatment train managed to inactivate all bacteria, it did so at the cost of exceeding the EPA MCL for HAAs. Moreover, despite chlorine's effective disinfection, the fall samples failed to meet other drinking water criteria, such as the thresholds for turbidity and TOC. Consequently, highly turbid water, which would prevent effective UV disinfection, is unlikely to be suitable for consumption.
Moreover, given the complexities of DBP formation, which range from NOM composition to coastal proximity, many end-users will likely face difficulty chlorinating their water effectively in an emergency. Though the results of this study are specific to New England environments, similar challenges may occur in other geographic regions if small water systems are unable to effectively treat their water during emergencies.
In conclusion, UV is recommended as a primary disinfectant with a smaller concentration of chlorine as a secondary disinfectant. Doing so will reduce the amount of chlorine in contact with potentially harmful precursors. However, caution must be exercised, and the use of filters prior to disinfection, such as GAC, is highly recommended. Future studies will assess the practicality of using UV with other filters under highly turbid conditions.
Appendix A: List of Abbreviations