
Publications


Featured research published by J. A. Hanly.


New Zealand Journal of Agricultural Research | 2004

A review of literature on the land treatment of farm‐dairy effluent in New Zealand and its impact on water quality

D. J. Houlbrooke; D. J. Horne; M. J. Hedley; J. A. Hanly; V. O. Snow

Abstract Dairy farming is the largest agricultural industry in New Zealand, contributing 20% of export earnings but presenting a challenge for the environmentally acceptable treatment of dairy-farm wastes. Nutrient‐rich farm‐dairy effluent (FDE), which consists of cattle excreta diluted with wash‐down water, is a by‐product of dairy cattle spending time in yards, feed‐pads, and the farm dairy. Traditionally, FDE has been treated in standard two‐pond systems and then discharged into a receiving freshwater stream. Changes brought about primarily by the Resource Management Act 1991 have meant that most regional councils now prefer dairy farms to land-treat their FDE. This allows the water and nutrients applied to land in FDE to be utilised by the soil‐plant system. Research on the effects of land‐treating FDE on water quality has shown that between 2% and 20% of the nitrogen (N) and phosphorus (P) applied in FDE is leached through the soil profile. In all studies, the measured concentrations of N and P in drainage water were higher than the ecological limits considered likely to stimulate unwanted aquatic weed growth. Gaps in the current research have been identified with respect to the application of FDE to artificially drained soils, and the lack of long‐term studies of FDE application to land at appropriate farm scale and realistic application rates. Whilst land treatment of FDE greatly reduces the nutrient load discharged to fresh water compared with standard two‐pond systems, there is room for improvement in the management of FDE land‐treatment systems. In particular, it is necessary to prevent the direct discharge of partially treated FDE by taking into account soil physical properties and soil moisture status. Scheduling effluent irrigations based on soil moisture deficits results in a considerable decrease in nutrient loss and may result in zero loss of raw or partially treated effluent through direct drainage.
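The deferred‐irrigation rule described above reduces to a simple water‐balance check: apply effluent only when the soil moisture deficit can absorb the full application depth. The sketch below illustrates that rule; the function names and the field‐capacity figures are assumptions for illustration, not values from the review.

    # Minimal sketch of deferred irrigation scheduling for farm-dairy effluent (FDE),
    # assuming irrigation is permitted only when the soil moisture deficit exceeds
    # the intended application depth, so no direct drainage is generated.

    def soil_moisture_deficit(field_capacity_mm: float, current_storage_mm: float) -> float:
        """Deficit = water (mm) the root zone can still absorb before drainage starts."""
        return max(field_capacity_mm - current_storage_mm, 0.0)

    def can_irrigate_fde(deficit_mm: float, application_depth_mm: float) -> bool:
        """Defer irrigation unless the deficit can accommodate the full application."""
        return deficit_mm >= application_depth_mm

    # Hypothetical example: a 25 mm application is deferred while the deficit is 18 mm.
    deficit = soil_moisture_deficit(field_capacity_mm=150.0, current_storage_mm=132.0)
    print(deficit, can_irrigate_fde(deficit, application_depth_mm=25.0))  # 18.0 False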


New Zealand Journal of Agricultural Research | 2007

Best management practices to mitigate faecal contamination by livestock of New Zealand waters

Rob Collins; Malcolm McLeod; Mike Hedley; A. Donnison; Murray Close; J. A. Hanly; D. J. Horne; C. Ross; Robert J. Davies-Colley; Caroline S. Bagshaw; Lindsay R. Matthews

Abstract This paper summarises findings from the Pathogen Transmission Routes Research Programme, describing pathogen pathways from farm animals to water bodies and measures that can reduce or prevent this transfer. In New Zealand, significant faecal contamination arises through the direct deposition of faeces into waterways by grazing animals. Bridging streams intersected by farm raceways is an appropriate mitigation measure to prevent direct deposition during herd crossings, whilst fencing stream banks will prevent access into waterways from pasture by cattle, which are characteristically attracted to water. Riparian buffer strips not only prevent cattle access to waterways, they also entrap microbes from cattle and other animals that would otherwise be washed down‐slope towards the stream in surface runoff. Microbial water quality improvements can be realised by fencing stock out of ephemeral streams, wetlands, seeps, and riparian paddocks that are prone to saturation. Soil type is a key factor in the transfer of faecal microbes to waterways. The avoidance of, or a reduction in, grazing and irrigation on poorly drained soils characterised by high bypass flow and/or the generation of surface runoff is expected to improve microbial water quality. Dairyshed wastewater should be irrigated onto land only when the water storage capacity of the soil will not be exceeded. This “deferred irrigation” can markedly reduce pollutant transfer to waterways, particularly via subsurface drains and groundwater. Advanced pond systems provide excellent effluent quality and have particular application where soil type and/or climate are unfavourable for irrigation. Research needs for further reducing faecal contamination of waters by livestock are identified.


Soil Research | 2010

Producing biochars with enhanced surface activity through alkaline pretreatment of feedstocks

K. Hina; P. Bishop; M. Camps-Arbestain; R. Calvelo-Pereira; J.A. Maciá-Agulló; J.P. Hindmarsh; J. A. Hanly; F. Macías; M. J. Hedley

Surface-activated biochars not only represent a useful carbon sink, but can also act as useful filtering materials to extract plant nutrients (e.g. NH4+) from wastes (e.g. animal or municipal waste streams) and can thereafter be added to soils. Biochars produced by low-temperature pyrolysis of fibrous debarking waste from pine (PI) and eucalyptus (EU) were pretreated with either diluted (L) or undiluted (S) alkaline tannery waste (L-PI, S-PI, L-EU, S-EU). Biochars produced from untreated feedstock were used as controls. Samples were characterised by FT-IR, solid-state CP MAS 13C NMR, XPS, SEM microphotographs, and BET specific surface area. Elemental composition, carbon recovery, yield, surface charge, and NH4+ sorption/desorption properties were also studied. Carbon recovery was lower in biochars prepared from L-EU and S-EU (43% and 42%, respectively) than in control EU (45%), but these biochars showed greater changes in their chemical characteristics than those made from L-PI and S-PI, which showed minimal decrease in recovered carbon. The specific surface area of the biochars decreased with the treatments, although acidic surface groups increased. In subsequent sorption experiments, treated biochars retained more NH4+ from a 40 mg N/L waste stream (e.g. 61% retention in control EU v. 83% in S-EU). Desorption was low, especially in treated biochars relative to untreated biochars (0.1–2% v. 14–27%). The results suggest that surface-activated biochars can be obtained with negligible impairment to the carbon recovered.
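The retention and desorption percentages above follow from a simple mass balance over the sorption and desorption runs. A minimal sketch of that bookkeeping, with invented masses (only the 40 mg N/L influent and the quoted percentages come from the abstract):

    # Mass-balance sketch for NH4+-N sorption/desorption on biochar.
    # All masses below are hypothetical; only the formulas reflect the quantities
    # reported in the abstract (percent retained, percent desorbed).

    def percent_retained(n_applied_mg: float, n_in_effluent_mg: float) -> float:
        """NH4+-N retained by the biochar as a share of N passed through the column."""
        return 100.0 * (n_applied_mg - n_in_effluent_mg) / n_applied_mg

    def percent_desorbed(n_retained_mg: float, n_released_mg: float) -> float:
        """Share of the retained NH4+-N subsequently released during desorption."""
        return 100.0 * n_released_mg / n_retained_mg

    # Hypothetical run: 100 mg N applied from a 40 mg N/L stream, 17 mg leaves in effluent.
    retained = percent_retained(100.0, 17.0)  # 83% retention, as reported for S-EU
    print(retained, round(percent_desorbed(83.0, 1.0), 1))  # ~1.2% desorbed (0.1-2% range)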


Soil Research | 2008

Land application of farm dairy effluent to a mole and pipe drained soil: implications for nutrient enrichment of winter-spring drainage

D. J. Houlbrooke; David Horne; M. J. Hedley; V. O. Snow; J. A. Hanly

Spray irrigation of farm dairy effluent (FDE) to artificially drained land in accordance with deferred irrigation criteria causes minimal direct drainage of partially treated FDE at the time of irrigation. The influence of deferred irrigation of FDE on the subsequent nutrient enrichment of winter–spring drainage from mole and pipe systems is unknown. Research was conducted in the Manawatu region, New Zealand, to investigate the influence of deferred irrigation of FDE on the quality of water in artificial drainage. The experimental site was established on a Pallic soil (Tokomaru silt loam) at the No. 4 dairy farm at Massey University, Palmerston North. There were 6 plots (each 40 m by 40 m), each with an isolated mole and pipe drainage network. Four of the plots received fertiliser according to the farm’s fertiliser program (non-effluent plots), while the other 2 plots received applications of FDE according to the deferred irrigation scheduling criteria (effluent plots). All of the plots were subject to the farm’s standard grazing management. The average concentrations of N and P in the 2003 winter drainage (average 236 mm) from both the non-effluent and FDE-irrigated plots were well above the threshold concentrations that stimulate aquatic weed growth in fresh water bodies. Annual nutrient losses of 31.4 kg N/ha/year and 0.65 kg P/ha/year in drainage were recorded for non-effluent plots. Deferred irrigation of FDE in the summer period did not increase the loss of N in winter–spring drainage (N loss from effluent plots was 31.1 kg N/ha/year) but did cause a significant increase (P < 0.001) in total P in drainage (an additional 1.03 kg P/ha, c. 160% of losses from non-effluent plots, a loss of 3.3% of applied P). Furthermore, an irrigation of FDE to near-saturated soil in mid September resulted in the direct drainage of partially treated effluent, and hence N and P concentrations in drainage were 6–10-fold greater than those that would normally be expected from drainage events induced by winter–spring rainfall. This illustrates the importance of scheduling FDE irrigation in accordance with deferred irrigation principles.
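The annual loads quoted above combine drainage volume with drainage-water nutrient concentrations. As a point of reference, a flow-weighted load in kg/ha follows from mean concentration (mg/L) and drainage depth (mm), since 1 mm of drainage over 1 ha is 10,000 L. The concentration in the sketch below is invented purely to illustrate the arithmetic:

    # Sketch: nutrient load (kg/ha) from drainage depth (mm) and mean concentration (mg/L).
    # 1 mm of drainage over 1 ha = 10,000 L, and 1 kg = 1e6 mg, so the factor is 0.01.

    def drainage_load_kg_per_ha(conc_mg_per_l: float, drainage_mm: float) -> float:
        """Flow-weighted nutrient load exported in drainage water."""
        return conc_mg_per_l * drainage_mm * 0.01

    # Hypothetical: 236 mm of winter drainage at a mean 13.3 mg N/L gives ~31.4 kg N/ha,
    # the same scale as the annual N losses reported above.
    print(round(drainage_load_kg_per_ha(13.3, 236.0), 1))  # 31.4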


New Zealand Journal of Agricultural Research | 2014

Field studies assessing the effect of dicyandiamide (DCD) on N transformations, pasture yields, N2O emissions and N-leaching in the Manawatu region

D-G Kim; Donna Giltrap; S. Saggar; J. A. Hanly

Nitrification inhibitors (NI) allow retention of soil nitrogen (N) in the ammonium (NH4+) form for longer periods. Therefore, they can potentially increase pasture yields by decreasing N losses via nitrous oxide (N2O) emissions and nitrate (NO3−) leaching. Multiple field experiments were conducted over 3 years at a Massey University dairy farm in the Manawatu region to determine the effect of the NI dicyandiamide (DCD) on soil N transformations, N2O emissions, pasture yields and NO3− leaching. Over the study period, DCD applied in autumn and winter had a half-life of 12–17 days and persisted in the soils between 42 and 84 days. Application of DCD inhibited the nitrification process, resulting in lower N2O emissions (54%–78% from urine patches). N2O emissions were further reduced using two applications of DCD, but more than two applications had no additional effect. Although the influence of DCD on pasture accumulation or NO3− leaching was not consistent, three applications of DCD increased pasture accumulation by 9% and reduced NO3− leaching by 22% in one of 2 years of the grazed drainage trial. However, the latter was largely influenced by lower drainage water volumes, rather than lower NO3− concentrations.
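Assuming first-order decay (a common simplification; the abstract reports half-lives but not the fitted model), the half-life figures above translate into persistence estimates as sketched below:

    # Sketch: fraction of applied DCD remaining after t days, assuming first-order decay.
    # Half-lives of 12-17 days are taken from the abstract; the decay model is an assumption.

    def dcd_remaining_fraction(t_days: float, half_life_days: float) -> float:
        """Fraction remaining under first-order kinetics: 0.5 ** (t / t_half)."""
        return 0.5 ** (t_days / half_life_days)

    # After 42 days, ~9% remains with a 12-day half-life and ~18% with a 17-day half-life,
    # consistent with the reported persistence of 42-84 days.
    for t_half in (12.0, 17.0):
        print(t_half, round(dcd_remaining_fraction(42.0, t_half), 2))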


Soil Research | 2008

Evaluation of tephra for removing phosphorus from dairy farm drainage waters

J. A. Hanly; M. J. Hedley; David Horne

Research was conducted in the Manawatu region, New Zealand, to investigate the ability of Papakai tephra to remove phosphorus (P) from dairy farm mole and pipe drainage waters. The capacity of this tephra to adsorb P was quantified in the laboratory using a series of column experiments and was further evaluated in a field study. In a column experiment, the P adsorption capabilities of 2 particle size fractions (0.25–1, 1–2 mm) of Papakai tephra were compared with that of an Allophanic Soil (Patua soil) known to have high P adsorption properties. The experiment used a synthetic P influent solution (12 mg P/L) and a solution residence time in the columns of c. 35 min. By the end of the experiment, the 0.25–1 mm tephra removed an estimated 2.6 mg P/g tephra at an average P removal efficiency of 86%. The 1–2 mm tephra removed 1.6 mg P/g tephra at an average removal efficiency of 58%. In comparison, the Patua soil removed 3.1 mg P/g soil at a P removal efficiency of 86%. Although the Patua soil was sieved to 1–2 mm, this size range consisted of aggregates of finer particles, which is likely to have contributed to this material having a higher P adsorbing capacity. A field study was established on a Pallic Soil, under grazed dairy pastures, to compare drainage water P concentrations from standard mole and pipe drainage systems (control) and drainage systems incorporating Papakai tephra. The 2 tephra treatments involved filling mole channels with 1–4 mm tephra (Mole-fill treatment) or filling the trench above intercepting drainage pipes with ‘as received’ tephra (Back-fill treatment). Over an entire winter drainage season, the quantity of total P (TP) lost from the control treatment drainage system was 0.30 kg P/ha. The average TP losses for the Mole-fill and the Back-fill treatments were 45% and 47% lower than the control treatment, respectively.
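The per-gram capacities and removal efficiencies above come from a running mass balance over column influent and effluent. A minimal sketch, using the 12 mg P/L influent from the experiment but invented effluent concentrations, volumes, and media mass:

    # Sketch: cumulative P removal by a filter column from paired influent/effluent samples.
    # Influent is 12 mg P/L as in the experiment; effluent values and masses are invented.

    def column_p_removal(influent_mg_l, samples, media_g):
        """samples: list of (volume_L, effluent_mg_P_per_L). Returns (mg P/g, % efficiency)."""
        applied = sum(v * influent_mg_l for v, _ in samples)
        removed = sum(v * (influent_mg_l - c) for v, c in samples)
        return removed / media_g, 100.0 * removed / applied

    capacity, efficiency = column_p_removal(
        12.0, samples=[(5.0, 1.0), (5.0, 2.3)], media_g=25.0)
    print(round(capacity, 2), round(efficiency, 1))  # mg P/g removed and % efficiency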


New Zealand Journal of Crop and Horticultural Science | 2004

Green‐manure impacts on nitrogen availability to organic sweetcorn (Zea mays)

J. A. Hanly; P. E. H. Gregg

Abstract Two field experiments were conducted in the Gisborne region of New Zealand to assess the effectiveness of four winter green‐manure crops (lupin, mustard/lupin mix, mustard, and annual ryegrass) for improving the short‐term nitrogen (N) availability of soils growing organic sweetcorn (Zea mays). Average soil mineral N (0–15 cm) in the control treatment (bare soil winter fallow) plots, measured at the time of sweetcorn emergence (late November 1997), was 50.4 kg N ha–1 at Site A and 81.3 kg N ha–1 at Site B. Compared with the control (bare soil fallow) treatment, soil incorporation of the lupin and mustard/lupin mix treatments significantly increased soil mineral N by 30–45% at both trial sites at sweetcorn emergence. In contrast, the ryegrass treatment reduced soil mineral N levels by 33–43% at both sites. These treatment effects were related to green‐manure crop N concentrations just before soil incorporation. Sweetcorn N accumulation at final harvest was also significantly increased by soil incorporation of the lupin and mustard/lupin mix treatments and significantly reduced by the ryegrass treatment. Average sweetcorn ear yields in the control treatment plots at maturity were 16 t ha–1 at Site A and 18 t ha–1 at Site B. However, the ryegrass treatment significantly reduced sweetcorn ear yields by 64% at Site A and 48% at Site B, most likely because of the lower soil mineral N levels that followed soil incorporation of the ryegrass. Although the lupin and mustard/lupin mix treatments increased soil N availability and uptake by sweetcorn, these treatments did not significantly improve sweetcorn ear yield. The lack of yield response was attributed to soil moisture limitations, which occurred in the latter part of the season and were likely to have restricted yield potential.


New Zealand Journal of Agricultural Research | 2004

The performance of travelling effluent irrigators: Assessment, modification, and implications for nutrient loss in drainage water

D. J. Houlbrooke; D. J. Horne; M. J. Hedley; J. A. Hanly

Abstract Land application of farm‐dairy effluent (FDE), the treatment option preferred by most regional councils, is commonly practised with the use of small travelling irrigators. Field observations indicate that FDE application by rotating irrigators to artificially drained soils can generate drainage contaminated with partially treated FDE, even when the set application depth is less than the soil water deficit. The uniformity of FDE application from rotating, modified‐rotating, and oscillating travelling irrigators was determined for a range of application depths and wind conditions. The rotating irrigator produced a bimodal application profile with a two‐ to threefold difference between the highest and lowest application depths. These high application depths are likely to result in drainage of partially treated FDE in late winter and spring when soil moisture deficits are often small. A rotating irrigator was modified with splash plates or an irrigation bar that diverted more FDE to the centre of the application profile. Neither modification improved the uniformity of application. The most uniform application profiles were obtained using a new‐technology oscillating irrigator. The measured application profiles and a soil water balance were used to simulate the drainage and nutrient loss under each irrigator type. In early spring and late autumn, when soil water deficits were low, the more uniform application profile of the oscillating irrigator, set at its lowest application depth of 10 mm, created less risk of partially treated FDE reaching pipe drains. The simulation model estimated that when operating at a set average application depth of 25 mm the rotating irrigator and oscillating irrigator required soil water deficits of 44 and 32 mm, respectively, to avoid generating drainage. When FDE was applied at 25 mm depth with a deficit of only 18 mm, substantial quantities (30%) of partially treated effluent were estimated to have been drained from the soil no matter which irrigator was used. When application depth equalled the moisture deficit, the more uniform oscillating irrigator had a lower drainage loss (7%) compared with the rotating irrigator (14%). With only a small buffer of 7 mm between soil moisture deficit and application depth it was estimated that the oscillating irrigator achieved zero drainage. When set at their fastest travel speeds, the peak application depths of the rotating and the oscillating irrigators were similar (13 mm) and therefore these irrigators have the same number of operational irrigation days at times when soil water deficits are low.
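The drainage simulation described above pairs each irrigator's application profile with a soil water balance: wherever the locally applied depth exceeds the local soil water deficit, the excess is counted as drainage. A minimal sketch of that idea follows; the profiles and the uniform-deficit simplification are assumptions for illustration, not the measured profiles or the study's model.

    # Sketch: estimate the drainage fraction when an irrigator's non-uniform application
    # profile meets a uniform soil water deficit. Profiles below are illustrative only.

    def drainage_fraction(profile_mm, deficit_mm):
        """Share of applied effluent that locally exceeds the soil water deficit."""
        applied = sum(profile_mm)
        drained = sum(max(depth - deficit_mm, 0.0) for depth in profile_mm)
        return drained / applied

    # Hypothetical bimodal profile (rotating irrigator) vs a flatter one (oscillating),
    # both averaging 25 mm, evaluated at an 18 mm soil water deficit.
    rotating = [10.0, 40.0, 25.0, 40.0, 10.0]     # strongly bimodal, mean 25 mm
    oscillating = [22.0, 26.0, 29.0, 26.0, 22.0]  # near-uniform, mean 25 mm
    for name, profile in (("rotating", rotating), ("oscillating", oscillating)):
        print(name, round(100 * drainage_fraction(profile, 18.0), 1), "% drained")

As in the study, the more uniform profile loses less to drainage at the same average application depth.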


Communications in Soil Science and Plant Analysis | 2005

An Improved Procedure for Determining Magnesium Fertilizer Dissolution in Field Soils

P. Loganathan; A. D. Mitchell; J. A. Hanly; Tin Maung Aye

Abstract The sequential extraction procedure currently used to measure magnesium (Mg) fertilizer dissolution in soils consists of removing dissolved Mg (step 1), and partially dissolved Mg (step 2), followed by an 18‐h extraction with 2 M HCl at room temperature to determine undissolved Mg (step 3). This procedure is satisfactory for soluble and moderately soluble Mg fertilizers but is not an accurate procedure for slightly soluble fertilizers, such as serpentine. When step 3 is replaced by a digestion procedure using 2 M HCl for 4 h at 90–95°C (improved step 3), the total serpentine Mg recovery (dissolved and undissolved Mg) from soil samples, either immediately after serpentine was added to soil or after a 21‐day incubation with moist soil, was about 100% compared to 40–50% by the original procedure. The improved procedure also increased the recovery of serpentine Mg applied to field soils. Therefore, this study recommends that the third step of the sequential extraction procedure be replaced by a 4 h digestion using 2 M HCl (90–95°C).


New Zealand Journal of Agricultural Research | 2005

Effect of serpentine rock and its acidulated products as magnesium fertilisers for pasture, compared with magnesium oxide and Epsom salts, on a Pumice Soil. 2. Dissolution and estimated leaching loss of fertiliser magnesium

P. Loganathan; J. A. Hanly; L. D. Currie

Abstract The dissolution rate of magnesium (Mg) fertilisers controls their effectiveness in supplying Mg to plants and the potential for fertiliser‐Mg to be lost via leaching. Results from a field trial, conducted on pasture on an Immature Orthic Pumice Soil (pH (water) 6.3) treated with different types of Mg fertilisers (100 kg Mg ha–1), showed that Mg dissolution over a 29‐month period differed, being 15–20% for serpentine rock products, 50–98% for acidulated serpentine products, 95% for E‐mag (magnesium oxide), and 98% for Epsom salts. The percentage dissolution of applied fertiliser‐Mg was related to the water solubilities for all the fertilisers except E‐mag, which had a high dissolution rate in soil but a very low solubility in water. However, E‐mag had high Mg solubility in citric acid, consistent with its dissolution rate in soil. Epsom salts, E‐mag, and acidulated serpentine products significantly increased exchangeable Mg in soil samples collected 9 and 29 months after fertiliser application, whereas the unacidulated serpentine rock increased exchangeable Mg only in soil samples collected after 29 months and only when it was re‐applied annually for 3 years. The recovery of fertiliser Mg in pasture herbage was positively related to the Mg dissolution rate over the duration of the trial, being 4–8% for serpentine rock products, 19–22% for acidulated serpentine products, 17% for E‐mag, and 25% for Epsom salts. For all fertilisers, except E‐mag, total recovery of fertiliser Mg in the soil (0–15 cm depth) and herbage combined was lower for fertilisers with the higher rates of Mg dissolution, being 51% for Epsom salts, 53–90% for acidulated serpentine products, 91–95% for serpentine rock products, and 90% for E‐mag. Fertiliser Mg not recovered was assumed to have been leached below the 0–15 cm soil depth (49% for Epsom salts, 10–47% for acidulated serpentine products, 5–9% for serpentine products, and 10% for E‐mag). The very high fertiliser Mg recoveries in soil (0–15 cm depth) and pasture herbage, and consequently low estimated fertiliser Mg leaching losses from the less water‐soluble fertilisers, suggest that these fertilisers have potential for supplying Mg to pasture over a prolonged period if the rate of fertiliser Mg dissolution does not appreciably slow down with time. However, re‐applications of these less soluble Mg fertilisers may be required on a regular basis to ensure that the supply of Mg is adequate for pasture growth and to meet stock requirements.
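The estimated leaching losses above are obtained by difference: fertiliser Mg not recovered in the topsoil (0–15 cm) or herbage is assumed to have leached. A minimal sketch of that mass balance, using the recovery figures reported in the abstract:

    # Sketch: estimated Mg leaching loss by difference, as described in the abstract:
    # leached (%) = 100 - combined recovery (%) in soil (0-15 cm) and herbage.

    def estimated_leaching_percent(total_recovery_percent: float) -> float:
        """Unrecovered fertiliser Mg is assumed to have leached below 15 cm."""
        return 100.0 - total_recovery_percent

    # Recoveries reported above: Epsom salts 51%, serpentine rock 91-95%, E-mag 90%.
    for fertiliser, recovery in (("Epsom salts", 51.0),
                                 ("serpentine rock", 93.0),
                                 ("E-mag", 90.0)):
        print(fertiliser, estimated_leaching_percent(recovery), "% leached (estimated)")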
