
Publication


Featured research published by Brent L. Finley.


Journal of Toxicology and Environmental Health | 2002

Is hexavalent chromium carcinogenic via ingestion? A weight-of-evidence review.

Deborah M. Proctor; Joanne M. Otani; Brent L. Finley; Dennis J. Paustenbach; Judith A. Bland; Ned A. Speizer; Edward V. Sargent

Hexavalent chromium [Cr(VI)] is recognized as a human carcinogen via inhalation, based on elevated rates of lung cancer among occupationally exposed workers in certain industries. Cr(VI) is also genotoxic in bacterial and mammalian cell lines. In contrast, scientific panels in the United States and abroad have reviewed the weight of evidence (WOE) and decided that the available data are insufficient to conclude that Cr(VI) is an oral carcinogen. A criterion of 0.2 ppb was established by a California agency for Cr(VI) in drinking water to prevent cancer; however, this criterion was withdrawn in November 2001. This criterion was remarkably lower than the promulgated California and federal drinking-water standards for total chromium of 50 ppb and 100 ppb, respectively. Both of the promulgated standards are designed to be protective of humans who ingest Cr(VI). This article describes a WOE analysis to examine the likelihood that Cr(VI) in drinking water poses a cancer hazard at the current U.S. drinking-water standard. The results indicate that: (1) From the historical epidemiological studies, there are a few reports of increased rates of digestive system cancer among Cr(VI)-exposed workers, although most are not statistically significant; (2) the preponderance of evidence from recent epidemiological studies of Cr(VI)-exposed workers does not support an increased risk of cancer outside of the respiratory system; (3) studies of four environmentally exposed populations are negative; (4) there is only one lifetime animal feeding study, and the findings from that study are considered to be flawed and inconclusive; and (5) recent kinetics and in vivo genotoxicity data demonstrate that Cr(VI) is reduced to nontoxic Cr(III) in saliva, in the acidic conditions of the stomach, and in blood. In short, at concentrations at least as high as the current U.S. 
maximum contaminant level (100 ppb), and probably at least an order of magnitude higher, Cr(VI) is reduced to Cr(III) prior to or upon systemic absorption. The weight of scientific evidence supports that Cr(VI) is not carcinogenic in humans via the oral route of exposure at permissible drinking-water concentrations.


Toxicology and Applied Pharmacology | 1996

Absorption and elimination of trivalent and hexavalent chromium in humans following ingestion of a bolus dose in drinking water

Brent D. Kerger; Dennis J. Paustenbach; G.E. Corbett; Brent L. Finley

These studies investigate the magnitude and valence state of chromium absorbed following plausible drinking water exposures to chromium(VI). Four adult male volunteers ingested a single dose of 5 mg Cr (in 0.5 liters deionized water) in three chromium mixtures: (1) Cr(III) chloride (CrCl3); (2) potassium dichromate reduced with orange juice [Cr(III)-OJ]; and (3) potassium dichromate [Cr(VI)]. Blood and urine chromium levels were followed for 1-3 days prior to and up to 12 days after ingestion. The three mixtures showed quite different pharmacokinetic patterns. CrCl3 was poorly absorbed (estimated 0.13% bioavailability) and rapidly eliminated in urine (excretion half-life, approximately 10 hr), whereas Cr(III)-OJ was absorbed more efficiently (0.60% bioavailability) but more slowly (half-life, approximately 17 hr), and Cr(VI) had the highest bioavailability (6.9%) and the longest half-life (approximately 39 hr). All three chromium mixtures caused temporary elevations in red blood cell (RBC) and plasma chromium concentrations, but the magnitude and duration of elevation showed a clear trend (Cr(VI) > Cr(III)-OJ > CrCl3). The data suggest that nearly all the ingested Cr(VI) was reduced to Cr(III) before entering the bloodstream based on comparison to RBC and plasma chromium patterns in animals exposed to high doses of Cr(VI). These findings support our prior work which suggests that water-soluble organic complexes of Cr(III) formed during the reduction of Cr(VI) in vivo explain the patterns of blood uptake and urinary excretion in humans at drinking water concentrations of 10 mg/liter or less.
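The excretion half-lives reported above imply standard first-order elimination kinetics. A minimal sketch of how such half-lives translate into the fraction of absorbed chromium remaining over time (the function name is illustrative; the half-life values are those quoted in the abstract, and simple one-compartment first-order elimination is an assumption):

```python
import math

def fraction_remaining(t_hours: float, half_life_hours: float) -> float:
    """Fraction of an absorbed dose remaining after t_hours,
    assuming one-compartment first-order elimination."""
    k = math.log(2) / half_life_hours  # elimination rate constant, 1/hr
    return math.exp(-k * t_hours)

# Approximate excretion half-lives from the abstract:
half_lives = {"CrCl3": 10, "Cr(III)-OJ": 17, "Cr(VI)": 39}

for mixture, t_half in half_lives.items():
    print(f"{mixture}: {fraction_remaining(24, t_half):.0%} remaining after 24 hr")
```

The longer half-life of the Cr(VI) mixture is what produces the more sustained blood and urine elevations described above, even before differences in bioavailability are considered.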


Science of The Total Environment | 2010

Physical and chemical characterization of tire-related particles: Comparison of particles generated using different methodologies

Marisa L. Kreider; Julie M. Panko; Britt McAtee; Leonard I. Sweet; Brent L. Finley

The purpose of this study was to characterize the physical and chemical properties of particles generated from the interaction of tires and road surfaces. Morphology, size distribution, and chemical composition were compared between particles generated using different methods, including on-road collection, laboratory generation under simulated driving conditions, and cryogenic breaking of tread rubber. Both on-road collected and laboratory generated particles exhibited the elongated shape typical of tire wear particles, whereas tread particles were more angular. Despite similar morphology for the on-road collected and the laboratory generated particles, the former were smaller on average. It is not clear at this stage if the difference is significant to the physical and chemical behavior of the particles. The chemical composition of the particles differed, with on-road generated particles containing chemical contributions from sources other than tires, such as pavement or particulates generated from other traffic-related sources. Understanding the differences between these particles is essential in apportioning contaminant contributions to the environment between tires, roadways, and other sources, and evaluating the representativeness of toxicity studies using different types of particulate generated.


International Journal of Toxicology | 1997

The Critical Role of House Dust in Understanding the Hazards Posed by Contaminated Soils

Dennis J. Paustenbach; Brent L. Finley; Thomas F. Long

The health risks posed by soil pollutants are generally thought to be due to soil ingestion and have often resulted in massive regulatory efforts to remedy such contamination. The contribution of this route to the actual human health hazard has been questioned, however, as soil removal alone seems to have little influence on the body burdens of soil contaminants in exposed individuals. Ongoing research also has repeatedly and substantially reduced the estimates of soil ingested daily. Because comparatively little time is spent outdoors by most individuals, exposure to soil brought indoors, present as house dust, is now thought to be nearly as important as the direct ingestion of soil. Exposure via house dust has not been studied specifically, but several observations suggest that it may be important. Dust is largely composed of fine particles of tracked-in soil. The smaller dust particles cling to surfaces better than soil, and contaminant concentrations are often higher in house dust. Fine particles are likely to be more bioavailable, and degradation is slower indoors. Contaminants thus may be concentrated and more readily available in the areas most frequented. In some studies, contaminant levels in dust are correlated more closely with body burdens of contaminants than other sources, suggesting that this route should be considered when assessing risks from soil. Until more research addressing exposure to dust is conducted, recommendations for assessing potential health risks from this pathway are provided.


Journal of Toxicology and Environmental Health, Part B: Critical Reviews | 2007

State-of-the-Science Review: Does Manganese Exposure During Welding Pose a Neurological Risk?

Annette B. Santamaria; Colleen A. Cushing; James M. Antonini; Brent L. Finley; Fionna Mowat

Recent studies report that exposure to manganese (Mn), an essential component of welding electrodes and some steels, results in neurotoxicity and/or Parkinson's disease (PD) in welders. This “state-of-the-science” review presents a critical analysis of the published studies that were conducted on a variety of Mn-exposed occupational cohorts during the last 100 yr, as well as the regulatory history of Mn and welding fumes. Welders often perform a variety of different tasks with varying degrees of duration and ventilation, and hence, to accurately assess Mn exposures that occurred in occupational settings, some specific information on the historical work patterns of welders is desirable. This review includes a discussion of the types of exposures that occur during the welding process—for which limited information relating airborne Mn levels with specific welding activities exists—and the human health studies evaluating neurological effects in welders and other Mn-exposed cohorts, including miners, millers, and battery workers. Findings and implications of studies specifically conducted to evaluate neurobehavioral effects and the prevalence of PD in welders are also discussed. Existing exposure data indicate that, in general, Mn exposures in welders are less than those associated with the reports of clinical neurotoxicity (e.g., “manganism”) in miners and smelter workers. It was also found that although manganism was observed in highly exposed workers, the scant exposure-response data available for welders do not support a conclusion that welding is associated with clinical neurotoxicity. The available data might support the development of reasonable “worst-case” exposure estimates for most welding activities, and suggest that exposure simulation studies would significantly refine such estimates. Our review ends with a discussion of the data gaps and areas for future research.


Journal of Occupational and Environmental Medicine | 2005

Evaluation of epidemiologic and animal data associating pesticides with Parkinson's disease

Abby A. Li; Pamela J. Mink; Laura J. McIntosh; Teta Mj; Brent L. Finley

Exposure to pesticides may be a risk factor for developing Parkinson’s disease (PD). To evaluate the evidence regarding this association in the scientific literature, we examined both analytic epidemiologic studies of PD cases in which exposure to pesticides was queried directly and whole-animal studies for PD-like effects after systemic pesticide exposure. Epidemiologic studies were considered according to study quality parameters, and results were found to be mixed and without consistent exposure-response or pesticide-specific patterns. These epidemiologic studies were limited by a lack of detailed and validated pesticide exposure assessment. In animal studies, no pesticide has yet demonstrated the selective set of clinical and pathologic signs that characterize human PD, particularly at levels relevant to human populations. We conclude that the animal and epidemiologic data reviewed do not provide sufficient evidence to support a causal association between pesticide exposure and PD.


Critical Reviews in Toxicology | 2013

A review of the health hazards posed by cobalt

Dennis J. Paustenbach; Brooke E. Tvermoes; Kenneth M. Unice; Brent L. Finley; Brent D. Kerger

Cobalt (Co) is an essential element with ubiquitous dietary exposure and possible incremental exposure due to dietary supplements, occupation and medical devices. Adverse health effects, such as cardiomyopathy and vision or hearing impairment, were reported at peak blood Co concentrations typically over 700 µg/L (8–40 weeks), while reversible hypothyroidism and polycythemia were reported in humans at ∼300 µg/L and higher (≥2 weeks). Lung cancer risks associated with certain inhalation exposures have not been observed following Co ingestion and Co alloy implants. The mode of action for systemic toxicity relates directly to free Co(II) ion interactions with various receptors, ion channels and biomolecules resulting in generally reversible effects. Certain dose–response anomalies for Co toxicity likely relate to rare disease states known to reduce systemic Co(II)-ion binding to blood proteins. Based on the available information, most people with clearly elevated serum Co, like supplement users and hip implant patients, have >90% of Co as albumin-bound, with considerable excess binding capacity to sequester Co(II) ions. This paper reviews the scientific literature regarding the chemistry, pharmacokinetics and systemic toxicology of Co, and the likely role of free Co(II) ions to explain dose–response relationships. Based on currently available data, it might be useful to monitor implant patients for signs of hypothyroidism and polycythemia starting at blood or serum Co concentrations above 100 µg/L. This concentration is derived by applying an uncertainty factor of 3 to the 300 µg/L point of departure and this should adequately account for the fact that persons in the various studies were exposed for less than one year. A higher uncertainty factor could be warranted but Co has a relatively fast elimination, and many of the populations studied included children and persons with kidney problems. 
Closer follow-up of patients who also exhibit chronic disease states leading to clinically important hypoalbuminemia and/or severe ischemia modified albumin (IMA) elevations should be considered.
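The 100 µg/L monitoring trigger described above is derived by dividing a point of departure by an uncertainty factor, the standard screening-value arithmetic in toxicology. A minimal sketch of that calculation (the function name is illustrative; the values are those quoted in the abstract):

```python
def screening_value(point_of_departure_ug_per_L: float,
                    uncertainty_factor: float) -> float:
    """Derive a screening/monitoring concentration by dividing a
    point of departure by a composite uncertainty factor."""
    return point_of_departure_ug_per_L / uncertainty_factor

# Values from the review: 300 ug/L point of departure (reversible
# hypothyroidism/polycythemia), uncertainty factor of 3.
print(screening_value(300, 3))  # -> 100.0
```

A larger uncertainty factor would lower the trigger concentration proportionally, which is why the review's discussion of elimination rate and study populations bears directly on the choice of factor.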


Mutation Research/Genetic Toxicology | 1996

Interlaboratory validation of a new assay for DNA-protein crosslinks

Max Costa; Anatoly Zhitkovich; Michael L. Gargas; Dennis J. Paustenbach; Brent L. Finley; Jim R. Kuykendall; Ruth E. Billings; Timothy J. Carlson; Karen E. Wetterhahn; Jian Xu; Steven R. Patierno; Matthew S. Bogdanffy

In 1992, a simple and sensitive assay for detecting DNA-protein crosslinks was developed [1]. In an effort to facilitate the greater use of the assay, a number of studies were conducted to evaluate its reliability and reproducibility. During this work, the assay was used to assess whether various metals and other compounds could induce crosslinks in cultured human lymphocytes (Epstein-Barr virus-transformed Burkitt's lymphoma cell line). Potassium permanganate, mercury chloride, lead nitrate, magnesium perchlorate, aluminum chloride, and cadmium chloride did not induce DNA-protein crosslinks at either cytotoxic or non-cytotoxic levels. Copper sulfate, arsenic trioxide, and potassium chromate induced DNA-protein crosslinks only at cytotoxic concentrations. Acute lethality of the cells was assessed immediately after exposure to metals by trypan blue exclusion while long-term lethality was assessed by cell proliferation and trypan blue exclusion following an incubation period of 5 days after exposure to the metal compound. All metals exhibited more toxicity in the long-term lethality assay compared to the short-term assay. The cultured human lymphocytes treated with various doses of lead acetate, cadmium chloride, arsenic trioxide and copper sulfate, as well as cis-platinum and chromate, were sent to four different laboratories to compare the reliability and reproducibility of the DNA-protein crosslink assay. Depending on the chemical studied, there were quantitative differences in the results observed among the various laboratories using the assay. However, all laboratories generally showed that cis-platinum, chromate, arsenic trioxide and copper sulfate induced DNA-protein crosslinks at levels that produced acute cytotoxicity, whereas cadmium chloride and lead acetate did not.


Journal of Toxicology and Environmental Health | 2003

Human health risk and exposure assessment of chromium (VI) in tap water.

Dennis J. Paustenbach; Brent L. Finley; Fionna Mowat; Brent D. Kerger

Hexavalent chromium [Cr(VI)] has been detected in groundwater across the United States due to industrial and military operations, including plating, painting, cooling-tower water, and chromate production. Because inhalation of Cr(VI) can cause lung cancer in some persons exposed to a sufficient airborne concentration, questions have been raised about the possible hazards associated with exposure to Cr(VI) in tap water via ingestion, inhalation, and dermal contact. Although ingested Cr(VI) is generally known to be converted to Cr(III) in the stomach following ingestion, prior to the mid-1980s a quantitative analysis of the reduction capacity of the human stomach had not been conducted. Thus, risk assessments of the human health hazard posed by contaminated drinking water contained some degree of uncertainty. This article presents the results of nine studies, including seven dose reconstruction or simulation studies involving human volunteers, that quantitatively characterize the absorbed dose of Cr(VI) following contact with tap water via all routes of exposure. The methodology used here illustrates an approach that permits one to understand, within a very narrow range, the possible intake of Cr(VI) and the associated health risks for situations where little is known about historical concentrations of Cr(VI). Using red blood cell uptake and sequestration of chromium as an in vivo metric of Cr(VI) absorption, the primary conclusions of these studies were that: (1) oral exposure to concentrations of Cr(VI) in water up to 10 mg/L (ppm) does not overwhelm the reductive capacity of the stomach and blood, (2) the inhaled dose of Cr(VI) associated with showering at concentrations up to 10 mg/L is so small as to pose a de minimis cancer hazard, and (3) dermal exposures to Cr(VI) in water at concentrations as high as 22 mg/L do not overwhelm the reductive capacity of the skin or blood. 
Because Cr(VI) in water appears yellow at approximately 1-2 mg/L, the studies represent conditions beyond the worst-case scenario for voluntary human exposure. Based on a physiologically based pharmacokinetic model for chromium derived from published studies, coupled with the dose reconstruction studies presented in this article, the available information clearly indicates that (1) Cr(VI) ingested in tap water at concentrations below 2 mg/L is rapidly reduced to Cr(III), and (2) even trace amounts of Cr(VI) are not systemically circulated. This assessment indicates that exposure to Cr(VI) in tap water via all plausible routes of exposure, at concentrations well in excess of the current U.S. Environmental Protection Agency (EPA) maximum contaminant level of 100 µg/L (ppb), and perhaps those as high as several parts per million, should not pose an acute or chronic health hazard to humans. These conclusions are consistent with those recently reached by a panel of experts convened by the State of California.


Journal of Toxicology and Environmental Health | 1997

Ingestion of chromium(VI) in drinking water by human volunteers: Absorption, distribution, and excretion of single and repeated doses

Brent D. Kerger; Brent L. Finley; Corbett Ge; Dodge Dg; Dennis J. Paustenbach

This study examines the magnitude of hexavalent chromium [Cr(VI)] absorption, distribution, and excretion following oral exposure to 5 and 10 mg Cr(VI)/L in drinking water administered as a single bolus dose (0.5 L swallowed in 2 min) or for 3 d at a dosage of 1 L/d (3 doses of 0.33 L each day, at 6-h intervals). Adult male volunteers ingested deionized water containing various concentrations of potassium chromate, and samples of urine, plasma, and red blood cells (RBCs) were collected and analyzed for total chromium throughout the studies. In the bolus dose studies, a fairly consistent pattern of urinary chromium excretion was observed, with an average half-life of about 39 h. However, 4-d total urinary chromium excretion and peak concentrations in urine and blood varied considerably among the 5 volunteers. Studies of repeated exposure to smaller volumes ingested at a more gradual rate (i.e., 0.33 L over 5-15 min) showed similar urinary chromium excretion patterns but generally lower chromium uptake/excretion. Given that sustained elevations in RBC chromium levels provide a specific indication of chromium absorption in the hexavalent state, these data suggest that virtually all (> 99.7%) of the ingested Cr(VI) at 5 and 10 mg Cr(VI)/L was reduced to Cr(III) before entering the bloodstream. The interindividual differences in total chromium uptake and excretion are plausibly explained by ingestion of appreciable doses on an empty stomach, which likely results in the formation of well-absorbed Cr(III) organic complexes in gastrointestinal tissues and possibly the blood. The lack of any clinical indications of toxicity in the volunteers and the patterns of blood uptake and urinary excretion of chromium are consistent with a predominant uptake of Cr(III) organic complexes [derived from Cr(VI)] that are excreted more slowly than inorganic forms of Cr(III). 
Therefore, it appears that the endogenous reducing agents within the upper gastrointestinal tract and the blood provide sufficient reducing potential to prevent any substantial systemic uptake of Cr(VI) following drinking-water exposures at 5-10 mg Cr(VI)/L. Based on these data, the chemical environment in the gastrointestinal tract and the blood is effective even under relative fasting conditions in reducing Cr(VI) to one or more forms of Cr(III).

Collaboration


Dive into Brent L. Finley's collaboration network.

Top Co-Authors

Paul K. Scott (Centers for Disease Control and Prevention)
Brent D. Kerger (Wellington Management Company)
David A. Galbraith (Palo Alto Medical Foundation)