June T. Spector
University of Washington
Publications
Featured research published by June T. Spector.
The American Journal of Medicine | 2010
June T. Spector; Susan R. Kahn; Miranda R. Jones; Monisha Jayakumar; Deepan Dalal; Saman Nazarian
BACKGROUND Observational studies, including recent large cohort studies that were unavailable for prior meta-analysis, have suggested an association between migraine headache and ischemic stroke. We performed an updated meta-analysis to quantitatively summarize the strength of association between migraine and ischemic stroke risk. METHODS We systematically searched electronic databases, including MEDLINE and EMBASE, through February 2009 for studies of human subjects in the English language. Study selection using a priori selection criteria, data extraction, and assessment of study quality were conducted independently by reviewer pairs using standardized forms. RESULTS Twenty-one (60%) of 35 studies met the selection criteria, for a total of 622,381 participants (13 case-control, 8 cohort studies) included in the meta-analysis. The pooled adjusted odds ratio of ischemic stroke comparing migraineurs with nonmigraineurs using a random effects model was 2.30 (95% confidence interval [CI], 1.91-2.76). The pooled adjusted effect estimates for studies that reported relative risks and hazard ratios, respectively, were 2.41 (95% CI, 1.81-3.20) and 1.52 (95% CI, 0.99-2.35). The overall pooled effect estimate was 2.04 (95% CI, 1.72-2.43). Results were robust to sensitivity analyses excluding lower quality studies. CONCLUSIONS Migraine is associated with increased ischemic stroke risk. These findings underscore the importance of identifying high-risk migraineurs with other modifiable stroke risk factors. Future studies of the effect of migraine treatment and modifiable risk factor reduction on stroke risk in migraineurs are warranted.
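As context for the pooled estimates above, the following minimal Python sketch illustrates DerSimonian-Laird random-effects pooling of study odds ratios. The study values in the example are made up for illustration and are not the data from this meta-analysis.

```python
import numpy as np

def pool_random_effects(or_estimates, ci_lower, ci_upper):
    """Pool study odds ratios with a DerSimonian-Laird random-effects model.

    Inputs are per-study odds ratios and 95% CI bounds (hypothetical values);
    per-study variances are recovered from the CI width on the log scale.
    """
    y = np.log(or_estimates)                             # log odds ratios
    se = (np.log(ci_upper) - np.log(ci_lower)) / (2 * 1.96)
    v = se ** 2

    w = 1.0 / v                                          # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)                   # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)              # between-study variance

    w_star = 1.0 / (v + tau2)                            # random-effects weights
    y_re = np.sum(w_star * y) / np.sum(w_star)
    se_re = np.sqrt(1.0 / np.sum(w_star))
    return np.exp(y_re), np.exp([y_re - 1.96 * se_re, y_re + 1.96 * se_re])

# Example with made-up study estimates (not the studies pooled above):
print(pool_random_effects(np.array([2.1, 2.6, 1.8]),
                          np.array([1.5, 1.9, 1.1]),
                          np.array([2.9, 3.6, 2.9])))
```

The between-study variance tau² is what distinguishes the random-effects weights from fixed-effect weights; when there is no heterogeneity, the two pooled estimates coincide.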
Journal of Biological Chemistry | 2008
Nathan E. Hellman; June T. Spector; Jonathan Robinson; Xiaofeng Zuo; Sophie Saunier; Corinne Antignac; John W. Tobias; Joshua H. Lipschutz
A classic model of tubulogenesis utilizes Madin-Darby canine kidney (MDCK) cells. MDCK cells form monoclonal cysts in three-dimensional collagen and tubulate in response to hepatocyte growth factor, which activates multiple signaling pathways, including the mitogen-activated protein kinase (MAPK) pathway. It was shown previously that MAPK activation is necessary and sufficient to induce the first stage of tubulogenesis, the partial epithelial to mesenchymal transition (p-EMT), whereas matrix metalloproteinases (MMPs) are necessary for the second redifferentiation stage. To identify specific MMP genes, their regulators, tissue inhibitors of matrix metalloproteinases (TIMPs), and the molecular pathways by which they are activated, we used two distinct MAPK inhibitors and a technique we have termed subtraction pathway microarray analysis. Of the 19 MMPs and 3 TIMPs present on the Canine Genome 2.0 Array, MMP13 and TIMP1 were up-regulated 198- and 169-fold, respectively, via the MAPK pathway. This was confirmed by two-dimensional and three-dimensional real time PCR, as well as in MDCK cells inducible for the MAPK gene Raf. Knockdown of MMP13 using short hairpin RNA prevented progression past the initial phase of p-EMT. Knockdown of TIMP1 prevented normal cystogenesis, although the initial phase of p-EMT did occasionally occur. The MMP13 knockdown phenotype is likely because of decreased collagenase activity, whereas the TIMP1 knockdown phenotype appears due to increased apoptosis. These data suggest a model, which may also be important for development of other branched organs, whereby the MAPK pathway controls both MDCK p-EMT and redifferentiation, in part by activating MMP13 and TIMP1.
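The "subtraction pathway microarray analysis" described above can be pictured as a set difference between two differential-expression results: genes induced by the stimulus alone but not when the MAPK pathway is inhibited are assigned to the MAPK arm. A minimal sketch using hypothetical log2 fold changes, not the study's array data:

```python
import pandas as pd

# Hypothetical probe-level log2 fold changes under two conditions:
#   hgf       - HGF stimulation alone
#   hgf_mapki - HGF plus a MAPK pathway inhibitor
expr = pd.DataFrame({
    "gene":      ["MMP13", "TIMP1", "MMP2", "GENE_X"],
    "hgf":       [7.6, 7.4, 1.2, 3.0],   # ~2**7.6 ≈ 194-fold, ~2**7.4 ≈ 169-fold
    "hgf_mapki": [0.3, 0.2, 1.1, 2.8],
})

THRESHOLD = 1.0  # log2 fold-change cutoff (an assumption for illustration)

# "Subtraction": keep genes induced by HGF whose induction is lost with MAPK inhibition.
mapk_dependent = expr[(expr["hgf"] >= THRESHOLD) & (expr["hgf_mapki"] < THRESHOLD)]
print(mapk_dependent["gene"].tolist())   # ['MMP13', 'TIMP1']
```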
The Journal of Pain | 2013
Ardith Z. Doorenbos; Deborah B. Gordon; David Tauben; Jenny Palisoc; Mark Drangsholt; Taryn Lindhorst; Jennifer Danielson; June T. Spector; Ruth Ballweg; Linda Vorvick; John D. Loeser
To improve U.S. pain education and promote interinstitutional and interprofessional collaborations, the National Institutes of Health Pain Consortium has funded 12 sites to develop Centers of Excellence in Pain Education (CoEPEs). Each site was given the tasks of development, evaluation, integration, and promotion of pain management curriculum resources, including case studies that will be shared nationally. Collaborations among schools of medicine, dentistry, nursing, pharmacy, and others were encouraged. The John D. Loeser CoEPE is unique in that it represents extensive regionalization of health science education, in this case in the region covering the states of Washington, Wyoming, Alaska, Montana, and Idaho. This paper describes a blueprint of pain content and teaching methods across the University of Washington's 6 health sciences schools and provides recommendations for improvement in pain education at the prelicensure level. The Schools of Dentistry and Physician Assistant provide the highest percentage of total required curriculum hours devoted to pain compared with the Schools of Medicine, Nursing, Pharmacy, and Social Work. The findings confirm the paucity of pain content in health sciences curricula, missing International Association for the Study of Pain curriculum topics, and limited use of innovative teaching methods such as problem-based and team-based learning. PERSPECTIVE Findings confirm the paucity of pain education across the health sciences curriculum in a CoEPE that serves a large region in the United States. The data provide a pain curriculum blueprint that can be used to recommend added pain content in health sciences programs across the country.
Nephrology Dialysis Transplantation | 2011
June T. Spector; Ana Navas-Acien; Jeffrey J. Fadrowski; Eliseo Guallar; Bernard G. Jaar; Virginia M. Weaver
BACKGROUND Low-level lead exposure is widespread and has been implicated as a chronic kidney disease (CKD) risk factor. However, studies evaluating associations of lead dose with newer, potentially more accurate, estimates of kidney function, in participants with a wide range of glomerular filtration rates (GFRs), are scarce. METHODS We compared associations of blood lead and estimated glomerular filtration rate (eGFR) using the Modification of Diet in Renal Disease (MDRD), Chronic Kidney Disease Epidemiology Collaboration (CKD-EPI) and cystatin C single variable, multivariable and combined creatinine/cystatin C equations in 3941 adults who participated in the 1999-2002 National Health and Nutrition Examination Survey cystatin C subsample. RESULTS Geometric mean blood lead was 1.7 μg/dL. After multivariable adjustment, differences [95% confidence interval (CI)] in mean eGFR for a doubling of blood lead were -1.9 (-3.2, -0.7), -1.7 (-3.0, -0.5) and -1.4 (-2.3, -0.5) mL/min/1.73 m², using the cystatin C single variable, multivariable and combined creatinine/cystatin C equations, respectively, reflecting lower eGFR with increased blood lead. The corresponding differences (95% CI) were -0.9 (-1.9, 0.02) and -0.9 (-1.8, 0.01) using the creatinine-based MDRD and CKD-EPI equations, respectively. In participants aged ≥60 years, differences in mean eGFR ranged from -3.0 to -4.5 mL/min/1.73 m², and odds of reduced eGFR (<60 mL/min/1.73 m²) were increased for all estimates of GFR. CONCLUSIONS These results support the inclusion of cystatin C-based eGFR in future lead research and provide additional evidence for environmental lead exposure as a CKD risk factor.
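For reference, the creatinine-based CKD-EPI equation mentioned above (the 2009 version) can be written as a short function. Serum creatinine is in mg/dL, and the example inputs are hypothetical rather than drawn from the NHANES analysis.

```python
def ckd_epi_2009_egfr(scr_mg_dl, age_years, female, black):
    """Estimated GFR (mL/min/1.73 m^2) from the 2009 CKD-EPI creatinine equation."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    egfr = (141
            * min(scr_mg_dl / kappa, 1.0) ** alpha
            * max(scr_mg_dl / kappa, 1.0) ** -1.209
            * 0.993 ** age_years)
    if female:
        egfr *= 1.018
    if black:
        egfr *= 1.159
    return egfr

# e.g. a hypothetical 60-year-old woman with serum creatinine 0.8 mg/dL
print(round(ckd_epi_2009_egfr(0.8, 60, female=True, black=False), 1))
```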
Environmental Research | 2011
Virginia M. Weaver; Nam Soo Kim; Byung Kook Lee; Patrick J. Parsons; June T. Spector; Jeffrey J. Fadrowski; Bernard G. Jaar; Amy J. Steuerwald; Andrew C. Todd; David K. Simon; Brian S. Schwartz
Cadmium is a well-known nephrotoxicant; chronic exposure increases risk for chronic kidney disease. Recently, however, associations between urine cadmium and higher creatinine-based estimated glomerular filtration rate (eGFR) have been reported. Analyses utilizing alternate biomarkers of kidney function allow evaluation of potential mechanisms for these observations. We compared associations of urine cadmium with kidney function measures based on serum cystatin C to those with serum creatinine in 712 lead workers. Mean (standard deviation) molybdenum-corrected urine cadmium, Modification of Diet in Renal Disease (MDRD) eGFR and multi-variable cystatin C eGFR were 1.02 (0.65) μg/g creatinine, and 97.4 (19.2) and 112.0 (17.7) mL/min/1.73 m², respectively. The eGFR measures were moderately correlated (r_s=0.5; p<0.001). After adjustment, ln (urine cadmium) was not associated with serum cystatin-C-based measures. However, higher ln (urine cadmium) was associated with higher creatinine-based eGFRs including the MDRD and an equation incorporating serum cystatin C and creatinine (beta-coefficient=4.1 mL/min/1.73 m²; 95% confidence interval=1.6, 6.6). Urine creatinine was associated with serum creatinine-based but not cystatin-C-based eGFRs. These results support a biomarker-specific, rather than a kidney function, effect underlying the associations observed between higher urine cadmium and creatinine-based kidney function measures. Given the routine use of serum and urine creatinine in kidney and biomarker research, additional research to elucidate the mechanism(s) for these associations is essential.
BMC Public Health | 2013
Michelle Lam; Jennifer Krenz; Pablo Palmández; Maria Negrete; Martha Perla; Helen Murphy-Robinson; June T. Spector
Background: Heat-related illness (HRI) is an important cause of non-fatal illness and death in farmworkers. We sought to identify potential barriers to HRI prevention and treatment in Latino farmworkers. Methods: We conducted three semi-structured focus group discussions with 35 Latino farmworkers in the Central Washington, USA area using participatory rural appraisal techniques. Interviews were audio taped and transcribed in Spanish. Three researchers reviewed and coded transcripts and field notes, and investigator triangulation was used to identify relevant themes and quotes. Results: Although the majority of participants in our study reported never receiving formal HRI training, most participants were aware that extreme heat can cause illness and were able to accurately describe HRI symptoms, risk factors, and certain prevention strategies. Four main observations regarding farmworkers' HRI-relevant beliefs and attitudes were identified: 1) farmworkers subscribe, to varying degrees, to the belief that cooling treatments should be avoided after heat exposure, with some avoiding such treatments and others encouraging their use; 2) the desire to lose weight may be reflected in behaviors that promote increased sweating; 3) highly caffeinated energy drinks are preferred to increase work efficiency and maintain alertness; and 4) the location of drinking water at work (e.g. next to restrooms) and whether water is clean, but not necessarily chemically-treated, are important considerations in deciding whether to drink the water provided at worksites. Conclusions: We identified potential barriers to HRI prevention and treatment related to hydration, certain HRI treatments, clothing use, and the desire to lose weight among Latino farmworkers. Strategies to address potential barriers to HRI prevention and treatment in this population may include engineering, administrative, and health education and health promotion strategies at individual, workplace, community, and societal levels. Although farmworkers in our study were able to describe HRI risk factors, reported practices were not necessarily consistent with reported knowledge. Further study of potential knowledge-behavior gaps may uncover opportunities for additional HRI prevention strategies. Farmworkers and employers should be included in the development and evaluation of interventions to prevent HRI.
Annals of Occupational Hygiene | 2014
June T. Spector; Perry E. Sheffield
The potential consequences of occupational heat stress in a changing climate on workers, workplaces, and global economies are substantial. Occupational heat stress risk is projected to become particularly high in middle- and low-income tropical and subtropical regions, where optimal controls may not be readily available. This commentary presents occupational heat stress in the context of climate change, reviews its impacts, and reflects on implications for heat stress assessment and control. Future efforts should address limitations of existing heat stress assessment methods and generate economical, practical, and universal approaches that can incorporate data of varying levels of detail, depending on resources. Validation of these methods should be performed in a wider variety of environments, and data should be collected and analyzed centrally for both local and large-scale hazard assessments and to guide heat stress adaptation planning. Heat stress standards should take into account variability in worker acclimatization, other vulnerabilities, and workplace resources. The effectiveness of controls that are feasible and acceptable should be evaluated. Exposure scientists are needed, in collaboration with experts in other areas, to effectively prevent and control occupational heat stress in a changing climate.
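The commentary does not name a specific index, but the wet-bulb globe temperature (WBGT), the basis of ISO 7243 and the ACGIH heat stress TLV, is the most widely used of the existing assessment methods it refers to. A minimal sketch of the standard WBGT formulas, with example inputs chosen for illustration:

```python
def wbgt_outdoor(t_nwb, t_g, t_db):
    """Outdoor WBGT with solar load: 0.7*Tnwb + 0.2*Tg + 0.1*Tdb (deg C)."""
    return 0.7 * t_nwb + 0.2 * t_g + 0.1 * t_db

def wbgt_indoor(t_nwb, t_g):
    """Indoor / no-solar-load WBGT: 0.7*Tnwb + 0.3*Tg (deg C)."""
    return 0.7 * t_nwb + 0.3 * t_g

# e.g. natural wet-bulb 25 C, globe 40 C, dry-bulb 33 C
print(wbgt_outdoor(25.0, 40.0, 33.0))   # 28.8
```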
Journal of Agromedicine | 2015
June T. Spector; Jennifer Krenz; Kristina N. Blank
Crop workers are at high risk of heat-related illness (HRI) from internal heat generated by heavy physical work, particularly when laboring in hot and humid conditions. The aim of this study was to identify risk factors for HRI symptoms in Washington crop workers using an audio computer-assisted self-interview (A-CASI) instrument that has undergone reliability and validity evaluation. A cross-sectional A-CASI survey of 97 crop workers in Washington State was conducted during the summer of 2013. Potential HRI risk factors in demographic, training, work, hydration, clothing, health, and environmental domains were selected a priori for evaluation. Mixed-effects logistic regression was used to identify risk factors for self-reported symptoms associated with heat strain and HRI (dizziness/light-headedness or heavy sweating) experienced at work in hot conditions. An increase in age was associated with a lower odds of HRI symptoms (odds ratio [OR] = 0.92; 95% confidence interval [CI] = 0.87–0.98). Piece rate compared with hourly payment (OR = 6.20; 95% CI = 1.11–34.54) and needing to walk for more than 3 minutes to get to the toilet, compared with less than 3 minutes (OR = 4.86; 95% CI = 1.18–20.06), were associated with a higher odds of HRI symptoms. In this descriptive study of risk factors for HRI symptoms in Washington crop workers, decreased age (and less work experience), piece rate pay, and longer distance to the toilet were associated with self-reported HRI symptoms. Modifiable workplace factors should be considered in HRI prevention efforts that are evaluated using objective measures in representative working populations.
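The odds ratios above come from mixed-effects logistic regression; converting a fitted coefficient to an odds ratio with a 95% confidence interval is a simple exponentiation. A sketch using a hypothetical coefficient (not the study's fitted value) chosen to land near the reported per-year age effect:

```python
import numpy as np

def odds_ratio_from_logit(beta, se):
    """Convert a logistic-regression coefficient and its standard error
    into an odds ratio with a 95% confidence interval."""
    or_point = np.exp(beta)
    ci = np.exp([beta - 1.96 * se, beta + 1.96 * se])
    return or_point, ci

# Hypothetical coefficient for a one-year increase in age
print(odds_ratio_from_logit(-0.083, 0.030))   # roughly OR 0.92 (0.87-0.98)
```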
Environmental Health Perspectives | 2017
Evan R. Kuras; Molly B. Richardson; Miriam Calkins; Kristie L. Ebi; Jeremy J. Hess; Kristina W. Kintziger; Meredith Jagger; Ariane Middel; Anna A. Scott; June T. Spector; Christopher K. Uejio; Jennifer K. Vanos; Benjamin F. Zaitchik; Julia M. Gohlke; David M. Hondula
Background: Environmental heat exposure is a public health concern. The impacts of environmental heat on mortality and morbidity at the population scale are well documented, but little is known about specific exposures that individuals experience. Objectives: The first objective of this work was to catalyze discussion of the role of personal heat exposure information in research and risk assessment. The second objective was to provide guidance regarding the operationalization of personal heat exposure research methods. Discussion: We define personal heat exposure as realized contact between a person and an indoor or outdoor environment that poses a risk of increases in body core temperature and/or perceived discomfort. Personal heat exposure can be measured directly with wearable monitors or estimated indirectly through the combination of time–activity and meteorological data sets. Complementary information to understand individual-scale drivers of behavior, susceptibility, and health and comfort outcomes can be collected from additional monitors, surveys, interviews, ethnographic approaches, and additional social and health data sets. Personal exposure research can help reveal the extent of exposure misclassification that occurs when individual exposure to heat is estimated using ambient temperature measured at fixed sites and can provide insights for epidemiological risk assessment concerning extreme heat. Conclusions: Personal heat exposure research provides more valid and precise insights into how often people encounter heat conditions and when, where, to whom, and why these encounters occur. Published literature on personal heat exposure is limited to date, but existing studies point to opportunities to inform public health practice regarding extreme heat, particularly where fine-scale precision is needed to reduce health consequences of heat exposure. https://doi.org/10.1289/EHP556
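The indirect estimation approach described above, combining time–activity and meteorological data sets, can be sketched with pandas. All column names, values, and the indoor adjustment below are hypothetical illustrations, not the authors' protocol.

```python
import pandas as pd

# Hypothetical time-activity diary for one participant (self-reported location)
activity = pd.DataFrame({
    "time": pd.to_datetime(["2017-07-01 10:05", "2017-07-01 12:40", "2017-07-01 15:20"]),
    "location": ["outdoors", "indoors", "outdoors"],
})

# Hypothetical hourly record from a nearby fixed weather station
weather = pd.DataFrame({
    "time": pd.date_range("2017-07-01 00:00", periods=24, freq="h"),
    "air_temp_c": [22 + 0.5 * h for h in range(24)],
})

# Assign each diary entry the nearest-in-time station temperature
exposure = pd.merge_asof(activity.sort_values("time"),
                         weather.sort_values("time"),
                         on="time", direction="nearest")

# A crude indoor adjustment (an assumption for illustration only)
exposure["personal_temp_c"] = exposure["air_temp_c"].where(
    exposure["location"] == "outdoors", exposure["air_temp_c"] - 5.0)
print(exposure)
```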
Annals of Occupational and Environmental Medicine | 2014
June T. Spector; Max Lieblich; Stephen Bao; Kevin J. McQuade; Margaret Hughes
Objectives: Existing methods for practically evaluating musculoskeletal exposures such as posture and repetition in workplace settings have limitations. We aimed to automate the estimation of parameters in the revised United States National Institute for Occupational Safety and Health (NIOSH) lifting equation, a standard manual observational tool used to evaluate back injury risk related to lifting in workplace settings, using depth camera (Microsoft Kinect) and skeleton algorithm technology. Methods: A large dataset (approximately 22,000 frames, derived from six subjects) of simultaneous lifting and other motions recorded in a laboratory setting using the Kinect (Microsoft Corporation, Redmond, Washington, United States) and a standard optical motion capture system (Qualysis, Qualysis Motion Capture Systems, Qualysis AB, Sweden) was assembled. Error-correction regression models were developed to improve the accuracy of NIOSH lifting equation parameters estimated from the Kinect skeleton. Kinect-Qualysis errors were modelled using gradient boosted regression trees with a Huber loss function. Models were trained on data from all but one subject and tested on the excluded subject. Finally, models were tested on three lifting trials performed by subjects not involved in the generation of the model-building dataset. Results: Error-correction appears to produce estimates for NIOSH lifting equation parameters that are more accurate than those derived from the Microsoft Kinect algorithm alone. Our error-correction models substantially decreased the variance of parameter errors. In general, the Kinect underestimated parameters, and modelling reduced this bias, particularly for more biased estimates. Use of the raw Kinect skeleton model tended to result in falsely high safe recommended weight limits of loads, whereas error-corrected models gave more conservative, protective estimates. Conclusions: Our results suggest that it may be possible to produce reasonable estimates of posture and temporal elements of tasks such as task frequency in an automated fashion, although these findings should be confirmed in a larger study. Further work is needed to incorporate force assessments and address workplace feasibility challenges. We anticipate that this approach could ultimately be used to perform large-scale musculoskeletal exposure assessment not only for research but also to provide real-time feedback to workers and employers during work method improvement activities and employee training.
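For orientation, the revised NIOSH lifting equation referenced above computes a Recommended Weight Limit (RWL) as a product of multipliers, and the error-correction idea can be imitated with scikit-learn's Huber-loss gradient boosting. The sketch below uses simulated values and is not the authors' trained model.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def niosh_rwl(h_cm, v_cm, d_cm, a_deg, fm, cm):
    """Recommended Weight Limit (kg) from the revised NIOSH lifting equation (metric).

    h_cm: horizontal distance of the load from the body (cm)
    v_cm: vertical height of the hands at lift origin (cm)
    d_cm: vertical travel distance (cm)
    a_deg: asymmetry angle (degrees)
    fm, cm: frequency and coupling multipliers, taken from the NIOSH tables
    """
    lc = 23.0                             # load constant, kg
    hm = 25.0 / max(h_cm, 25.0)           # horizontal multiplier
    vm = 1.0 - 0.003 * abs(v_cm - 75.0)   # vertical multiplier
    dm = 0.82 + 4.5 / max(d_cm, 25.0)     # distance multiplier
    am = 1.0 - 0.0032 * a_deg             # asymmetry multiplier
    return lc * hm * vm * dm * am * fm * cm

# Hypothetical error-correction step: learn the reference (motion-capture)
# horizontal distance from a biased, noisy depth-camera estimate with a Huber-loss GBM.
rng = np.random.default_rng(0)
h_true = rng.uniform(25, 63, 500)                    # simulated reference values, cm
h_camera = h_true * 0.9 + rng.normal(0, 2, 500)      # simulated underestimated values
model = GradientBoostingRegressor(loss="huber", n_estimators=200)
model.fit(h_camera.reshape(-1, 1), h_true)
h_corrected = model.predict(np.array([[40.0]]))[0]

print(niosh_rwl(h_corrected, v_cm=70, d_cm=40, a_deg=0, fm=0.85, cm=1.0))
```

Because the horizontal multiplier shrinks as the corrected distance grows, correcting an underestimated horizontal distance lowers the RWL, which is consistent with the abstract's observation that error-corrected models give more conservative weight limits.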