Sean R. Lynch
Eastern Virginia Medical School
Publications
Featured research published by Sean R. Lynch.
The American Journal of Clinical Nutrition | 2009
Mary E. Cogswell; Anne C. Looker; Christine M. Pfeiffer; James D. Cook; David A. Lacher; John L. Beard; Sean R. Lynch; Laurence M. Grummer-Strawn
BACKGROUND A new index to determine body iron promises a simpler approach to monitoring iron deficiency (ID) prevalence. OBJECTIVE Our objective was to compare ID defined as body iron <0 mg/kg and calculated from the log ratio of transferrin receptor to ferritin (the body iron model) to ID defined as ≥2 of 3 abnormal concentrations in ferritin, transferrin saturation, or erythrocyte protoporphyrin (the ferritin model). DESIGN We used measures of iron status and inflammation from 486 children aged 1-2 y, 848 children aged 3-5 y, and 3742 nonpregnant females aged 12-49 y from the National Health and Nutrition Examination Survey 2003-2006. RESULTS ID prevalences (±SE) based on the body iron model in children (1-2 and 3-5 y) and in females (12-19 and 20-49 y) were 14.4 ± 1.9%, 3.7 ± 0.8%, 9.3 ± 1.0%, and 9.2 ± 1.6%, respectively. ID prevalences based on the ferritin model in children (3-5 y) and females (12-19 and 20-49 y) were 4.5 ± 0.9%, 15.6 ± 1.2%, and 15.7 ± 0.8%, respectively. The kappa statistics for agreement between the 2 models were 0.5-0.7. Among females (12-49 y) the positive predictive values of ID based on the body iron model and the ferritin model for identifying anemia were 43 ± 3% and 30 ± 2%, respectively, whereas negative predictive values did not differ. C-reactive protein was elevated in 28.8 ± 3.1% of females with ID by the ferritin model but not by the body iron model and in 0% of persons with ID by the body iron model but not by the ferritin model. CONCLUSIONS The agreement between the 2 indexes was fair to good. Among females, the body iron model produced lower estimates of ID prevalence, better predicted anemia, and appeared to be less affected by inflammation than the ferritin model.
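For illustration, the body iron model's calculation can be sketched as follows. The constants are those of the Cook body iron formula as commonly published (an assumption; the abstract itself does not quote them), with soluble transferrin receptor (sTfR) in mg/L and serum ferritin in µg/L:

```python
import math

def body_iron_mg_per_kg(stfr_mg_l: float, ferritin_ug_l: float) -> float:
    """Body iron (mg/kg) from the log ratio of sTfR to serum ferritin.

    Constants (2.8229, 0.1207) are assumed from the published Cook
    formula; sTfR is converted from mg/L to ug/L so both terms of the
    ratio share units.
    """
    ratio = (stfr_mg_l * 1000.0) / ferritin_ug_l
    return -(math.log10(ratio) - 2.8229) / 0.1207

def is_iron_deficient(stfr_mg_l: float, ferritin_ug_l: float) -> bool:
    """ID under the body iron model: body iron < 0 mg/kg."""
    return body_iron_mg_per_kg(stfr_mg_l, ferritin_ug_l) < 0
```

A high sTfR relative to ferritin (e.g. 10 mg/L against 10 µg/L) yields a negative body iron and is classified as deficient; a low ratio yields a positive tissue-iron store and is not.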
Annals of the New York Academy of Sciences | 1980
Sean R. Lynch; James D. Cook
Food iron is absorbed by the intestinal mucosa from two separate pools of heme and nonheme iron. Heme iron, derived from hemoglobin and myoglobin, is well absorbed and relatively little affected by other foods eaten in the same meal. On the other hand, the absorption of nonheme iron, the major dietary pool, is greatly influenced by meal composition. Ascorbic acid is a powerful enhancer of nonheme iron absorption and can reverse the inhibiting effect of such substances as tea and calcium/phosphate. Its influence may be less pronounced in meals of high iron availability, such as those containing meat, fish, or poultry. The enhancement of iron absorption from vegetable meals is directly proportional to the quantity of ascorbic acid present. The absorption of soluble inorganic iron added to a meal increases in parallel with the absorption of nonheme iron, but ascorbic acid has a much smaller effect on insoluble iron compounds, such as ferric oxide or ferric hydroxide, which are common food contaminants. Ascorbic acid facilitates iron absorption by forming a chelate with ferric iron at acid pH that remains soluble at the alkaline pH of the duodenum. High cost and instability during food storage are the major obstacles to using ascorbic acid in programs designed to combat nutritional iron deficiency anemia.
Food and Nutrition Bulletin | 2010
Richard F. Hurrell; Peter Ranum; Saskia de Pee; Ralf Biebinger; Lena Hulthen; Quentin Johnson; Sean R. Lynch
Background Iron fortification of wheat flour is widely used as a strategy to combat iron deficiency. Objective To review recent efficacy studies and update the guidelines for the iron fortification of wheat flour. Methods Efficacy studies with a variety of iron-fortified foods were reviewed to determine the minimum daily amounts of additional iron that have been shown to meaningfully improve iron status in children, adolescents, and women of reproductive age. Recommendations were computed by determining the fortification levels needed to provide these additional quantities of iron each day in three different wheat flour consumption patterns. Current wheat flour iron fortification programs in 78 countries were evaluated. Results When average daily consumption of low-extraction (≤ 0.8% ash) wheat flour is 150 to 300 g, it is recommended to add 20 ppm iron as NaFeEDTA, or 30 ppm as dried ferrous sulfate or ferrous fumarate. If sensory changes or cost limits the use of these compounds, electrolytic iron at 60 ppm is the second choice. Corresponding fortification levels were calculated for wheat flour intakes of < 150 g/day and > 300 g/day. Electrolytic iron is not recommended for flour intakes of < 150 g/day. Encapsulated ferrous sulfate or fumarate can be added at the same concentrations as the non-encapsulated compounds. For high-extraction wheat flour (> 0.8% ash), NaFeEDTA is the only iron compound recommended. Only nine national programs (Argentina, Chile, Egypt, Iran, Jordan, Lebanon, Syria, Turkmenistan, and Uruguay) were judged likely to have a significant positive impact on iron status if coverage is optimized. Most countries use non-recommended, low-bioavailability, atomized, reduced or hydrogen-reduced iron powders. Conclusion Most current iron fortification programs are likely to be ineffective. Legislation needs updating in many countries so that flour is fortified with adequate levels of the recommended iron compounds.
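The ppm recommendations above translate into added daily iron by a simple product, since ppm here means mg of iron per kg of flour. A minimal sketch (the function name is illustrative; the ppm and intake figures in the examples are the ones quoted in the abstract):

```python
def added_iron_mg_per_day(fortification_ppm: float, flour_g_per_day: float) -> float:
    """Iron delivered per day by a fortified flour.

    ppm = mg iron per kg flour, so mg/day = ppm * (g flour / 1000).
    """
    return fortification_ppm * flour_g_per_day / 1000.0

# 30 ppm ferrous sulfate at 200 g flour/day -> 6 mg additional iron/day
mid_intake = added_iron_mg_per_day(30, 200)

# 20 ppm NaFeEDTA at 300 g flour/day -> also 6 mg/day, reflecting
# NaFeEDTA's higher bioavailability allowing a lower fortification level
edta_intake = added_iron_mg_per_day(20, 300)
```

This makes explicit why the recommended ppm levels fall as consumption rises: the target is a roughly constant daily amount of additional absorbable iron.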
International Journal for Vitamin and Nutrition Research | 2004
Richard F. Hurrell; Sean R. Lynch; T. H. Bothwell; Héctor Cori; Ray Glahn; E. Hertrampf; Zdenek Kratky; Dennis D. Miller; Mario Rodenstein; Hugo Streekstra; Birgit Teucher; Elizabeth Turner; Chi Kong Yeung; Michael B. Zimmermann
Iron deficiency remains a major global health problem affecting an estimated 2 billion people. The World Health Organization ranked it as the seventh most important preventable risk for disease, disability, and death in 2002. Since an important factor in its causation is the poor bioavailability of iron in the cereal-based diets of many developing countries, SUSTAIN set up a Task Force, consisting of nutritional, medical, industry, and government experts to consider strategies for enhancing the absorption of fortification iron. This paper summarizes the findings of this Task Force. Detailed reviews of each strategy follow this overview. Highly soluble compounds of iron like ferrous sulfate are desirable food fortificants but cannot be used in many food vehicles because of sensory issues. Thus, potentially less well-absorbed forms of iron commonly are used in food fortification. The bioavailability of iron fortificants can, however, be enhanced with innovative ingredient technologies. Ascorbic acid, NaFeEDTA, ferrous bisglycinate, and dephytinization all enhance the absorption of fortification iron, but add to the overall costs of fortification. While all strategies cannot be recommended for all food fortification vehicles, individual strategies can be recommended for specific foods. For example, the addition of ascorbic acid is appropriate for dry blended foods such as infant foods and other dry products made for reconstitution that are packaged, stored, and prepared in a way that maximizes retention of this vitamin. NaFeEDTA can be recommended for fortification of fish sauce and soy sauce, whereas amino acid chelates may be more useful in milk products and beverages. With further development, dephytinization may be possible for low-cost, cereal-based complementary foods in developing countries. 
Encapsulation of iron salts in lipid coatings, while not an iron absorption-enhancing strategy per se, can prevent soluble forms of iron from interacting undesirably with some food vehicles and hence broaden the application of some fortificants. Research relevant to each of these strategies for enhancing the bioavailability or utility of iron food fortificants is reviewed. Individual strategies are evaluated in terms of enhancing effect and stability, organoleptic qualities, cost, and regulatory issues of interest to the nutrition community, industry, and consumers. Recommendations are made on potential usages and further research needs. Effective fortification depends on the selection of technically feasible and efficacious strategies. Once suitable strategies have been identified, cost becomes very important in selecting the best approach to implement. However, it is essential to calculate cost in relation to the amount of bioavailable iron delivered. An approach to the calculation of cost using a conservative estimate of the enhancing effects of the innovative technologies discussed in the supplement is given in the final section.
Gastroenterology | 1981
Barry S. Skikne; Sean R. Lynch; James D. Cook
Radioiron absorption tests in human volunteers demonstrated a modest but significant 28% reduction in the absorption of dietary nonheme iron from a meal that was preceded by the administration of 300 mg cimetidine. More pronounced decreases of 42% and 65% were observed with 600 and 900 mg cimetidine, respectively. Antacid caused a 52% decrease in iron absorption whereas pentagastrin had no significant effect. Since 300 mg cimetidine reduces gastric acid secretion by 60%-80% but iron absorption by only 28%, it appears that under normal conditions more gastric acid is secreted than is required for optimal iron absorption; absorption falls only when acid secretion is markedly reduced. Cimetidine in the doses currently recommended would not be expected to have a major effect on iron nutrition, although the combination of high doses of cimetidine with antacids would impair nonheme iron absorption significantly.
International Journal for Vitamin and Nutrition Research | 2005
Susan J. Fairweather-Tait; Sean R. Lynch; Christine Hotz; Richard F. Hurrell; Leo Abrahamse; Steve Beebe; Stine B. Bering; Klaus Bukhave; Ray Glahn; Michael Hambidge; Janet R. Hunt; Bo Lönnerdal; Denis R. Miller; Najat Mohktar; Penelope Nestel; Manju B. Reddy; Ann-Sofie Sandberg; Paul Sharp; Birgit Teucher; Trinidad P. Trinidad
A combination of dietary and host-related factors determines iron and zinc absorption, and several in vitro methods have been developed as preliminary screening tools for assessing bioavailability. An expert committee has reviewed evidence for their usefulness and reached a consensus. Dialyzability (with and without simulated digestion) gives some useful information but cannot predict the correct magnitude of response and may sometimes predict the wrong direction of response. Caco-2 cell systems (with and without simulated digestion) have been developed for iron availability, but the magnitude of different effects does not always agree with results obtained in human volunteers, and the data for zinc are too limited to draw conclusions about the validity of the method. Caco-2 methodologies vary significantly between laboratories and require experienced technicians and good quality cell culture facilities to obtain reproducible results. Algorithms can provide semi-quantitative information enabling diets to be classified as high, moderate, or low bioavailability. While in vitro methods can be used to generate ideas and develop hypotheses, they cannot be used alone for important decisions concerning food fortification policy, selection of varieties for plant breeding programs, or for new product development in the food industry. Ultimately human studies are required for such determinations.
Nutrition Reviews | 2002
Richard F. Hurrell; Thomas Bothwell; James D. Cook; Omar Dary; Lena Davidsson; Susan J. Fairweather-Tait; Leif Hallberg; Sean R. Lynch; Jorge L. Rosado; Tomas Walter; Paul Whittaker
Fortification of cereal flours may be a useful public health strategy to combat iron deficiency. Cereal flours that are used shortly after production (e.g., baking flour) can be fortified with soluble iron compounds, such as ferrous sulfate, whereas the majority of flours stored for longer periods are fortified with elemental iron powders to avoid unacceptable sensory changes. Elemental iron powders are less well absorbed than soluble iron compounds, and their absorption varies widely depending on manufacturing method and physicochemical characteristics. Costs vary with powder type, but elemental iron powders are generally less expensive than ferrous sulfate. This review evaluates the usefulness of the different elemental iron powders based on results from in vitro studies, rat assays, human bioavailability studies, and efficacy studies monitoring iron status in human subjects. It concludes that, at the present time, only electrolytic iron powder can be recommended as an iron fortificant. Because it is only approximately half as well absorbed as ferrous sulfate, it should be added to provide double the amount of iron.
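The closing recommendation is a simple bioavailability adjustment: scale the reference ferrous sulfate level by the alternative compound's relative bioavailability (RBV). A minimal sketch (the function name is illustrative; the RBV of 0.5 for electrolytic iron is the abstract's "approximately half"):

```python
def fortificant_level_ppm(ferrous_sulfate_ppm: float, relative_bioavailability: float) -> float:
    """Fortification level needed to match the absorbable iron of a
    ferrous sulfate reference level (RBV of ferrous sulfate = 1.0)."""
    return ferrous_sulfate_ppm / relative_bioavailability

# Electrolytic iron at ~0.5 RBV: a 30 ppm ferrous sulfate target
# becomes 60 ppm, i.e. "double the amount of iron"
electrolytic_level = fortificant_level_ppm(30.0, 0.5)
```

This matches the flour fortification guidelines summarized above, where 30 ppm as ferrous sulfate corresponds to 60 ppm as electrolytic iron.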
Journal of Nutrition | 2011
Sean R. Lynch
The earliest studies of food iron absorption employing biosynthetically incorporated radioisotopes were published in the 1950s. Wheat flour has been fortified with iron in Canada, the United Kingdom, and the United States since the 1940s. However, half a century later, nutritional iron deficiency (ID) is estimated to affect 1.5-2 billion people worldwide. The reasons for the apparently limited impact of health and nutrition policies aimed at reducing the prevalence of ID in developing countries are complex. They include uncertainty about the actual prevalence of ID, particularly in regions where malaria and other infections are endemic, failure of policy makers to recognize the relationships between ID and both impaired productivity and increased morbidity, concerns about safety and the risks to iron-sufficient individuals if mass fortification is introduced, and technical obstacles that make it difficult to add bioavailable iron to the diets of those at greatest risk. It is, however, likely that the next decade will see a marked reduction in the prevalence of ID worldwide. More specific assessment tools are being standardized and applied to population surveys. The importance of preventing ID during critical periods of the life cycle is receiving increased attention. Innovative approaches to the delivery of bioavailable iron have been shown to be efficacious. The importance of integrating strategies to improve iron nutrition with other health measures, and economic and social policies addressing poverty as well as trade and agriculture, are receiving increasing consideration.
Nutrition Research Reviews | 2000
Sean R. Lynch
The experimental and epidemiological evidence demonstrating that Ca inhibits Fe absorption was reviewed, with the objectives of estimating the potential impact of variations in Ca intake on dietary Fe bioavailability and of providing some guidelines for predicting the effects on Fe status of recent recommendations for higher dietary Ca intake. In animal models Ca salts reduced both haem- and non-haem-Fe absorption, the effect being dependent on the amount of Ca administered rather than the Ca:Fe molar ratio; dairy products had a variable effect; factors other than Ca may have been important. In single-meal human absorption studies, both haem- and non-haem-Fe absorption was inhibited by Ca supplements and by dairy products, the effect depending on the simultaneous presence of Ca and Fe in the lumen of the upper small intestine and also occurring when Ca and Fe were given in the fasting state. The quantitative effect, although dose dependent, was modified by the form in which Ca was administered and by other dietary constituents (such as phosphate, phytate and ascorbic acid) known to affect Fe bioavailability. The mechanism by which Ca influences Fe absorption has not been elucidated. The effects of factors that modulate Fe bioavailability are known to be exaggerated in single-meal studies, and measurements based on several meals are more likely to reflect the true nutritional impact. The results of most multiple-meal human studies suggest that Ca supplementation will have only a small effect on Fe absorption unless habitual Ca consumption is very low. Outcome analyses showed that Ca supplements had no effect on Fe status in infants fed Fe-fortified formula, lactating women, adolescent girls, and adult men and women. However, it should be noted that the subjects studied had adequate intakes of bioavailable Fe and, except in one study, had relatively high habitual Ca intakes.
Although cross-sectional analyses in Europe have shown a significant inverse correlation between Ca intake (derived primarily from dairy foods) and Fe stores, the quantitative effect was relatively small. The general conclusion is that dietary Ca supplements are unlikely to have a biologically significant impact on Fe balance in Western societies unless Ca consumption is habitually very low; however, increased consumption of dairy products may have a small negative effect that could be functionally important in pregnancy if Fe supplements are not taken. It is uncertain whether the inverse relationship between consumption of dairy products and Fe status is due entirely to increased Ca intake; substitution of milk proteins for meat may also have negative effects on Fe balance.
The American Journal of Clinical Nutrition | 1982
Sean R. Lynch; Clement A. Finch; Elaine R. Monsen; James D. Cook
Studies of iron nutriture in the elderly are limited and very few include observations on individuals over the age of 75. The two Health and Nutrition Examination Surveys carried out by the United States Department of Health, Education and Welfare demonstrate that the mean iron intake of Americans is adequate until the age of 75. However, with changes in the major food sources there is a decrease in iron derived from meat and a concomitant rise in the proportion supplied by breakfast cereals. Alterations in dietary iron bioavailability that may result from this have not been studied. Physiological data suggest that the elderly do not represent a target population for iron deficiency since iron requirements are no greater than those of adult men and lower than those of children and menstruating women. Furthermore, there is little direct evidence of a high prevalence of iron deficiency in the elderly, but the laboratory measurements that have proved useful in defining iron status in younger people have not been standardized for or extensively used in older people. Anemia is still the most important known consequence of significant iron deficiency. However, the application of Hb or hematocrit standards used in younger people to the elderly as well as the assumption that anemia can be equated with iron deficiency invalidates the conclusions of many surveys. Hb and hematocrit measurements are not suitable screening tests for iron deficiency in the elderly and there is an urgent need for a clearer understanding of the physiological and nutritional factors responsible for lower Hb values in older people, particularly older Blacks.