Saskia de Pee
World Food Programme
Publication
Featured research published by Saskia de Pee.
Journal of Nutrition | 2010
Henk Jan Brinkman; Saskia de Pee; Issa Sanogo; Ludovic Subran; Martin W. Bloem
A global economic and financial crisis is engulfing the developing world, coming on top of high food and fuel prices. This paper assesses the impact of the crises on food consumption, nutrition, and health. Several methods were applied, including risk analysis using the cost of the food basket, assessment surveys, simulations, regression analysis using a food consumption score (FCS), reflecting diet frequency and diversity, and a review of the impact of such dietary changes on nutritional status and health. The cost of the food basket increased in several countries, forcing households to reduce quality and quantity of food consumed. The FCS, which is a measure of diet diversity, is negatively correlated with food prices. Simulations show that energy consumption declined during 2006-2010 in nearly all developing regions, resulting potentially in an additional 457 million people (of 4.5 billion) at risk of being hungry and many more unable to afford the dietary quality required to perform, develop, and grow well. As a result of the crises, large numbers of vulnerable households have reduced the quality and quantity of foods they consume and are at risk of increased malnutrition. Population groups most affected are those with the highest requirements, including young children, pregnant and lactating women, and the chronically ill (particularly people with HIV/AIDS and tuberculosis). Because undernutrition during the first 2 y of life has life-long consequences, even short-term price rises will have long-term effects. Thus, measures to mitigate the impact of the crises are urgently required.
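In practical terms, a food consumption score of this kind is a weighted sum of food-group consumption frequencies over a seven-day recall. A minimal sketch in Python, assuming WFP's standard group weights and cut-offs, which the abstract itself does not list:

```python
# Sketch of a WFP-style food consumption score (FCS): a weighted sum of
# the number of days (0-7) each food group was eaten in the past week.
# The weights and cut-offs below follow WFP's standard methodology and
# are assumptions here; the abstract does not specify them.
FCS_WEIGHTS = {
    "staples": 2.0, "pulses": 3.0, "vegetables": 1.0, "fruit": 1.0,
    "meat_fish": 4.0, "milk": 4.0, "sugar": 0.5, "oil": 0.5,
}

def food_consumption_score(days_consumed):
    """days_consumed: dict mapping food group -> days eaten (0-7)."""
    return sum(FCS_WEIGHTS[g] * min(d, 7) for g, d in days_consumed.items())

household = {"staples": 7, "pulses": 2, "vegetables": 5, "fruit": 1,
             "meat_fish": 1, "milk": 0, "sugar": 2, "oil": 6}
# Commonly used cut-offs (assumed): <= 21 poor, 21.5-35 borderline, > 35 acceptable.
print(food_consumption_score(household))  # 34.0 -> borderline consumption
```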
The Lancet | 2008
Richard D. Semba; Saskia de Pee; Kai Sun; Mayang Sari; Nasima Akhter; Martin W. Bloem
BACKGROUND Child stunting is associated with poor child development and increased mortality. Our aim was to determine the effect of length of maternal and paternal education on stunting in children under the age of 5 years. METHODS Data for indicators of child growth and of parental education and socioeconomic status were gathered from 590,570 families in Indonesia and 395,122 families in Bangladesh as part of major nutritional surveillance programmes. FINDINGS The prevalence of stunting in families in Indonesia was 33.2%, while that in Bangladesh was 50.7%. In Indonesia, greater maternal formal education led to a decrease of between 4.4% and 5% in the odds of child stunting (odds ratio per year 0.950, 95% CI 0.946-0.954 in rural settings; 0.956, 0.950-0.961 in urban settings); greater paternal formal education led to a decrease of 3% in the odds of child stunting (0.970, 0.967-0.974). In Bangladesh, greater maternal formal education led to a 4.6% decrease in the odds of child stunting (0.954, 0.951-0.957), while greater paternal formal education led to a decrease of between 2.9% and 5.4% in the odds of child stunting (0.971, 0.969-0.974 in rural settings; 0.946, 0.941-0.951 in urban settings). In Indonesia, high levels of maternal and paternal education were both associated with protective caregiving behaviours, including vitamin A capsule receipt, complete childhood immunisations, better sanitation, and use of iodised salt (all p<0.0001). INTERPRETATION Both maternal and paternal education are strong determinants of child stunting in families in Indonesia and Bangladesh.
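Because these odds ratios are expressed per year of schooling, they compound multiplicatively over a full course of education. For instance, under the rural Indonesian maternal estimate of 0.950 per year, six years of schooling corresponds to roughly 26% lower odds of stunting; a trivial check:

```python
# Illustrative arithmetic only: compounding a per-year odds ratio from the paper.
or_per_year = 0.950   # odds ratio per year of maternal education, rural Indonesia
years = 6             # e.g., completed primary schooling
print(round(or_per_year ** years, 3))  # 0.735 -> ~26% lower odds of stunting
```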
Food and Nutrition Bulletin | 2009
Saskia de Pee; Martin W. Bloem
Reducing child malnutrition requires nutritious food, breastfeeding, improved hygiene, health services, and (prenatal) care. Poverty and food insecurity seriously constrain the accessibility of nutritious diets that have high protein quality, adequate micronutrient content and bioavailability, macrominerals and essential fatty acids, low antinutrient content, and high nutrient density. Diets based largely on plant sources with few animal-source and fortified foods do not meet these requirements and need to be improved by processing (dehulling, germinating, fermenting), fortification, and adding animal-source foods, e.g., milk, or other specific nutrients. Options include using specially formulated foods (fortified blended foods, commercial infant cereals, or ready-to-use foods [RUFs; pastes, compressed bars, or biscuits]), complementary food supplements (micronutrient powders or powdered complementary food supplements containing micronutrients, protein, amino acids, and/or enzymes), or lipid-based nutrient supplements (120 to 250 kcal/day), typically containing milk powder, high-quality vegetable oil, peanut paste, sugar, and micronutrients. Most supplementary feeding programs for moderately malnourished children supply fortified blended foods, such as corn–soy blend, with oil and sugar, which have shortcomings, including too many antinutrients, no milk (important for growth), suboptimal micronutrient content, high bulk, and high viscosity. Thus, for feeding young or malnourished children, fortified blended foods need to be improved or replaced. Based on the success of ready-to-use therapeutic foods (RUTFs) in treating severe acute malnutrition, modification of those recipes is also being considered. Commodities for reducing child malnutrition should be chosen on the basis of nutritional needs, program circumstances, availability of commodities, and likelihood of impact. Data are urgently required to compare the impact of new or modified commodities with that of current fortified blended foods and of RUTF developed for treating severe acute malnutrition.
Journal of Nutrition | 2010
Andrew L. Thorne-Lyman; Natalie Valpiani; Kai Sun; Richard D. Semba; Christine L. Klotz; Klaus Kraemer; Nasima Akhter; Saskia de Pee; Regina Moench-Pfanner; Mayang Sari; Martin W. Bloem
In Bangladesh, rice prices are known to be positively associated with the prevalence of child underweight and inversely associated with household nongrain food expenditures, an indicator of dietary quality. The collection of reliable data on household expenditures is relatively time consuming and requires extensive training. Simple dietary diversity scores are increasingly used as measures of food security and as proxies for nutrient adequacy. This study examines associations between a simple dietary diversity score and commonly used indicators of socioeconomic status in Bangladesh. Data representative of rural Bangladesh were collected from 188,835 households over 18 rounds of bimonthly data collection from 2003 to 2005. A simple household dietary diversity score was developed by summing the number of days each household consumed an item from each of 7 food groups over a 7-d period. The dietary diversity score was associated with per capita nongrain food expenditures (r = 0.415), total food expenditures (r = 0.327), and total household expenditures (r = 0.332) using Spearman correlations (all P < 0.0001). The frequency of meat and egg consumption showed greater variation across quintiles of total monthly expenditure than other items contributing to the dietary diversity score. After controlling for other measures of socioeconomic status in multiple linear regression models, the dietary diversity score remained significantly associated with monthly per capita food and total expenditures. Low dietary diversity during the period prior to major food price increases indicates potential risk for worsening of micronutrient deficiencies and child malnutrition in Bangladesh.
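As described, the score sums the number of days (0-7) an item from each of 7 food groups was consumed over the recall week, giving a 0-49 range. A minimal sketch under that reading, with illustrative group names (the paper's exact group list is not given in this abstract):

```python
# Household dietary diversity score as described in the abstract: for each
# of 7 food groups, count the days (0-7) an item from that group was eaten
# in the past week, then sum, giving a score from 0 to 49.
# Group names below are illustrative assumptions, not the paper's list.
FOOD_GROUPS = ("rice", "pulses", "vegetables", "fruit", "eggs", "meat_fish", "milk")

def dietary_diversity_score(days_by_group):
    """days_by_group: dict mapping food group -> days consumed (0-7)."""
    return sum(min(max(days_by_group.get(g, 0), 0), 7) for g in FOOD_GROUPS)

print(dietary_diversity_score(
    {"rice": 7, "vegetables": 6, "pulses": 3, "eggs": 1, "meat_fish": 1}))  # 18
```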
Food and Nutrition Bulletin | 2010
Richard F. Hurrell; Peter Ranum; Saskia de Pee; Ralf Biebinger; Lena Hulthen; Quentin Johnson; Sean R. Lynch
Background Iron fortification of wheat flour is widely used as a strategy to combat iron deficiency. Objective To review recent efficacy studies and update the guidelines for the iron fortification of wheat flour. Methods Efficacy studies with a variety of iron-fortified foods were reviewed to determine the minimum daily amounts of additional iron that have been shown to meaningfully improve iron status in children, adolescents, and women of reproductive age. Recommendations were computed by determining the fortification levels needed to provide these additional quantities of iron each day in three different wheat flour consumption patterns. Current wheat flour iron fortification programs in 78 countries were evaluated. Results When average daily consumption of low-extraction (≤ 0.8% ash) wheat flour is 150 to 300 g, it is recommended to add 20 ppm iron as NaFeEDTA, or 30 ppm as dried ferrous sulfate or ferrous fumarate. If sensory changes or cost limits the use of these compounds, electrolytic iron at 60 ppm is the second choice. Corresponding fortification levels were calculated for wheat flour intakes of < 150 g/day and > 300 g/day. Electrolytic iron is not recommended for flour intakes of < 150 g/day. Encapsulated ferrous sulfate or fumarate can be added at the same concentrations as the non-encapsulated compounds. For high-extraction wheat flour (> 0.8% ash), NaFeEDTA is the only iron compound recommended. Only nine national programs (Argentina, Chile, Egypt, Iran, Jordan, Lebanon, Syria, Turkmenistan, and Uruguay) were judged likely to have a significant positive impact on iron status if coverage is optimized. Most countries use non-recommended, low-bioavailability, atomized, reduced or hydrogen-reduced iron powders. Conclusion Most current iron fortification programs are likely to be ineffective. Legislation needs updating in many countries so that flour is fortified with adequate levels of the recommended iron compounds.
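The recommendation for the mid-range consumption pattern lends itself to a simple lookup. A sketch encoding only the levels stated in this abstract; the levels for other intake bands are computed in the paper but not quantified here, so they are deliberately left unimplemented:

```python
# Encodes only the fortification levels this abstract states for
# low-extraction (<= 0.8% ash) wheat flour consumed at 150-300 g/day.
def iron_fortification_ppm(flour_g_per_day, ash_percent):
    if ash_percent > 0.8:
        # High-extraction flour: NaFeEDTA is the only recommended compound,
        # but the abstract does not state the level.
        raise NotImplementedError("Level for high-extraction flour not stated in abstract.")
    if 150 <= flour_g_per_day <= 300:
        return {
            "NaFeEDTA": 20,                            # first choice
            "dried ferrous sulfate or fumarate": 30,   # first choice alternatives
            "electrolytic iron": 60,                   # second choice (cost/sensory issues)
        }
    raise NotImplementedError("Levels for this intake band not stated in abstract.")

print(iron_fortification_ppm(250, 0.7)["NaFeEDTA"])  # 20
```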
Food and Nutrition Bulletin | 2005
Victor N. Bushamuka; Saskia de Pee; Aminuzzaman Talukder; Lynnda Kiess; Dora Panagides; Abu Taher; Martin W. Bloem
This paper assesses the additional benefits of a homestead gardening program designed to control vitamin A deficiency in Bangladesh. In February and March 2002, data were collected on the food security and social status of women from 2,160 households of active and former participants in the gardening program and from control groups in order to assess the impact and sustainability of the program. The proportions of active and former-participant households that gardened year-round were fivefold and threefold, respectively, higher than that of the control group (78% and 50% vs. 15%). In a three-month period, the households of active participants produced a median of 135 kg and consumed a median of 85 kg of vegetables, while the control households produced a median of 46 kg and consumed a median of 38 kg (p < .001). About 64% of the active-participant households generated a median garden income of 347 taka (US$1 = 51 taka), which was spent mainly on food, and 25% of the control households generated 200 taka in the same period (p < .001). The garden production and income levels of formerly participating households three years after withdrawal of program support were much higher than those of the control households, illustrating the sustainability of the program and its ability to increase household food security. Significantly more women in active- and former-participant households than in control households perceived that they had increased their economic contribution to their households since the time the program was launched in their subdistricts (> 85% vs. 52%). Similar results were found for the level of influence gained by women on household decision-making. These results highlight the multiple benefits that homestead gardening programs can bring and demonstrate that these benefits should be considered when selecting nutritional and development approaches targeting poor households.
The American Journal of Clinical Nutrition | 2009
Barbara Troesch; Ines Egli; Christophe Zeder; Richard F. Hurrell; Saskia de Pee; Michael B. Zimmermann
BACKGROUND In-home fortification of complementary foods with micronutrient powders containing low amounts of iron may be potentially safer than powders containing high amounts of iron. However, low iron doses have little nutritional effect, unless iron absorption is high. OBJECTIVE The objective was to maximize iron absorption from a low-iron micronutrient powder for in-home fortification by testing combinations of iron as NaFeEDTA, ascorbic acid, and a microbial phytase active at gut pH. In addition, a recently proposed enhancer of iron absorption, L-alpha-glycerophosphocholine (GPC), was tested. DESIGN In 6 separate iron-absorption studies using a crossover design, women (n = 101) consumed whole-maize porridge fortified with 3 mg stable isotope-labeled FeSO4 or NaFeEDTA with different combinations of enhancers added to the meals at the time of consumption. Incorporation of iron isotopes into erythrocytes 14 d later was measured. RESULTS The addition of phytase when iron was present as either NaFeEDTA or FeSO4, with or without ascorbic acid, significantly increased iron absorption. The combined addition of phytase, ascorbic acid, and NaFeEDTA resulted in an absorption of 7.4%, compared with an absorption of 1.5% from FeSO4 without enhancers in the same meal (P < 0.001). The addition of ascorbic acid did not significantly increase iron absorption from NaFeEDTA, and the addition of calcium did not significantly inhibit iron absorption from NaFeEDTA in the presence of ascorbic acid. The addition of L-alpha-glycerophosphocholine did not significantly increase iron absorption. CONCLUSION Optimization of the micronutrient powder increased iron absorption from a highly inhibitory meal approximately 5-fold. This approach may allow for effective, untargeted in-home fortification of complementary foods with low amounts of highly bioavailable iron.
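In absolute terms, the 3 mg test dose yields roughly 0.22 mg of absorbed iron at 7.4% versus about 0.045 mg at 1.5%, which is where the approximately 5-fold figure comes from. A trivial check:

```python
# Absolute iron absorbed from the 3 mg test dose, using the percentages above.
dose_mg = 3.0
optimized_mg = dose_mg * 0.074   # NaFeEDTA + phytase + ascorbic acid: 7.4%
reference_mg = dose_mg * 0.015   # FeSO4 without enhancers: 1.5%
print(round(optimized_mg, 3), round(reference_mg, 3),
      round(optimized_mg / reference_mg, 2))  # 0.222 mg vs 0.045 mg, ~4.93-fold
```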
Bulletin of The World Health Organization | 2001
Mayang Sari; Saskia de Pee; Elviyanti Martini; Susilowati Herman; Sugiatmi; Martin W. Bloem; Ray Yip
OBJECTIVE To determine the most effective method for analysing haemoglobin concentrations in large surveys in remote areas, and to compare two methods (indirect cyanmethaemoglobin and HemoCue) with the conventional method (direct cyanmethaemoglobin). METHODS Samples of venous and capillary blood from 121 mothers in Indonesia were compared using all three methods. FINDINGS When the indirect cyanmethaemoglobin method was used the prevalence of anaemia was 31-38%. When the direct cyanmethaemoglobin or HemoCue method was used the prevalence was 14-18%. Indirect measurement of cyanmethaemoglobin had the highest coefficient of variation and the largest standard deviation of the difference between the first and second assessment of the same blood sample (10-12 g/l indirect measurement vs 4 g/l direct measurement). In comparison with direct cyanmethaemoglobin measurement of venous blood, HemoCue had the highest sensitivity (82.4%) and specificity (94.2%) when used for venous blood. CONCLUSIONS Where field conditions and local resources allow it, haemoglobin concentration should be assessed with the direct cyanmethaemoglobin method, the gold standard. However, the HemoCue method can be used for surveys involving different laboratories or which are conducted in relatively remote areas. In very hot and humid climates, HemoCue microcuvettes should be discarded if not used within a few days of opening the container containing the cuvettes.
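Sensitivity and specificity here are computed against direct cyanmethaemoglobin on venous blood as the reference. A minimal sketch of that classification comparison, assuming an illustrative anaemia cut-off of 120 g/l (the abstract does not state the cut-off used):

```python
# Sensitivity and specificity of a field method (e.g., HemoCue) against
# the direct cyanmethaemoglobin gold standard for classifying anaemia.
# The 120 g/l cut-off is an assumed example, not taken from the abstract.
def sensitivity_specificity(gold_hb, test_hb, cutoff=120.0):
    tp = fn = tn = fp = 0
    for gold, test in zip(gold_hb, test_hb):
        if gold < cutoff:          # anaemic by gold standard
            if test < cutoff:
                tp += 1
            else:
                fn += 1
        else:                      # non-anaemic by gold standard
            if test < cutoff:
                fp += 1
            else:
                tn += 1
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sensitivity_specificity([115, 108, 131, 140, 119],
                                     [117, 106, 128, 138, 121])
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")  # 0.67, 1.00
```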
Food and Nutrition Bulletin | 2010
Saskia de Pee; Richard D. Semba
Background HIV infection and malnutrition negatively reinforce each other. Objective To review, for program guidance, the evidence on the relationship between HIV infection and malnutrition in adults in resource-limited settings. Results and conclusions Adequate nutritional status supports immunity and physical performance. Weight loss, caused by low dietary intake (loss of appetite, mouth ulcers, food insecurity), malabsorption, and altered metabolism, is common in HIV infection. Regaining weight, particularly muscle mass, requires antiretroviral therapy (ART), treatment of opportunistic infections, consumption of a balanced diet, physical activity, mitigation of side effects, and perhaps appetite stimulants and growth hormone. Correcting nutritional status becomes more difficult as infection progresses. Studies document widespread micronutrient deficiencies among HIV-infected people. However, supplement composition, patient characteristics, and treatments vary widely across intervention studies. Therefore, the World Health Organization (WHO) recommends ensuring intake of 1 Recommended Nutrient Intake (RNI) of each required micronutrient, which may require taking micronutrient supplements. Few studies have assessed the impact of food supplements. Because the mortality risk in patients receiving ART increases with lower body mass index (BMI), improving the BMI seems important. Whether this requires provision of food supplements depends on the patient's diet and food security. It appears that starting ART improves BMI and that ready-to-use fortified spreads and fortified-blended foods further increase BMI (the effect is somewhat less with fortified-blended foods). The studies are too small to assess effects on mortality. Once ART has been established and malnutrition treated, the nutritional quality of the diet remains important, also because of ART's long-term metabolic effects (dyslipidemia, insulin resistance, obesity). Food insecurity should also be addressed if it prevents adequate energy intake and reduces treatment initiation and adherence (due to the opportunity costs of obtaining treatment and mitigating side effects).
Food and Nutrition Bulletin | 2000
Aminuzzaman Talukder; Lynnda Kiess; Nasreen Huq; Saskia de Pee; Ian Darnton-Hill; Martin W. Bloem