Authored by Fikiru Dasa
A healthful diet (adequate, moderate, balanced and varied) is essential for the maintenance of health and the prevention of disease. Deficiencies of vitamin A, iodine, iron and zinc are the key vitamin and mineral deficiencies, affecting more than 2 billion people worldwide, most of them in developing countries. Pregnant and lactating women and young children are most vulnerable to micronutrient malnutrition because of their higher demands. As an intervention, various stakeholders have been using the fortification of food with essential micronutrients as one of the main strategies to address micronutrient deficiencies for more than half a century. However, the use of micronutrients is not without risk. Long-term consumption of micronutrient-fortified foods may result in side effects ranging from mild symptoms to severe toxicity. The objective of this review is to examine the health risks from consumption of micronutrient-fortified foods.
Keywords: Food fortification; Fortified foods; Micronutrients; Toxicity; Health risks
Introduction
For the maintenance of health and disease prevention, adequate vitamin and mineral intakes are essential. More than 2 billion people are deficient in key vitamins and minerals, particularly vitamin A, iodine, iron and zinc [1]. Micronutrient deficiency can affect all age groups, but young children and women of reproductive age tend to be among the most at risk [2,3]. Most of the affected populations live in developing countries, where lower incomes, lack of access to a wider variety of micronutrient-rich and fortified foods, and inadequate health services all contribute to the increased risk and prevalence of micronutrient malnutrition (MNM). Dietary diversification, food fortification, and supplementation are the three food-based approaches aimed at increasing the intake of micronutrients.

Food fortification has a long history of use in industrialized countries for the successful control of deficiencies of vitamins A and D, several B vitamins (thiamine, riboflavin and niacin), iodine and iron. Burgi et al. [4] note that salt iodization was introduced in the early 1920s in both Switzerland and the United States of America and has since expanded progressively worldwide, to the extent that iodized salt is now used in most countries. From the early 1940s onwards, the fortification of cereal products with thiamine, riboflavin and niacin became common practice [5]. Foods for young children were fortified with iron, a practice which has substantially reduced the risk of iron-deficiency anemia in this age group [6]. In more recent years, folic acid fortification of wheat has become widespread in the Americas, a strategy adopted by Canada, the United States and about 20 Latin American countries [7].
Over the past few decades, the fortification of foods and infant formulas with high levels of vitamins and minerals has led to a sharp increase in vitamin intake among infants, children and adults. This has been accompanied by a sharp increase in the prevalence of obesity and related diseases [8]. This suggests that long-term consumption of a diet fortified with micronutrients can result in adverse health effects. In other words, micronutrient fortification is recommended in many regions where mild to moderate deficiency is prevalent; however, there is uncertainty about both the long-term benefit and the safety of micronutrient fortification in such regions, especially where the deficiency is mild.
Thus, this review aims to examine the long-term risks of consuming micronutrient-fortified foods for human health. All relevant evidence from different sources has been reviewed systematically, guided by key questions and by search terms related to the topic.
Micronutrient-Fortified Foods: Definition, Concept and Principles
Food fortification is defined as the deliberate addition of one or more micronutrients to particular foods in order to increase the intake of these micronutrient(s), prevent a demonstrated deficiency and provide a health benefit. The joint WHO/FAO guidelines on food fortification distinguish three approaches to food fortification: mass, targeted, and market-driven [9]. Mass fortification refers to the addition of micronutrients to edible products that are consumed regularly by the general public, such as cereals, oils and vegetable fats, milk, and condiments [9,10]. Targeted fortification refers to the fortification of foods designed for specific population subgroups, such as complementary weaning foods for infants, foods for institutional programs aimed at schoolchildren or preschoolers, and foods used in emergency situations [11]. Market-driven fortification refers to a situation in which a food manufacturer takes the initiative to add one or more micronutrients to processed foods in order to increase sales and profits [9]. Universal fortification refers to the fortification of foods consumed by animals as well as humans, with the iodization of salt as the main example.

Nestel et al. [12,13] describe a household-level approach to food fortification that combines supplementation and fortification for young children and has been referred to as complementary food supplementation. One such approach involves the addition of a commercial micronutrient premix, available in sachets, to small batches of flour during the milling process. The potential for plant breeding/biofortification of staple foods to increase the micronutrient content of various cereals, legumes and tubers certainly exists; for instance, it is possible to select certain cereals (such as rice) and legumes for their high iron content, varieties of carrots and sweet potatoes for their favorable carotene levels, and maize varieties for their low phytate content (which improves the absorption of iron and zinc) [9].
As a matter of principle, the FDA (1980), as cited in Dwyer et al. [14], established its food fortification policy on six basic principles:
• the intake of the nutrient without fortification is below the desirable level for a significant portion of the population;
• the food being fortified is consumed in quantities that make a significant contribution to the population's intake of the nutrient;
• the additional nutrient intake resulting from fortification is unlikely to create an imbalance of essential nutrients;
• the added nutrient is stable under proper conditions of storage and use;
• the nutrient is physiologically available from the food to which it is added;
• there is reasonable assurance that fortification will not result in potentially toxic intakes.
In addition, the FDA has stated that decisions on food fortification should be based primarily on clinical and biochemical data rather than on dietary data alone.
Micronutrient Fortification and Public Health Need
Iron, vitamin A and iodine deficiency are the three most common forms of MNM [12]. Of these, iron deficiency is the most prevalent. It is estimated that over 2 billion people are anemic, under 2 billion have inadequate iodine nutrition [15] and 254 million preschool-aged children are vitamin A deficient. From a public health perspective, MNM is a concern not only because a large number of people are affected, but also because it is a risk factor for many diseases and can contribute to high rates of morbidity and even mortality [9]. Several authors have estimated the magnitude of these common forms of micronutrient malnutrition.

Iron (Fe) deficiency anemia is a worldwide public health problem; its global prevalence is estimated at 24.8%, with the highest prevalence occurring in sub-Saharan Africa and south-central Asia [16]. It can negatively affect immune function and the growth, motor and cognitive development of approximately 40-60% of the developing world's children [1,17]. Fe fortification of foods has been suggested as a cost-effective, long-term, population-based strategy with better compliance to improve Fe status and to prevent Fe deficiency worldwide [9,18]. The prevalence of Fe deficiency in developed countries has declined substantially over the past 15-20 years, owing to the introduction of fortified foods [19] and other public health programmes such as nutritional advice, the promotion of breast-feeding, education and hygiene [20,21].
Iodine is required for normal physical growth during gestation and early life; it is an essential component of the hormones produced by the thyroid gland. When dietary iodine is deficient, synthesis of the thyroid hormones is impaired. The resulting hypothyroidism during pregnancy, infancy and childhood can impair growth and development [15,22-24]. Iodine deficiency is estimated to have lowered the intellectual capacity of most developing countries by 10-15% [25].
Vitamin A is a fat-soluble vitamin that is necessary for multiple functions in the mammalian body; it supports growth, reproduction, vision, and immune function. Vitamin A deficiency (VAD) compromises the immune systems of approximately 40% of the world's under-fives and leads to the deaths of approximately 1 million young children each year [26]. To address the public health burden of VAD, fortification of some staple and processed foods has been implemented in many countries [27].
Health Risks Associated with Consumption of Micronutrient Fortified Foods
The aim of food fortification is to provide a significant level of the nutrient through regular consumption of the food vehicle. Fortification levels also need to take into account variations in food consumption, so that both the safety of those at the higher end of the consumption scale and the impact for those at the lower end are ensured. They should also consider prorated intakes by young children to ensure efficacious and safe dosages [28].

An adverse health effect has been defined as any impairment of a physiologically important function and any change in the morphology, physiology, growth, development or life span of an organism. Vitamins and minerals perform specific and vital functions in a variety of body systems and are crucial for maintaining optimal health [29]; hence their use in food fortification strategies. However, long-term consumption of micronutrient-fortified foods may cause toxicity due to the accumulation of these nutrients in the human body.
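To make the balancing act described above concrete, the following is a minimal sketch of how a candidate fortification level might be screened so that low consumers of the food vehicle reach a target intake while high-consuming adults and young children stay below a tolerable upper intake level (UL). All figures and thresholds here are hypothetical placeholders for illustration, not values from the WHO/FAO guidelines or any cited source:

```python
# Minimal sketch: screening a candidate fortification level for impact and safety.
# All numbers are hypothetical placeholders, not values from the cited guidelines.

TARGET_INTAKE_MG = 6.0   # desired additional daily intake for low consumers (mg/day)
ADULT_UL_MG = 45.0       # assumed tolerable upper intake level, adults (mg/day)
CHILD_UL_MG = 20.0       # assumed tolerable upper intake level, young children (mg/day)

def daily_dose_mg(level_mg_per_kg: float, consumption_g_per_day: float) -> float:
    """Nutrient delivered by the fortified vehicle at a given consumption level."""
    return level_mg_per_kg * consumption_g_per_day / 1000.0

def screen_level(level_mg_per_kg: float,
                 low_adult_g: float, high_adult_g: float, high_child_g: float) -> bool:
    """True if low adult consumers reach the target while no group exceeds its UL."""
    reaches_target = daily_dose_mg(level_mg_per_kg, low_adult_g) >= TARGET_INTAKE_MG
    adults_safe = daily_dose_mg(level_mg_per_kg, high_adult_g) <= ADULT_UL_MG
    children_safe = daily_dose_mg(level_mg_per_kg, high_child_g) <= CHILD_UL_MG
    return reaches_target and adults_safe and children_safe

# Hypothetical flour consumption: 150 g/day (5th percentile, adults),
# 450 g/day (95th percentile, adults), 300 g/day (95th percentile, children).
for level in (20, 40, 60, 80):  # candidate fortification levels, mg per kg of flour
    verdict = "acceptable" if screen_level(level, 150, 450, 300) else "rejected"
    print(f"{level} mg/kg -> {verdict}")
```

Under these assumed figures, 20 mg/kg fails to reach low consumers, while 80 mg/kg pushes high-consuming children over the assumed UL; only the intermediate levels pass. In practice, such screening would use measured consumption distributions and nutrient-specific ULs; the sketch only illustrates why both tails of the consumption distribution, and prorated child intakes, constrain the choice of fortification level.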
As introduced earlier, vitamin A, iodine and iron deficiencies are the most common forms of micronutrient deficiency, and as an intervention people consume these micronutrients in the form of fortified foods and supplements. Governments, food industries, and various stakeholders and international organizations use food fortification to combat micronutrient deficiencies. Before applying a food fortification strategy, it is important to pinpoint and identify the risk factors. So far, however, no studies have been conducted specifically on the long-term health effects of micronutrient-fortified foods and supplementation on human health.
Food fortification may lead to differential exposure to synthetic vitamins. Accumulation of excess vitamins causes side effects ranging from mild (e.g., nausea, vomiting and abdominal pain) to severe toxicity, including increased intracranial pressure [30] and liver abnormalities [31]. In addition, there is evidence that some vitamins (beta-carotene, vitamin A, and vitamin E) appear to increase mortality [32]. Because vitamin A is fat-soluble, an excess of this vitamin is stored in the liver and may result in hypervitaminosis A. From this point of view, it seems unnecessary to take vitamins every day in amounts beyond the estimated average requirements and the recommended dietary allowances.
Similarly, minerals are of critical importance in the diet, even though they comprise only 4-6% of the human body. Each element plays its own role in the structural and functional integrity of living cells and organisms. Iron is a potent pro-oxidant and cannot be actively excreted by humans [33]. Jaeggi et al. [34] showed that providing iron-containing micronutrient powders to weaning infants adversely affects the gut microbiome, increasing pathogen abundance and causing intestinal inflammation. Because iron is not easily eliminated from the body, attention has been paid to circumstances in which excess iron may be absorbed or used inappropriately. Iannotti et al. [35] noted that an overabundance of iron may catalyze the generation of hydroxyl radicals through the Fenton reaction (Fe2+ + H2O2 → Fe3+ + OH− + •OH). They also reported that chronic iron overload has been studied in the context of hemochromatosis, and that these studies may provide insight into the mechanisms and clinical manifestations of excess iron. Several authors have reported negative effects of iron supplements on growth in iron-replete young children [36,37], while the addition of galacto-oligosaccharides has been shown to mitigate most of the adverse effects of iron on the gut microbiome and morbidity in African infants [38]. However, no studies have been conducted on the toxicity of iron in relation to fortified-food consumption.
Excessive intake of iodine may be associated with complications such as iodine-induced hyperthyroidism in some cases or hypothyroidism in others. Data indicate a small increase in the risk for iodine-induced hyperthyroidism with increasing iodine intakes in older adults, mainly those with pre-existing nodular goiter [39,40].
Conclusion
Food fortification is one of the most cost-effective nutrition interventions for tackling hidden hunger on a large scale. It is evident that nutritional fortification of foods has been very effective in the past in eliminating widespread nutritional deficiencies. Consumers are seeking foods with health benefits in an era of complex diseases. It must be determined how to provide nutrients to those who need the intervention while avoiding imbalanced or excessive intakes in other groups. This is a risk-benefit calculation that depends on the distribution of nutritional requirements and of susceptibility to toxicity, neither of which has been determined for many nutrients. As recommendations:
• Research is needed to identify the effects of micronutrient-fortified foods on human health.
• Researchers and experts should identify the risk factors for a deficiency before turning to a food fortification program.
• The micronutrient status of at-risk groups should be evaluated periodically once fortification starts.
• The fortification program should be time-bound.
To read more about this article, visit the open access Journal of Nutrition & Food Science:
https://irispublishers.com/gjnfs/fulltext/health-risks-from-long-term-consumption-of-micronutrient-fortified-foods.ID.000513.php