Made by DATEXIS (Data Science and Text-based Information Systems) at Beuth University of Applied Sciences Berlin
Deep Learning Technology: Sebastian Arnold, Betty van Aken, Paul Grundmann, Felix A. Gers and Alexander Löser. Learning Contextualized Document Representations for Healthcare Answer Retrieval. The Web Conference 2020 (WWW'20)
Funded by The Federal Ministry for Economic Affairs and Energy; Grant: 01MD19013D, Smart-MD Project, Digital Technologies
Novel zinc biomarkers, such as the erythrocyte LA:DGLA ratio, have shown promise in pre-clinical and clinical trials and are being developed to more accurately detect dietary zinc deficiency.
Global efforts to support national governments in addressing VAD are led by the Global Alliance for Vitamin A (GAVA), which is an informal partnership between A2Z, the Canadian International Development Agency, Helen Keller International, Micronutrient Initiative, UNICEF, USAID, and the World Bank. Joint GAVA activity is coordinated by the Micronutrient Initiative.
Vitamin Angels has committed itself to eradicating childhood blindness due to VAD worldwide by the year 2020. Its Operation 20/20 program was launched in 2007 and covers 18 countries. The program gives children high-dose vitamin A and antiparasitic supplements twice a year for four years, which provides children with enough of the nutrient during their most vulnerable years to prevent them from going blind and suffering from other life-threatening diseases related to VAD.
About 75% of the vitamin A required for supplementation activity by developing countries is supplied by the Micronutrient Initiative with support from the Canadian International Development Agency.
An estimated 1.25 million deaths due to VAD have been averted in 40 countries since 1998.
In 2008, it was estimated that an annual investment of US$60 million in combined vitamin A and zinc supplementation would yield benefits of more than US$1 billion per year, with every dollar spent generating more than US$17 in benefits. The Copenhagen Consensus 2008 ranked these combined interventions as the world's best development investment.
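The arithmetic behind the figures cited above can be checked directly; the numbers below are taken from the text itself, and the variable names are our own:

```python
# Worked check of the Copenhagen Consensus 2008 figures cited above.
investment = 60_000_000   # US$ per year, combined vitamin A + zinc supplementation
benefit_per_dollar = 17   # US$ of benefit generated per dollar spent

total_benefit = investment * benefit_per_dollar
print(f"Implied annual benefit: ${total_benefit:,}")
assert total_benefit > 1_000_000_000  # consistent with "more than US$1 billion per year"
```

At US$17 per dollar, the US$60 million investment implies roughly US$1.02 billion in annual benefits, matching the text's "more than US$1 billion" claim.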
Measurements of a child's growth provide the key information for detecting malnutrition, but weight and height measurements alone can fail to identify kwashiorkor and can lead to an underestimation of the severity of malnutrition in children.
Adverse effects have been documented from vitamin B6 supplements, but never from food sources. Damage to the dorsal root ganglia is documented in human cases of pyridoxine overdose. Although vitamin B6 is water-soluble and is excreted in the urine, doses of pyridoxine in excess of the dietary upper limit (UL) over long periods cause painful and ultimately irreversible neurological problems. The primary symptoms are pain and numbness of the extremities. In severe cases, motor neuropathy may occur, with "slowing of motor conduction velocities, prolonged F wave latencies, and prolonged sensory latencies in both lower extremities", causing difficulty in walking. Sensory neuropathy typically develops at pyridoxine doses in excess of 1,000 mg per day, but adverse effects can occur with much less, so doses over 200 mg are not considered safe. Symptoms among women taking lower doses have been reported.
Existing authorizations and valuations vary considerably worldwide. As noted, the U.S. Institute of Medicine set an adult UL at 100 mg/day. The European Community Scientific Committee on Food defined intakes of 50 mg of vitamin B6 per day as harmful and established a UL of 25 mg/day. The nutrient reference values in Australia and New Zealand recommend an upper limit of 50 mg/day in adults. "The same figure was set for pregnancy and lactation as there is no evidence of teratogenicity at this level. The UL was set based on metabolic body size and growth considerations for all other ages and life stages except infancy. It was not possible to set a UL for infants, so intake is recommended in the form of food, milk or formula." The ULs were set using results of studies involving long-term oral administration of pyridoxine at doses of less than 1 g/day. "A no-observed-adverse-effect level (NOAEL) of 200 mg/day was identified from the studies of Bernstein & Lobitz (1988) and Del Tredici "et al." (1985). These studies involved subjects who had generally been on the supplements for five to six months or less. The study of Dalton and Dalton (1987), however, suggested the symptoms might take substantially longer than this to appear. In this latter retrospective survey, subjects who reported symptoms had been on supplements for 2.9 years, on average. Those reporting no symptoms had taken supplements for 1.9 years."
The assessment of vitamin B6 status is essential, as the clinical signs and symptoms in less severe cases are not specific. The three biochemical tests most widely used are the activation coefficient for the erythrocyte enzyme aspartate aminotransferase, plasma PLP concentrations, and the urinary excretion of vitamin B6 degradation products, specifically urinary PA. Of these, plasma PLP is probably the best single measure, because it reflects tissue stores. Plasma PLP less than 10 nmol/l is indicative of vitamin B6 deficiency. A PLP concentration greater than 20 nmol/l has been chosen as a level of adequacy for establishing Estimated Average Requirements and Recommended Daily Allowances in the USA. Urinary PA is also an indicator of vitamin B6 deficiency; levels of less than 3.0 mmol/day are suggestive of vitamin B6 deficiency.
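The plasma PLP cut-offs above amount to a simple three-way classification. A minimal sketch, using only the thresholds stated in the text (the function name and return labels are our own, and this is an illustration, not clinical guidance):

```python
def assess_b6_status(plasma_plp_nmol_l):
    """Classify vitamin B6 status from plasma PLP concentration (nmol/l),
    per the cut-offs in the text: <10 indicates deficiency, >20 has been
    chosen as the level of adequacy; values in between are indeterminate."""
    if plasma_plp_nmol_l < 10:
        return "deficient"
    if plasma_plp_nmol_l > 20:
        return "adequate"
    return "indeterminate"

print(assess_b6_status(8))   # deficient
print(assess_b6_status(25))  # adequate
print(assess_b6_status(15))  # indeterminate
```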
The classic syndrome of vitamin B6 deficiency is rare, even in developing countries. A handful of cases were seen between 1952 and 1953, particularly in the United States, occurring in a small percentage of infants who were fed a formula lacking pyridoxine.
Treatment of VAD can be undertaken with both oral and injectable forms, generally as vitamin A palmitate.
- As an oral form, the supplementation of vitamin A is effective for lowering the risk of morbidity, especially from severe diarrhea, and for reducing mortality from measles and all causes. Vitamin A supplementation of children under five who are at risk of VAD can reduce all-cause mortality by 23%. Some countries where VAD is a public-health problem address its elimination by distributing vitamin A supplements in capsule form during national immunization days (NIDs) for polio eradication or measles. Additionally, the delivery of vitamin A supplements during integrated child health events such as child health days has helped ensure high coverage of vitamin A supplementation in a large number of least developed countries; such events enable many countries in West and Central Africa to achieve over 80% coverage. According to UNICEF data, in 2013, 65% of children worldwide between the ages of 6 and 59 months were fully protected with two high-dose vitamin A supplements. Vitamin A capsules cost about US$0.02 and are easy to handle; they do not need to be stored in a refrigerator or vaccine carrier. When the correct dosage is given, vitamin A is safe and has no negative effect on seroconversion rates for oral polio or measles vaccines. However, because the benefit of vitamin A supplements is transient, children need them regularly every four to six months. Since NIDs provide only one dose per year, NIDs-linked vitamin A distribution must be complemented by other programs to maintain vitamin A in children. Maternal supplementation benefits both mother and breast-fed infant: high-dose vitamin A supplementation of the lactating mother in the first month postpartum can provide the breast-fed infant with an appropriate amount of vitamin A through breast milk. However, high-dose supplementation of pregnant women should be avoided because it can cause miscarriage and birth defects.
- Food fortification is also useful for improving VAD. A variety of oily and dry forms of the retinol esters retinyl acetate and retinyl palmitate are available for fortifying food with vitamin A. Margarine and oil are ideal food vehicles for vitamin A fortification: they protect vitamin A from oxidation during storage and promote its absorption. Beta-carotene and retinyl acetate or retinyl palmitate are used as forms of vitamin A for fortification of fat-based foods. Fortification of sugar with retinyl palmitate has been used extensively throughout Central America. Cereal flours, milk powder, and liquid milk are also used as food vehicles for vitamin A fortification. Genetic engineering is another method of food fortification, and this has been achieved with golden rice, but opposition to genetically modified foods had prevented its use as of July 2012.
- Dietary diversification can also control VAD. Nonanimal sources of vitamin A, which contain provitamin A carotenoids rather than preformed vitamin A, account for greater than 80% of intake for most individuals in the developing world. Increased consumption of vitamin A-rich foods of animal origin, in addition to fruits and vegetables, has beneficial effects on VAD. Researchers at the U.S. Agricultural Research Service have identified genetic sequences in corn that are associated with higher levels of beta-carotene, the precursor to vitamin A. They found that breeders can cross certain variations of corn to produce a crop with an 18-fold increase in beta-carotene. Such advancements in nutritional plant breeding could one day help reduce the illnesses related to VAD in developing countries.
Zinc deficiency can be classified as acute, as may occur during prolonged inappropriate zinc-free total parenteral nutrition; or chronic, as may occur in dietary deficiency or inadequate absorption.
The diagnostic workup of a suspected iodine deficiency includes signs and symptoms as well as possible risk factors mentioned above. A 24-hour urine iodine collection is a useful medical test, as approximately 90% of ingested iodine is excreted in the urine. For the standardized 24-hour test, a 50 mg iodine load is given first, and 90% of this load is expected to be recovered in the urine of the following 24 hours. Recovery of less than 90% is taken to mean high retention, that is, iodine deficiency. The recovery may, however, be well less than 90% during pregnancy, and an intake of goitrogens can alter the test results.
If a 24-hour urine collection is not practical, a random urine iodine-to-creatinine ratio can alternatively be used. However, the 24-hour test is found to be more reliable.
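The standardized 24-hour test described above reduces to a simple recovery calculation. A minimal sketch using the figures from the text (the 50 mg load and 90% expected recovery are from the text; the measured value, function name, and threshold logic are illustrative assumptions, not a clinical protocol):

```python
def iodine_recovery_fraction(load_mg, recovered_mg):
    """Fraction of an oral iodine load recovered in 24-hour urine."""
    return recovered_mg / load_mg

load = 50.0        # mg iodine load, per the standardized test in the text
recovered = 38.5   # mg recovered in 24-hour urine (hypothetical measurement)

fraction = iodine_recovery_fraction(load, recovered)
# Recovery below ~90% means high retention, i.e. suspected iodine deficiency.
deficiency_suspected = fraction < 0.90
print(f"recovered {fraction:.0%} of load; deficiency suspected: {deficiency_suspected}")
```

Note that, as the text cautions, recovery can fall well below 90% during pregnancy or with goitrogen intake, so a low value is suggestive rather than conclusive.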
A general idea of whether a deficiency exists can be determined through a functional iodine test in the form of an iodine skin test. In this test, the skin is painted with an iodine solution: if the iodine patch disappears quickly, this is taken as a sign of iodine deficiency. However, no accepted norms exist on the expected time interval for the patch to disappear, and in persons with dark skin color the disappearance of the patch may be difficult to assess. If a urine test is taken shortly afterwards, the results may be altered by the iodine absorbed through the skin during the test.
In plants a micronutrient deficiency (or trace mineral deficiency) is a physiological plant disorder which occurs when a micronutrient is deficient in the soil in which a plant grows. Micronutrients are distinguished from macronutrients (nitrogen, phosphorus, sulfur, potassium, calcium and magnesium) by the relatively low quantities needed by the plant.
A number of elements are known to be needed in these small amounts for proper plant growth and development. Nutrient deficiencies in these areas can adversely affect plant growth and development. Some of the best known trace mineral deficiencies include: zinc deficiency, boron deficiency, iron deficiency, and manganese deficiency.
If untreated, pellagra can kill within four or five years. Treatment is with nicotinamide, which has the same vitamin function as niacin and a similar chemical structure, but has lower toxicity. The frequency and amount of nicotinamide administered depends on the degree to which the condition has progressed.
Micronutrient deficiencies affect more than two billion people of all ages in both developing and industrialized countries. They are the cause of some diseases, exacerbate others, and are recognized as having an important impact on worldwide health. Important micronutrients include iodine, iron, zinc, calcium, selenium, fluorine, and vitamins A, C, and several of the B vitamins.
Micronutrient deficiencies are associated with 10% of all children's deaths, and are therefore of special concern to those involved with child welfare. Deficiencies of essential vitamins or minerals such as Vitamin A, iron, and zinc may be caused by long-term shortages of nutritious food or by infections such as intestinal worms. They may also be caused or exacerbated when illnesses (such as diarrhoea or malaria) cause rapid loss of nutrients through feces or vomit.
Measures have been taken to reduce child malnutrition. Studies for the World Bank found that, from 1970 to 2000, the number of malnourished children decreased by 20 percent in developing countries. Trials of iodine supplementation in pregnant women have been shown to reduce offspring deaths during infancy and early childhood by 29 percent. However, universal salt iodization has largely replaced this intervention.
The Progresa program in Mexico combined conditional cash transfers with nutritional education and micronutrient-fortified food supplements; this resulted in a 10 percent reduction in the prevalence of stunting in children 12–36 months old. Milk fortified with zinc and iron reduced the incidence of diarrhea by 18 percent in a study in India.
As always, laboratory values have to be interpreted with the lab's reference values in mind and considering all aspects of the individual clinical situation.
Serum ferritin can be elevated in inflammatory conditions, so a normal serum ferritin may not always exclude iron deficiency; its utility is improved by taking a concurrent C-reactive protein (CRP). The level of serum ferritin that is viewed as "high" depends on the condition: for example, in inflammatory bowel disease the threshold is 100, whereas in chronic heart failure (CHF) it is 200.
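The condition-dependent cut-offs above can be captured in a small lookup. A sketch using only the two thresholds stated in the text (the dictionary keys, function name, and implied µg/L units are our own assumptions; this is an illustration, not diagnostic software):

```python
def ferritin_threshold(condition):
    """Condition-specific serum ferritin cut-off below which iron
    deficiency is still suspected, per the text: inflammatory bowel
    disease (IBD) 100, chronic heart failure (CHF) 200."""
    thresholds = {"IBD": 100, "CHF": 200}
    return thresholds[condition]

# A ferritin of 150 would be flagged as possible iron deficiency
# in CHF but not in IBD.
print(150 < ferritin_threshold("IBD"))  # False
print(150 < ferritin_threshold("CHF"))  # True
```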
Iodine deficiency is treated by ingestion of iodine salts, such as those found in food supplements. Mild cases may be treated by using iodized salt in daily food consumption, by drinking more milk, or by eating egg yolks and saltwater fish. For diets restricted in salt and/or animal products, sea vegetables (kelp, hijiki, dulse, nori (found in sushi)) may be incorporated regularly into the diet as a good source of iodine.
The recommended daily intake of iodine for adult women is 150–300 µg for maintenance of normal thyroid function; for men it is somewhat lower, at 150 µg.
However, too high iodine intake, for example due to overdosage of iodine supplements, can have toxic side effects. It can lead to hyperthyroidism and consequently high blood levels of thyroid hormones (hyperthyroxinemia). In case of extremely high single-dose iodine intake, typically a short-term suppression of thyroid function (Wolff–Chaikoff effect) occurs. Persons with pre-existing thyroid disease, elderly persons, fetuses and neonates, and patients with other risk factors are at a higher risk of experiencing iodine-induced thyroid abnormalities. In particular, in persons with goiter due to iodine deficiency or with altered thyroid function, a form of hyperthyroidism called Jod-Basedow phenomenon can be triggered even at small or single iodine dosages, for example as a side effect of administration of iodine-containing contrast agents. In some cases, excessive iodine contributes to a risk of autoimmune thyroid diseases (Hashimoto's thyroiditis and Graves' disease).
Pellagra can be common in people who obtain most of their food energy from maize, notably in rural South America, where maize is a staple food. If maize is not nixtamalized, it is a poor source of tryptophan as well as niacin. Nixtamalization corrects the niacin deficiency and is a common practice in Native American cultures that grow corn. Following the corn cycle, the symptoms usually appear during spring, increase in the summer due to greater sun exposure, and return the following spring. Indeed, pellagra was once endemic in the poorer states of the U.S. South, such as Mississippi and Alabama, where its cyclical appearance in the spring after meat-heavy winter diets led to it being known as "spring sickness" (particularly when it appeared among more vulnerable children), as well as among the residents of jails and orphanages, as studied by Dr. Joseph Goldberger.
Pellagra is common in Africa, Indonesia, and China. In affluent societies, a majority of patients with clinical pellagra are poor, homeless, alcohol-dependent, or psychiatric patients who refuse food. Pellagra was common among prisoners of Soviet labor camps (the Gulag). In addition, pellagra, as a micronutrient deficiency disease, frequently affects populations of refugees and other displaced people due to their unique, long-term residential circumstances and dependence on food aid. Refugees typically rely on limited sources of niacin provided to them, such as groundnuts; the instability in the nutritional content and distribution of food aid can be the cause of pellagra in displaced populations. In the 2000s, there were outbreaks in countries such as Angola, Zimbabwe and Nepal. In Angola specifically, recent reports show a similar incidence of pellagra since 2002 with clinical pellagra in 0.3% of women and 0.2% of children and niacin deficiency in 29.4% of women and 6% of children related to high untreated corn consumption.
In other countries, such as the Netherlands and Denmark, cases have been reported even with sufficient intake of niacin. In these cases, deficiency might happen not because of poverty or malnutrition but secondary to alcoholism, drug interactions (psychotropic, cytostatic, tuberculostatic or analgesic drugs), HIV, deficiency of other B vitamins, or malabsorption syndromes such as Hartnup disease and carcinoid syndrome.
Hypothermia can occur. To prevent or treat it, the child can be kept warm by covering (including the head) or by direct skin-to-skin contact with the mother or father, then covering both parent and child. Prolonged bathing or prolonged medical exams should be avoided. Warming methods are usually most important at night.
Manufacturers are trying to fortify everyday consumer foods with micronutrients, such as wheat flour for Beladi bread in Egypt and fish sauce in Vietnam, along with the iodization of salt.
For example, flour has been fortified with iron, zinc, folic acid and other B vitamins such as thiamine, riboflavin, niacin and vitamin B12.
Iron is needed for bacterial growth, making its bioavailability an important factor in controlling infection. As a result, blood plasma carries iron tightly bound to transferrin, which cells take up by endocytosis, thus preventing bacteria from accessing it. Between 15 and 20 percent of the protein content in human milk consists of the iron-binding protein lactoferrin; in cow's milk, by comparison, the figure is only 2 percent. As a result, breast-fed babies have fewer infections. Lactoferrin is also concentrated in tears, saliva, and at wounds, where it binds iron to limit bacterial growth. Egg white contains 12% conalbumin, which withholds iron from bacteria that get through the egg shell (for this reason, prior to antibiotics, egg white was used to treat infections).
To reduce bacterial growth, plasma concentrations of iron are lowered in a variety of systemic inflammatory states due to increased production of hepcidin, which is mainly released by the liver in response to increased production of pro-inflammatory cytokines such as interleukin-6. This functional iron deficiency will resolve once the source of inflammation is rectified; if not resolved, however, it can progress to anaemia of chronic inflammation. The underlying inflammation can be caused by fever, inflammatory bowel disease, infections, chronic heart failure (CHF), carcinomas, or surgery.
Reflecting this link between iron bioavailability and bacterial growth, taking oral iron supplements in excess of 200 mg/day causes a relative overabundance of iron that can alter the types of bacteria present within the gut. There have been concerns about administering parenteral iron while bacteremia is present, although this has not been borne out in clinical practice. A moderate iron deficiency, in contrast, can provide protection against acute infection, especially against organisms that reside within hepatocytes and macrophages, such as the agents of malaria and tuberculosis. This is mainly beneficial in regions with a high prevalence of these diseases and where standard treatment is unavailable.
Diagnosis typically is based on physical signs, X-rays, and improvement after treatment.
Iron deficiency can be avoided by choosing appropriate soil for the growing conditions (e.g., avoid growing acid-loving plants on lime soils), or by adding well-rotted manure or compost. If iron-deficit chlorosis is suspected, check the pH of the soil with an appropriate test kit or instrument, taking a soil sample at the surface and at depth. If the pH is over seven, consider soil remediation to lower the pH toward the 6.5–7 range. Remediation includes:
- adding compost, manure, peat or similar organic matter (caution: some retail blends of manure and compost have a pH in the range 7–8 because of added lime, so read the MSDS if available; also beware of herbicide residues in manure, and source manure from a certified organic source);
- applying ammonium sulphate as a nitrogen fertilizer (acidifying, due to decomposition of the ammonium ion to nitrate in the soil and root zone);
- applying elemental sulphur to the soil (it oxidizes over the course of months to produce sulphate/sulphite and lower the pH).
Note: adding acid directly (e.g., sulphuric, hydrochloric or citric acid) is dangerous, as it may mobilize toxic metal ions in the soil that are otherwise bound. Iron can be made available to the plant immediately by the use of iron sulphate or iron chelate compounds. Two common iron chelates are Fe-EDTA and Fe-EDDHA. Iron(II) sulphate and Fe-EDTA are only useful in soil up to pH 7.1, but they can be used as a foliar spray (foliar feeding). Fe-EDDHA is useful up to pH 9 (highly alkaline) but must be applied to the soil, in the evening, to avoid photodegradation. EDTA in the soil may mobilize lead; EDDHA does not appear to.
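The pH rules of thumb above can be expressed as a small decision sketch. This follows only the cut-offs stated in the text (Fe-EDTA and iron sulphate effective in soil up to about pH 7.1, foliar spray otherwise; Fe-EDDHA up to about pH 9); the function name and output strings are our own, and this is a rough guide, not horticultural advice:

```python
def iron_treatment_options(soil_ph):
    """Suggest iron sources for a given soil pH, per the text's guidance."""
    options = []
    if soil_ph <= 7.1:
        options += ["iron sulphate (soil)", "Fe-EDTA (soil)"]
    else:
        # Above pH 7.1 these only work as a foliar spray.
        options += ["iron sulphate (foliar spray)", "Fe-EDTA (foliar spray)"]
    if soil_ph <= 9:
        options.append("Fe-EDDHA (soil, apply in the evening)")
    return options

print(iron_treatment_options(6.5))
print(iron_treatment_options(8.0))
```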
Detecting phosphorus deficiency can take multiple forms. A preliminary method is visual inspection of plants: darker green leaves and purplish or red pigment can indicate a phosphorus deficiency. This method alone can give an unclear diagnosis, however, because other environmental factors can produce similar discoloration. In commercial or well-monitored settings, phosphorus deficiency is diagnosed by scientific testing. Moreover, leaf discoloration appears only under fairly severe phosphorus deficiency, so it is beneficial to planters and farmers to check phosphorus levels scientifically before discoloration occurs.

The most prominent method of checking phosphorus levels is soil testing. The major soil-testing methods are the Bray 1-P, Mehlich 3, and Olsen methods. Each method is viable, but each tends to be more accurate in certain geographical areas. These tests use chemical solutions to extract phosphorus from the soil; the extract is then analyzed by colorimetry to determine the phosphorus concentration. When the phosphorus extract is added in a colorimeter, the solution visibly changes color, and the degree of the color change indicates the phosphorus concentration. To apply this testing method to phosphorus deficiency, the measured concentration is compared to known values; most plants have established and thoroughly tested optimal soil conditions. If the concentration measured from the colorimeter test is significantly lower than the plant's optimal soil level, the plant is likely phosphorus deficient. Soil testing with colorimetric analysis, while widely used, can be subject to diagnostic problems as a result of interference from other compounds and elements present.
Additional phosphorus detection methods such as spectral radiance and inductively coupled plasma spectrometry (ICP) are also implemented with the goal of improving reading accuracy. According to the World Congress of Soil Scientists, the advantages of these light-based measurement methods are their quickness of evaluation, simultaneous measurements of plant nutrients, and their non-destructive testing nature. Although these methods have experimental based evidence, unanimous approval of the methods has not yet been achieved.
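The comparison step described above, measured colorimetric reading versus the crop's known optimum, is straightforward to sketch. The 80% tolerance factor, ppm units, function name, and example values below are illustrative assumptions, not an agronomic standard:

```python
def phosphorus_deficient(measured_ppm, optimal_ppm, tolerance=0.8):
    """Flag likely phosphorus deficiency when the soil-test reading falls
    significantly below the crop's established optimum, as described in
    the text. 'Significantly' is modeled here as below 80% of optimum."""
    return measured_ppm < tolerance * optimal_ppm

print(phosphorus_deficient(12, 25))  # True: well below the optimum
print(phosphorus_deficient(24, 25))  # False: close to the optimum
```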
Scurvy can be prevented by a diet that includes vitamin C-rich foods such as bell peppers (sweet peppers), blackcurrants, broccoli, chili peppers, guava, kiwifruit, and parsley. Other sources rich in vitamin C are fruits such as lemons, oranges, papaya, and strawberries. It is also found in vegetables, such as brussels sprouts, cabbage, potatoes, and spinach. Some fruits and vegetables not high in vitamin C may be pickled in lemon juice, which is high in vitamin C. Though redundant in the presence of a balanced diet, various nutritional supplements are available that provide ascorbic acid well in excess of that required to prevent scurvy.
Some animal products, including liver, muktuk (whale skin), oysters, and parts of the central nervous system, including the adrenal medulla, brain, and spinal cord, contain large amounts of vitamin C, and can even be used to treat scurvy. Fresh meat from animals which make their own vitamin C (which most animals do) contains enough vitamin C to prevent scurvy, and even partly treat it. In some cases (notably French soldiers eating fresh horse meat), it was discovered that meat alone, even partly cooked meat, could alleviate scurvy. Conversely, in other cases, a meat-only diet could cause scurvy.
Scott's 1902 Antarctic expedition used lightly fried seal meat and liver, and complete recovery from incipient scurvy was reported to have taken less than two weeks.
Supplemental zinc can prevent iron absorption, leading to iron deficiency and possible peripheral neuropathy, with loss of sensation in extremities. Zinc and iron should be taken at different times of the day.
Correction and prevention of phosphorus deficiency typically involves increasing the level of available phosphorus in the soil. Planters introduce more phosphorus with bone meal, rock phosphate, manure, and phosphate fertilizers. Introducing these compounds into the soil does not, however, ensure the alleviation of phosphorus deficiency: phosphorus must not only be present in the soil but must also be absorbed by the plant. Uptake is limited by the chemical form in which phosphorus is available in the soil. A large percentage of phosphorus in soil is present in chemical compounds that plants are incapable of absorbing; phosphorus must be present in specific chemical arrangements to be usable as a plant nutrient. The availability of usable phosphorus in soil can be optimized by maintaining the soil within a specified pH range. Soil acidity, measured on the pH scale, partially dictates which chemical arrangements phosphorus forms. Between pH 6 and 7, phosphorus forms the fewest compounds that render the nutrient unusable to plants; in this range the likelihood of phosphorus uptake is increased and the likelihood of phosphorus deficiency is decreased. Another component in the prevention and treatment of phosphorus deficiency is the plant's disposition to absorb nutrients. Plant species, and different plants within the same species, react differently to low levels of phosphorus in soil. Greater expansion of root systems generally correlates with greater nutrient uptake: plants within a species that have larger roots are genetically advantaged and less prone to phosphorus deficiency, and such plants can be cultivated and bred as a long-term phosphorus deficiency prevention method. In conjunction with root size, other genetic root adaptations to low-phosphorus conditions, such as mycorrhizal symbioses, have been found to increase nutrient intake.
These biological adaptations of roots work to maintain the levels of vital nutrients. In larger commercial agriculture settings, selecting plant varieties with these desirable phosphorus-uptake adaptations may be a long-term phosphorus deficiency correction method.
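The pH window for phosphorus availability discussed above reduces to a single range check. A trivial sketch using the pH 6–7 range stated in the text (the function name and inclusive bounds are our own assumptions):

```python
def phosphorus_ph_favorable(soil_ph):
    """Per the text, phosphorus forms the fewest plant-unavailable
    compounds between pH 6 and 7, so uptake is most likely there."""
    return 6.0 <= soil_ph <= 7.0

print(phosphorus_ph_favorable(6.5))  # True
print(phosphorus_ph_favorable(5.2))  # False
```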
Fertilisers like ammonium phosphate, calcium ammonium nitrate, and urea can be supplied. A foliar spray of urea can be a quick method.