The current reference range for acceptable blood lead concentrations in healthy persons without excessive exposure to environmental sources of lead is less than 5 µg/dL for children; for adults it was formerly less than 25 µg/dL. Prior to 2012, the value for children was 10 µg/dL. The current biological exposure index (a level that should not be exceeded) for lead-exposed workers in the U.S. is 30 µg/dL in a random blood specimen.
In 2015, US HHS/CDC/NIOSH designated 5 µg/dL (five micrograms per deciliter) of whole blood, in a venous blood sample, as the reference blood lead level for adults. An elevated BLL is defined as a BLL ≥5 µg/dL. This case definition is used by the ABLES program, the Council of State and Territorial Epidemiologists (CSTE), and CDC’s National Notifiable Diseases Surveillance System (NNDSS). Previously (i.e. from 2009 until November 2015), the case definition for an elevated BLL was a BLL ≥10 µg/dL. The U.S. national BLL geometric mean among adults was 1.2 μg/dL in 2009–2010.
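As a rough, purely illustrative restatement of these case definitions, the sketch below (in Python; the function name and structure are hypothetical and not part of any surveillance program) classifies an adult venous blood lead result against the 5 µg/dL reference value and the pre-2015 10 µg/dL definition quoted above.

```python
# Purely illustrative sketch: classify an adult venous blood lead level (BLL)
# against the reference values described above. The function name is
# hypothetical; the thresholds are the current 5 µg/dL reference value and the
# pre-2015 case definition of 10 µg/dL.

def classify_adult_bll(bll_ug_per_dl: float, pre_2015_definition: bool = False) -> str:
    """Return a coarse category for a venous blood lead level in µg/dL."""
    threshold = 10.0 if pre_2015_definition else 5.0
    if bll_ug_per_dl >= threshold:
        return f"elevated BLL (>= {threshold:g} µg/dL)"
    return f"below the reference value (< {threshold:g} µg/dL)"

print(classify_adult_bll(1.2))  # near the 2009-2010 U.S. adult geometric mean
print(classify_adult_bll(7.5))  # elevated under the current case definition
```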
Blood lead concentrations in poisoning victims have ranged from 30 to more than 80 µg/dL in children exposed to lead paint in older houses, 77–104 µg/dL in persons working with pottery glazes, 90–137 µg/dL in individuals consuming contaminated herbal medicines, 109–139 µg/dL in indoor shooting range instructors, and as high as 330 µg/dL in those drinking fruit juices from glazed earthenware containers.
Most pesticide-related illnesses have signs and symptoms that are similar to common medical conditions, so a complete and detailed environmental and occupational history is essential for correctly diagnosing a pesticide poisoning. A few additional screening questions about the patient's work and home environment, in addition to a typical health questionnaire, can indicate whether there was a potential pesticide poisoning.
If one regularly uses carbamate or organophosphate pesticides, it is important to obtain a baseline cholinesterase test. Cholinesterase is an important enzyme of the nervous system, and these chemical groups kill pests and potentially injure or kill humans by inhibiting cholinesterase. If one has had a baseline test and later suspects a poisoning, one can identify the extent of the problem by comparing the current cholinesterase level with the baseline level.
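The baseline comparison described above is simple arithmetic: the current result is expressed as a percentage depression from the individual's own baseline. A minimal sketch follows; the function name, the example units, and the 25% action threshold are assumptions for illustration only, not clinical guidance.

```python
# Illustrative sketch: express a current cholinesterase activity as a percent
# depression from the worker's own baseline, as described above. Units and the
# 25% action threshold in the example are assumptions for illustration only.

def cholinesterase_depression(baseline: float, current: float) -> float:
    """Percent depression of cholinesterase activity relative to baseline."""
    if baseline <= 0:
        raise ValueError("baseline activity must be positive")
    return 100.0 * (baseline - current) / baseline

depression = cholinesterase_depression(baseline=8.0, current=5.5)
print(f"{depression:.1f}% depression from baseline")
if depression > 25.0:  # hypothetical action threshold, not clinical guidance
    print("marked depression -- evaluate for possible exposure")
```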
A number of measurements exist to assess exposure and early biological effects of organophosphate poisoning. Measurements of OP metabolites in both blood and urine can be used to determine whether a person has been exposed to organophosphates. In blood, cholinesterase activity can be measured: butyrylcholinesterase (BuChE) activity in plasma, neuropathy target esterase (NTE) in lymphocytes, and acetylcholinesterase (AChE) activity in red blood cells. Because AChE and BuChE are the main targets of organophosphates, their measurement is widely used as an indication of OP exposure. The main restriction on this type of diagnosis is that the degree to which AChE or BuChE is inhibited differs from one OP to another; therefore, measurements of metabolites in blood and urine are not specific for a particular OP. However, for fast initial screening, determining AChE and BuChE activity in the blood is the most widely used procedure for confirming a diagnosis of OP poisoning. The most widely used portable testing device is the Test-mate ChE field test, which can determine red blood cell (RBC) AChE and plasma (pseudo) cholinesterase (PChE) levels in the blood in about four minutes. This test has been shown to be just as effective as a regular laboratory test, and because of this, the portable ChE field test is frequently used by people who work with pesticides on a daily basis.
Diagnosis of elemental or inorganic mercury poisoning involves determining the history of exposure, physical findings, and an elevated body burden of mercury. Although whole-blood mercury concentrations are typically less than 6 μg/L, diets rich in fish can result in blood mercury concentrations higher than 200 μg/L; blood levels are therefore of limited use in suspected cases of elemental or inorganic poisoning because of mercury's short half-life in the blood. If the exposure is chronic, urine levels can be obtained; 24-hour collections are more reliable than spot collections. It is difficult or impossible to interpret urine samples of patients undergoing chelation therapy, as the therapy itself increases mercury levels in the samples.
Diagnosis of organic mercury poisoning differs in that whole-blood or hair analysis is more reliable than urinary mercury levels.
Mercury thermometers and mercury light bulbs are not as common as they used to be, and the amount of mercury they contain is unlikely to be a health concern if handled carefully. However, broken items still require careful cleanup, as mercury can be hard to collect and it is easy to accidentally create a much larger exposure problem.
Diagnosis includes determining the clinical signs and the medical history, with inquiry into possible routes of exposure. Clinical toxicologists, medical specialists in the area of poisoning, may be involved in diagnosis and treatment.
The main tool in diagnosing and assessing the severity of lead poisoning is laboratory analysis of the blood lead level (BLL).
Blood film examination may reveal basophilic stippling of red blood cells (dots in red blood cells visible through a microscope), as well as the changes normally associated with iron-deficiency anemia (microcytosis and hypochromasia). However, basophilic stippling is also seen in unrelated conditions, such as megaloblastic anemia caused by vitamin B12 (cobalamin) and folate deficiencies.
Exposure to lead also can be evaluated by measuring erythrocyte protoporphyrin (EP) in blood samples. EP is a part of red blood cells known to increase when the amount of lead in the blood is high, with a delay of a few weeks. Thus EP levels in conjunction with blood lead levels can suggest the time period of exposure; if blood lead levels are high but EP is still normal, this finding suggests exposure was recent. However, the EP level alone is not sensitive enough to identify elevated blood lead levels below about 35 μg/dL. Due to this higher threshold for detection and the fact that EP levels also increase in iron deficiency, use of this method for detecting lead exposure has decreased.
Blood lead levels are an indicator mainly of recent or current lead exposure, not of total body burden. Lead in bones can be measured noninvasively by X-ray fluorescence; this may be the best measure of cumulative exposure and total body burden. However, this method is not widely available and is mainly used for research rather than routine diagnosis. Another radiographic sign of elevated lead levels is the presence of radiodense lines called lead lines at the metaphysis in the long bones of growing children, especially around the knees. These lead lines, caused by increased calcification due to disrupted metabolism in the growing bones, become wider as the duration of lead exposure increases. X-rays may also reveal lead-containing foreign materials such as paint chips in the gastrointestinal tract.
Fecal lead content that is measured over the course of a few days may also be an accurate way to estimate the overall amount of childhood lead intake. This form of measurement may serve as a useful way to see the extent of oral lead exposure from all the diet and environmental sources of lead.
Lead poisoning shares symptoms with other conditions and may be easily missed. Conditions that present similarly and must be ruled out in diagnosing lead poisoning include carpal tunnel syndrome, Guillain–Barré syndrome, renal colic, appendicitis, encephalitis in adults, and viral gastroenteritis in children. Other differential diagnoses in children include constipation, abdominal colic, iron deficiency, subdural hematoma, neoplasms of the central nervous system, emotional and behavior disorders, and intellectual disability.
Accidental poisonings can be avoided by proper labeling and storage of containers. When handling or applying pesticides, exposure can be significantly reduced by protecting certain parts of the body where the skin shows increased absorption, such as the scrotal region, underarms, face, scalp, and hands. Safety protocols to reduce exposure include the use of personal protective equipment, washing hands and exposed skin during as well as after work, changing clothes between work shifts, and having first aid training and protocols in place for workers.
Personal protective equipment for preventing pesticide exposure includes the use of a respirator, goggles, and protective clothing, all of which have been shown to reduce the risk of developing pesticide-induced diseases when handling pesticides. One study found that the risk of acute pesticide poisoning was reduced by 55% in farmers who adopted extra personal protective measures and were educated about both protective equipment and pesticide exposure risk. Using chemical-resistant gloves has been shown to reduce contamination by 33–86%.
Pesticide exposure cannot be studied in placebo-controlled trials, as this would be unethical. A definitive cause-and-effect relationship therefore cannot be established. Consistent evidence can be, and has been, gathered through other study designs. The precautionary principle is thus frequently used in environmental law, such that absolute proof is not required before efforts to decrease exposure to potential toxins are enacted.
The American Medical Association recommends limiting exposure to pesticides, concluding that the surveillance systems currently in place are inadequate for determining problems related to exposure. Applicator certification and public notification programs are also of unknown value in their ability to prevent adverse outcomes.
Hippuric acid has long been used as an indicator of toluene exposure; however, there appears to be some doubt about its validity. There is significant endogenous production of hippuric acid in humans, which shows inter- and intra-individual variation influenced by factors such as diet, medical treatment, and alcohol consumption. This suggests that hippuric acid may be an unreliable indicator of toluene exposure. It has been suggested that urinary hippuric acid, the traditional marker of toluene exposure, is simply not sensitive enough to separate the exposed from the non-exposed. This has led to the investigation of other metabolites as markers for toluene exposure.
Urinary "o"-cresol may be more reliable for the biomonitoring of toluene exposure because, unlike hippuric acid, "o"-cresol is not found at detectable levels in unexposed subjects. o-Cresol may be a less sensitive marker of toluene exposure than hippuric acid. o-Cresol excretion may be an unreliable method for measuring toluene exposure because o-cresol makes up <1% of total toluene elimination.
Benzylmercapturic acid, a minor metabolite of toluene, is produced from benzaldehyde. In more recent years, studies have suggested urinary benzylmercapturic acid as the best marker for toluene exposure, because it is not detected in non-exposed subjects; it is more sensitive than hippuric acid at low concentrations; it is not affected by eating or drinking; it can detect toluene exposure down to approximately 15 ppm; and it shows a better quantitative relationship with toluene than hippuric acid or o-cresol.
In cases of suspected copper poisoning, penicillamine is the drug of choice, and dimercaprol, a heavy metal chelating agent, is often administered. Vinegar should not be given, as it assists in solubilizing insoluble copper salts. Inflammatory and nervous symptoms are treated on general principles.
There is some evidence that alpha-lipoic acid (ALA) may work as a milder chelator of tissue-bound copper. Alpha lipoic acid is also being researched for chelating other heavy metals, such as mercury.
Diagnosis is primarily anecdotal, that is, it depends upon a good occupational history. Diagnosis of metal fume fever can be easily missed because the complaints are non-specific, resemble a number of other common illnesses, and presentation occurs typically 2–4 hours after the exposure. When respiratory symptoms are prominent, metal fume fever may be confused with acute bronchitis or pneumonia. The diagnosis is based primarily upon a history of exposure to metal oxide fumes. Cain and Fletcher (2010) report a case of metal fume fever that was diagnosed only by taking a full occupational history and by close collaboration between primary and secondary health care personnel.
Physical symptoms vary among persons exposed, depending largely upon the stage in the course of the syndrome during which examination occurs. Patients may present with wheezing or crackles in the lungs. They typically have an increased white blood cell count, and urine, blood plasma, and skin zinc levels may be elevated. Chest X-ray abnormalities may also be present.
An interesting feature of metal fume fever involves rapid adaptation to the development of the syndrome following repeated metal oxide exposure. Workers with a history of recurrent metal fume fever often develop a tolerance to the fumes. This tolerance, however, is transient, and only persists through the work week. After a weekend hiatus, the tolerance has usually disappeared. This phenomenon of tolerance is what led to the name "Monday Fever".
In 2006, approximately 700 metal fume exposures were reported to United States poison control centers. The American Welding Society estimated that 2,500 employees in the US steel industry develop metal fume fever each year and that the majority of cases are not reported.
There are relatively simple tests for radon gas. Radon test kits are commercially available. The short-term radon test kits used for screening purposes are inexpensive, in many cases free. Discounted test kits can be purchased online through The National Radon Program Services at Kansas State University or through state radon offices. Information about local radon zones and specific state contact information can be accessed through the EPA Map at https://www.epa.gov/radon/find-information-about-local-radon-zones-and-state-contact-information. The kit includes a collector that the user hangs in the lowest livable floor of the dwelling for 2 to 7 days. Charcoal canisters are another type of short-term radon test, and are designed to be used for 2 to 4 days. The user then sends the collector to a laboratory for analysis. Both devices are passive, meaning that they do not need power to function.
The accuracy of a residential radon test depends upon the lack of ventilation in the house when the sample is being obtained. The occupants will therefore be instructed not to open windows or otherwise ventilate the house for the duration of the test, usually two days or more.
Long-term kits, taking collections for three months up to one year, are also available. An open-land test kit can test radon emissions from the land before construction begins. A Lucas cell is one type of long-term device; it is also an active device, meaning it requires power to function. Active devices provide continuous monitoring, and some can report on the variation of radon and interference within the testing period. These tests usually require operation by trained testers and are often more expensive than passive testing. The National Radon Proficiency Program (NRPP) provides a list of radon measurement professionals.
Radon levels fluctuate naturally. An initial test might not be an accurate assessment of a home's average radon level. Transient weather can affect short term measurements. Therefore, a high result (over 4 pCi/L) justifies repeating the test before undertaking more expensive abatement projects. Measurements between 4 and 10 pCi/L warrant a long-term radon test. Measurements over 10 pCi/L warrant only another short-term test so that abatement measures are not unduly delayed. Purchasers of real estate are advised to delay or decline a purchase if the seller has not successfully abated radon to 4 pCi/L or less.
Since radon concentrations vary substantially from day to day, single grab-type measurements are generally not very useful, except as a means of identifying a potential problem area, and indicating a need for more sophisticated testing. The EPA recommends that an initial short-term test be performed in a closed building. An initial short-term test of 2 to 90 days allows residents to be informed quickly in case a home contains high levels of radon. Long-term tests provide a better estimate of the average annual radon level.
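A minimal sketch of the follow-up logic described above, assuming only the 4 pCi/L action guideline and the 4–10 pCi/L and over-10 pCi/L decision points quoted earlier; the function is hypothetical and purely illustrative, not EPA guidance.

```python
# Illustrative sketch of the follow-up guidance described above for an initial
# short-term residential radon screening result (pCi/L). The function is
# hypothetical; the 4 and 10 pCi/L decision points are the values given in the text.

def radon_followup(short_term_pci_per_l: float) -> str:
    """Suggest a follow-up step for an initial short-term radon screening result."""
    if short_term_pci_per_l < 4.0:
        return "below the 4 pCi/L action guideline; no immediate follow-up indicated"
    if short_term_pci_per_l <= 10.0:
        return "repeat with a long-term test before committing to abatement"
    return "repeat with another short-term test so abatement is not unduly delayed"

for reading in (2.1, 6.3, 14.8):
    print(reading, "->", radon_followup(reading))
```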
Prevention of metal fume fever in workers who are at risk (such as welders) involves avoidance of direct contact with potentially toxic fumes, improved engineering controls (exhaust ventilation systems), personal protective equipment (respirators), and education of workers regarding the features of the syndrome itself and proactive measures to prevent its development.
In some cases, a product's design may be changed to eliminate the use of risky metals. NiCd rechargeable batteries are being replaced by NiMH batteries, which nevertheless contain other toxic metals such as chromium, vanadium, and cerium. Cadmium is often replaced by other metals: zinc or nickel plating can be used instead of cadmium plating, and brazing filler alloys now rarely contain cadmium.
Many studies have examined the effects of pesticide exposure on the risk of cancer. Associations have been found with leukemia, lymphoma, and cancers of the brain, kidney, breast, prostate, pancreas, liver, lung, and skin. This increased risk occurs with both residential and occupational exposures. Increased rates of cancer have been found among farm workers who apply these chemicals. A mother's occupational exposure to pesticides during pregnancy is associated with an increase in her child's risk of leukemia, Wilms' tumor, and brain cancer. Exposure to insecticides within the home and herbicides outside is associated with blood cancers in children.
OSHA has set safety standards for grinding and sharpening copper and copper alloy tools, which are often used in nonsparking applications. These standards are recorded in the Code of Federal Regulations 29 CFR 1910.134 and 1910.1000.
The most important nonsparking copper alloy is beryllium copper, exposure to which can lead to beryllium poisoning.
Increased concentrations of urinary beta-2 microglobulin can be an early indicator of renal dysfunction in persons chronically exposed to low but excessive levels of environmental cadmium. The urinary beta-2 microglobulin test is an indirect method of measuring cadmium exposure. Under some circumstances, the Occupational Safety and Health Administration requires screening for renal damage in workers with long-term exposure to high levels of cadmium. Blood or urine cadmium concentrations provide a better index of excessive exposure in industrial situations or following acute poisoning, whereas organ tissue (lung, liver, kidney) cadmium concentrations may be useful in fatalities resulting from either acute or chronic poisoning. Cadmium concentrations in healthy persons without excessive cadmium exposure are generally less than 1 μg/L in either blood or urine. The ACGIH biological exposure indices for blood and urine cadmium are 5 μg/L and 5 μg/g creatinine, respectively, in random specimens. Persons who have sustained renal damage due to chronic cadmium exposure often have blood or urine cadmium levels in the range of 25–50 μg/L or 25–75 μg/g creatinine, respectively. These ranges are usually 1000–3000 μg/L and 100–400 μg/g, respectively, in survivors of acute poisoning and may be substantially higher in fatal cases.
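As an illustrative summary of these reference points, the sketch below maps a blood cadmium concentration onto the ranges quoted above; the function name and category wording are assumptions, and the numbers are simply those cited in the text.

```python
# Purely illustrative sketch: map a blood cadmium concentration (µg/L) onto
# the reference ranges quoted in the text. The function name and wording are
# hypothetical; this is not a clinical decision tool.

def interpret_blood_cadmium(cd_ug_per_l: float) -> str:
    """Rough category for a blood cadmium concentration in µg/L."""
    if cd_ug_per_l < 1:
        return "typical of persons without excessive exposure (<1 µg/L)"
    if cd_ug_per_l <= 5:
        return "above background but at or below the ACGIH BEI of 5 µg/L"
    if cd_ug_per_l < 1000:
        return "excessive exposure; chronic renal damage is reported around 25-50 µg/L"
    return "within the range reported for survivors of acute poisoning (1000-3000 µg/L)"

for value in (0.4, 3.0, 40.0, 1500.0):
    print(value, "->", interpret_blood_cadmium(value))
```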
Organophosphate pesticides are one of the top causes of poisoning worldwide, with an annual incidence of poisonings among agricultural workers varying from 3% to 10% per country.
The current mainstays of manganism treatment are levodopa and chelation with EDTA. Both have limited and, at best, transient efficacy. Replenishing the dopamine deficit with levodopa has been shown to initially improve extrapyramidal symptoms, but the response to treatment declines after two or three years, and worsening of the condition has been noted even 10 years after the last exposure to manganese. Enhanced excretion of manganese prompted by chelation therapy brings blood levels down, but the symptoms remain largely unchanged, raising questions about the efficacy of this form of treatment.
Increased ferroportin protein expression in human embryonic kidney (HEK293) cells is associated with decreased intracellular manganese concentration and attenuated cytotoxicity, characterized by the reversal of Mn-reduced glutamate uptake and diminished lactate dehydrogenase (LDH) leakage.
Prevention of Kashin–Beck disease has a long history. Intervention strategies were mostly based on one of the three major theories of its cause.
Selenium supplementation, with or without additional antioxidant therapy (vitamin E and vitamin C), has been reported to be successful, but in other studies no significant decrease could be shown compared to a control group. Major drawbacks of selenium supplementation are logistic difficulties (daily or weekly intake, drug supply), potential toxicity (in the case of less controlled supplementation strategies), associated iodine deficiency (which should be corrected before selenium supplementation to prevent further deterioration of thyroid status), and low compliance. The latter was certainly the case in Tibet, where selenium supplementation was implemented from 1987 to 1994 in areas of high endemicity.
With the mycotoxin theory in mind, baking of grains before storage was proposed in Guangxi province, but results have not been reported in the international literature. Changing the grain source has been reported to be effective in Heilongjiang province and North Korea.
With respect to the role of drinking water, changing water sources to deep-well water has been reported to decrease the X-ray metaphyseal detection rate in different settings.
In general, however, the effect of preventive measures remains controversial, due to methodological problems (no randomised controlled trials), lack of documentation, or, as discussed above, inconsistency of results.
Cadmium is a naturally occurring toxic heavy metal; common sources of exposure include industrial workplaces, plant soils, and smoking. Because the permissible exposure limit for humans is low, overexposure may occur even in situations where only trace quantities of cadmium are found. Cadmium is used extensively in electroplating, although the nature of the operation does not generally lead to overexposure. Cadmium is also found in some industrial paints and may represent a hazard when sprayed; operations involving removal of cadmium paints by scraping or blasting may pose a significant hazard. Cadmium is also present in the manufacturing of some types of batteries. Exposures to cadmium are addressed in specific standards for general industry, shipyard employment, the construction industry, and the agricultural industry.
In epidemiology, environmental diseases are diseases that can be directly attributed to environmental factors (as distinct from genetic factors or infection). Apart from the true monogenic genetic disorders, environmental factors may determine the development of disease in those genetically predisposed to a particular condition. Stress, physical and mental abuse, diet, exposure to toxins, pathogens, radiation, and chemicals found in almost all personal care products and household cleaners are possible causes of a large segment of non-hereditary disease. If a disease process is concluded to be the result of a combination of genetic and environmental influences, its etiological origin can be referred to as having a multifactorial pattern.
There are many different types of environmental disease including:
- Lifestyle diseases such as cardiovascular disease, diseases caused by substance abuse such as alcoholism, and smoking-related diseases
- Disease caused by physical factors in the environment, such as skin cancer caused by excessive exposure to ultraviolet radiation in sunlight
- Disease caused by exposure to toxic or irritant chemicals in the environment such as toxic metals
==Environmental Diseases vs. Pollution-Related Diseases==
Environmental diseases are a direct result of the environment. They include diseases caused by substance abuse, exposure to toxic chemicals, and physical factors in the environment, such as UV radiation from the sun, and may also involve genetic predisposition. Pollution-related diseases, by contrast, are attributed to exposure to toxins in the air, water, and soil. Therefore, all pollution-related diseases are environmental diseases, but not all environmental diseases are pollution-related diseases.
Serious adverse behavioural effects are often associated with chronic occupational exposure and toluene abuse related to the deliberate inhalation of solvents. Long-term toluene exposure is often associated with effects such as: psychoorganic syndrome; visual evoked potential (VEP) abnormality; toxic polyneuropathy, cerebellar, cognitive, and pyramidal dysfunctions; optic atrophy; and brain lesions.
The neurotoxic effects of long-term toluene use (and in particular of repeated withdrawals) may cause postural tremors by upregulating GABA receptors within the cerebellar cortex. Treatment with GABA agonists such as benzodiazepines provides some relief from toluene-induced tremor and ataxia. An alternative to drug treatment is Vim thalamotomy. The tremors associated with toluene misuse do not seem to be a transient symptom, but an irreversible and progressive one that continues after solvent abuse has been discontinued.
There is some evidence that low-level toluene exposure may cause disruption in the differentiation of astrocyte precursor cells. This does not appear to be a major hazard to adults; however, exposure of pregnant women to toluene during critical stages of fetal development could cause serious disruption to neuronal development.
Diagnosis of clinical poisoning is generally made by documenting exposure, identifying the neurologic signs, and analyzing serum for alpha-mannosidase activity and swainsonine.
In mule deer, clinical signs of locoism are similar to chronic wasting disease. Histological signs of vacuolation provide a differential diagnosis.
Sub-clinical intoxication has been investigated in cattle grazing on "Astragalus mollissimus". As the estimated intake of swainsonine increased, blood serum alpha-mannosidase activity and albumin decreased, and alkaline phosphatase and thyroid hormone increased.
When exposure to a carcinogenic substance is suspected, the cause-and-effect relationship in any given case can never be ascertained. Lung cancer occurs spontaneously, and there is no difference between a "natural" cancer and one caused by radon (or smoking). Furthermore, it takes years for a cancer to develop, so determining the past exposure of a case is usually very approximate. The health effect of radon can only be demonstrated through theory and statistical observation.
The study design for epidemiological methods may be of three kinds:
- The best proofs come from observations of cohorts (predetermined populations with known exposures and exhaustive follow-up), such as those on miners, or on Hiroshima and Nagasaki survivors. Such studies are efficient, but very costly when the population needs to be a large one. Such studies can only be used when the effect is strong enough, hence, for high exposures.
- Alternative proofs come from case-control studies (the environmental factors of a "case" population are individually determined and compared to those of a "control" population, to see what the differences might have been, and which factors may be significant), like the ones that have been used to demonstrate the link between lung cancer and smoking. Such studies can identify key factors when the signal-to-noise ratio is strong enough, but are very sensitive to selection bias and prone to the existence of confounding factors.
- Lastly, ecological studies may be used (where the global environment variables and their global effect on two different populations are compared). Such studies are "cheap and dirty": they can be easily conducted on very large populations (the whole USA, in Dr Cohen's study), but are prone to the existence of confounding factors, and exposed to the ecological fallacy problem.
Furthermore, theory and observation must confirm each other for a relationship to be accepted as fully proven. Even when a statistical link between factor and effect appears significant, it must be backed by a theoretical explanation; and a theory is not accepted as factual unless confirmed by observations.