January 2007 - Pro-pharmaceutical and anti-health freedom forces have made significant strides in recent years towards their ultimate aim of removing all higher-dose, innovative and effective nutritional supplements from the global marketplace. With the passing of the European Union’s Food Supplements Directive in 2002, for example, and the adoption by the Codex Alimentarius Commission of its global Guidelines for Vitamin and Mineral Food Supplements in 2005, the stage was potentially set for stringent restrictions on the levels of nutrients contained in supplements to become enforceable on a global scale over the next few years.
The pro-pharmaceutical and anti-health freedom forces are trying to disguise this threat, however, by claiming that the upper safe levels for each nutrient will be calculated scientifically, via a process called “scientific risk assessment”. In reality, most current methodologies for assessing the supposed “risk” of consuming nutritional supplements are anything but scientific, and are deeply flawed.
Moreover, and as this article will show, unless these flaws are corrected, the eventual outcome, for many nutrients, could well be the global enforcement of maximum levels in supplements that are little different to the meagre government recommended daily allowances (RDAs). Should this happen, of course, the ultimate beneficiary would be the pharmaceutical industry and its “business with disease”.
The nutrient group approach versus the nutrient form approach
Arguably the most serious problem with most current nutrient risk assessment methodologies is that they are based upon the assessment of entire nutrient groups (e.g. vitamin D, vitamin E, calcium, zinc or iron), as opposed to individual nutrient forms (e.g. vitamin D3, gamma-tocopherol, calcium hydroxyapatite, zinc sulphate or iron bisglycinate).
Why is this important? Because, in short, the supposed “risks” of consuming a particular vitamin or mineral are highly dependent upon the chemical form in which it is presented.
For example, ferrous sulphate, commonly prescribed by medical doctors to treat anaemia, is generally recognized as the most toxic form of iron, and can potentially cause a number of unpleasant side effects, including gastro-intestinal discomfort and nausea. However, iron bisglycinate, the form widely recommended by clinical nutritionists and nutritional therapists, is a far gentler form of iron, and does not share the unpleasant side effects of its relatively more toxic cousin.
Similarly, magnesium hydroxide can cause diarrhoea in some people, whilst other forms of this mineral, such as magnesium gluconate, are less likely to cause this problem.
Likewise, vitamin C, when taken as ascorbic acid, is well-known for its potential to cause looseness of the bowels when taken in very high doses. However, this side effect does not occur when vitamin C is taken as calcium ascorbate.
Nevertheless, the risk assessment methodology advocated by the World Health Organization (WHO) and the Food and Agriculture Organization of the United Nations (FAO), in a joint report published in 2005, recommends the setting of a single upper safe level for all forms of iron, a single upper safe level for all forms of magnesium, a single upper safe level for all forms of vitamin C, and so on. Under this approach, therefore, the upper safe level for each nutrient is effectively determined by the toxicity profile of whichever member of the nutrient group is shown to be the most toxic in the published, peer-reviewed scientific literature. In other words, the upper safe level derived from ferrous sulphate will also be applied to iron bisglycinate, the upper safe level derived from magnesium hydroxide will also be applied to magnesium gluconate, the upper safe level derived from ascorbic acid will also be applied to calcium ascorbate, and so on.
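To make the practical consequences of this concrete, the short sketch below (written in Python, using entirely hypothetical figures rather than any values from the WHO/FAO report) contrasts the two approaches: the group approach assigns every form of a nutrient the level of its most toxic member, whilst a form-based approach would let each form keep a level reflecting its own toxicity profile.

```python
# Toy illustration only: the per-form upper safe levels (mg/day) below are
# hypothetical placeholders, NOT figures from the WHO/FAO report or any
# regulatory body. They exist solely to show the logic of each approach.
FORM_LEVELS = {
    "iron": {"ferrous sulphate": 20, "iron bisglycinate": 60},
    "magnesium": {"magnesium hydroxide": 250, "magnesium gluconate": 400},
    "vitamin C": {"ascorbic acid": 1000, "calcium ascorbate": 2000},
}

def group_based_level(nutrient: str) -> float:
    """Group approach: one level for the whole nutrient group, set by its
    most toxic member, i.e. the form with the lowest tolerable level."""
    return min(FORM_LEVELS[nutrient].values())

def form_based_level(nutrient: str, form: str) -> float:
    """Form approach: each chemical form keeps its own level."""
    return FORM_LEVELS[nutrient][form]

if __name__ == "__main__":
    for nutrient, forms in FORM_LEVELS.items():
        for form in forms:
            print(f"{form}: group-based {group_based_level(nutrient)} mg/day, "
                  f"form-based {form_based_level(nutrient, form)} mg/day")
```

Run against these hypothetical figures, iron bisglycinate would be capped at the level dictated by ferrous sulphate (20 mg/day rather than 60 mg/day), even though its own toxicity profile would permit the higher level; the same squeeze applies to every gentler form in each group.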