Anorexia nervosa (AN) and bulimia nervosa are two common eating disorders.
AN is characterized by food restriction, an irrational fear of gaining weight, and a distorted body image. Those suffering from AN often view themselves as “too fat” even when they are emaciated. Recent studies suggest the average age of onset has dropped from 15 years to 10 years, and AN is roughly ten times more common in females than in males. Average caloric intake in AN is 500–800 calories per day, though extreme cases of complete self-starvation require intravenous feeding.
AN is a serious mental illness with a high incidence of comorbidity (especially with autism and depression) and a mortality rate comparable to other serious psychiatric disorders. People suffering from AN have extremely high blood levels of ghrelin, the hunger hormone that signals a physiological desire for food. These high ghrelin levels suggest that their bodies are desperately trying to make them hungry, but that hunger signal is being ignored or suppressed.
AN is more prevalent in the upper social classes and is thought to be rare in less-developed countries. It is more common at higher latitudes, and some studies show that emergency admissions for AN are seasonal.
Bulimia nervosa (BN) is an eating disorder characterized by binge eating and purging: consuming a large amount of food in a short time, then attempting to rid oneself of the food consumed, typically by vomiting, taking laxatives, diuretics, or stimulants, and/or exercising excessively, driven by an excessive concern with body weight.
As with most psychiatric disorders, Google searches for eating disorders show a distinct seasonality, peaking in the winter.
Recently, Dr. Karina Allen and colleagues, working under senior author Professor Andrew Whitehouse, all of the University of Western Australia, were the first to report that low maternal 25(OH)D levels during pregnancy are associated with an increased risk of eating disorders in offspring in later adolescence.
The researchers looked at a cohort of 526 Caucasian mothers who had their 25(OH)D levels measured at 18 weeks of pregnancy and whose offspring were studied up to 20 years of age. The authors assessed eating disorder symptoms at ages 14, 17 and 20 years. Core analyses were limited to female offspring (n=308).
At 18 weeks’ gestation, quartile 1 of maternal 25(OH)D ranged from 6–18 ng/ml; quartile 2 from 18–24 ng/ml; quartile 3 from 24–29 ng/ml; and quartile 4 from 29–62 ng/ml.
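The quartile cut-points above can be expressed as a small lookup helper. This is a minimal Python sketch, not anything from the study itself: the function name is ours, and since the quoted ranges share their boundary values, we arbitrarily assign a boundary value to the higher quartile.

```python
def vitamin_d_quartile(level_ng_ml):
    """Return the cohort quartile (1-4) for a maternal 25(OH)D level in ng/ml.

    Cut-points (18, 24, 29 ng/ml) are the quartile boundaries quoted in the
    text; the observed range of the cohort was 6-62 ng/ml.
    """
    if level_ng_ml < 18:
        return 1  # lowest quartile: associated with ~2x eating disorder risk
    elif level_ng_ml < 24:
        return 2
    elif level_ng_ml < 29:
        return 3
    else:
        return 4

print(vitamin_d_quartile(15))  # -> 1
```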
At 20 years of age, eating disorders had been diagnosed in 98 of the 526 offspring studied, a prevalence of 18.6% (98/526) by age 20.
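The prevalence figure follows from simple arithmetic on the cohort counts quoted above:

```python
# Prevalence arithmetic using the cohort figures quoted in the text.
diagnosed = 98   # offspring diagnosed with an eating disorder by age 20
cohort = 526     # total offspring followed
prevalence = diagnosed / cohort
print(f"Prevalence by age 20: {prevalence:.1%}")  # -> 18.6%
```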
Maternal 25(OH)D concentrations in the lowest quartile of 25(OH)D were associated with a significant two-fold increase in eating disorder risk in women, relative to concentrations in the highest quartile of 25(OH)D. Female participants born in spring were also significantly more likely to experience an eating disorder by age 20 years than participants born in winter, but this association did not hold up under multivariate analysis. No relationships were found for males.
The authors concluded:
“This study has provided new data to link low gestational 25(OH)D to increased eating disorder risk in female offspring of Caucasian mothers. This association may account for the season of birth effects observed in eating disorder groups previously. Ongoing research is required to extend our findings and to clarify the role of vitamin D in the pathogenesis of eating disorders. We recommend that our findings are viewed as preliminary, and as a basis for further research in this area.”