Mitochondrial dysfunction and autism

[Image: mitochondria]

Mitochondria are the powerhouses of the cell, as biology teachers will tell you. These organelles are also likely former bacteria, once independently living cells capable of dividing on their own. Indeed, they continue to divide by a kind of binary fission as our cells divide, ensuring that a double dose is available for partitioning into the two new cells that result from cell division.

To achieve these feats, mitochondria have their own DNA, their own proteins, and their own protein-making machinery. That means they also have the potential to undergo genetic mutations that affect the sequence of the proteins their genes encode. Because most mitochondrial proteins are mission-critical and must function exactly right, such mutations rarely persist. But they do happen, and they cause disease. One question that has arisen in the study of the causes of autism is whether such changes might underlie at least a portion of the cases of this developmental difference.

The high-profile Hannah Poling case

Certainly lending a high profile to this question was the case of Hannah Poling, whose mitochondrial disorder appeared to be linked to her autism symptoms and may have interacted with a bolus of vaccine doses she received, followed by a high fever. Fevers can tax our cellular powerhouses, and if mitochondrial function is already compromised, the high temperatures and extra burden may result in chronic negative outcomes.

Poling’s case brought to the forefront the question of whether people with autism might have mitochondrial dysfunction at greater rates. A recent study in the Journal of the American Medical Association (which steadfastly keeps its articles behind a paywall) sought to address that question by measuring markers of mitochondrial dysfunction in children with autism and comparing these endpoints with outcomes in children without autism.

Study specifics: “Full-syndrome autism”

The autistic group in the study had what the researchers called “full-syndrome autism,” which I take to mean intense symptoms of autism. The researchers used the Autism Diagnostic Interview-Revised (ADI-R) and the Autism Diagnostic Observation Schedule (ADOS) to confirm this diagnosis and to ensure as uniform a population in the autistic group as possible. Ultimately, the study included 10 children in this group, recruited consecutively in the clinic based on their fulfillment of the selection criteria. The study was essentially case-control, meaning that the control group consisted of 10 non-autistic children selected to match as closely as possible the demographic characteristics of the autistic group.

The authors report that while only one child among the 10 who were autistic fulfilled the definitive criteria for a mitochondrial respiratory chain disorder, the children with autism were more likely to have indicators of mitochondrial dysfunction.

A problem with pyruvate dehydrogenase (break out your Krebs notes, folks)

Specifically, six of ten showed lower activity than controls on one metabolic parameter, eight of ten showed higher levels on a second, and two of ten showed higher levels on a third. Overall, the results indicated low activity of a mitochondria-specific enzyme, pyruvate dehydrogenase, which catalyzes one of the first steps of carbohydrate metabolism that takes place in the mitochondria. Reduced activity of an enzyme anywhere in this pathway will change the levels of the enzyme’s own product and of products further down the pathway, throwing off mitochondrial function. Further, half of the autistic group exhibited higher levels of mitochondrial DNA replication, an indicator of cellular stress, than controls, and they also had more deletions in their mitochondrial DNA. Statistical analysis suggested that all of these differences were significant.
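
The paper’s actual statistical methods aren’t described in this post, but with only 10 children per group, an exact test such as Fisher’s is the kind of approach that fits data like these. Here is a minimal sketch in Python; the control-group counts are invented purely for illustration.

```python
# Illustrative only: a Fisher's exact test on made-up 10-vs-10 counts,
# e.g., 6 of 10 autistic children vs. 0 of 10 controls abnormal on one endpoint.
from scipy.stats import fisher_exact

table = [[6, 4],    # autistic group: abnormal, normal
         [0, 10]]   # control group:  abnormal, normal (hypothetical)
odds_ratio, p_value = fisher_exact(table)
print(p_value)  # ~0.011 for these invented counts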

What does it mean for autism?

Do these findings mean that all or most people with autism have mitochondrial dysfunction? No. The study results do not support that conclusion. Further, the authors themselves list six limitations of the study. These include the possibility that some findings of statistical significance could be in error because of the small sample size or confounders within the sample, and the fact that some endpoints changed in both directions in the autistic group. In other words, some autistic children had much higher values than controls, while some had lower values, muddying the meaning of the statistics. The authors note that a study like this one does not allow anyone to draw conclusions about a cause-and-effect association between autism and mitochondria, and they urge caution with regard to generalizing the findings to a larger population.

If there is an association, questions arise from that conclusion. Does mitochondrial dysfunction underlie autism, producing autistic-like symptoms, as some argued in the Hannah Poling case? Or, do autistic manifestations such as anxiety or high stress or some other autism-related factor influence the mitochondria?

Chickens, eggs, MRI, mitochondria, autism

As interesting as both of these recent autism-related studies are, we still have the “Which came first?” question to deal with. Did autism cause the brain or mitochondrial differences, or did the brain or mitochondrial differences trigger the autism? Right now, these chicken-and-egg questions may not matter as much as the findings do for helping to identify autism more specifically and for addressing some of its negative aspects. Regardless of your stance on neurodiversity or vaccines or acceptance or cure, or the in-betweens where most of us fall, it would be difficult to argue that a mitochondrial dysfunction shouldn’t be identified and ameliorated, or that an awareness of brain-structure differences won’t lead to useful information about what drives autism behaviors.

——————————————–

Note: More lay-accessible versions of this post and the previous post are available at BlogHer.

Genetic analysis: my results and my reality

A few months ago, it was National DNA Day or something like that, and one of the genetics analysis companies had a sale on their analysis kits, offering a full panel of testing for only $100. Giddy with the excitement of saving almost $1000 on something I’d long been interested in doing, I signed on, ordering one kit each for my husband (a.k.a. “The Viking”) and me. Soon, we found ourselves spending a romantic evening spitting into vials and arguing about whether or not we’d shaken them long enough before packaging them.

The company promised results in six weeks, but they came much faster than that, in about three weeks. Much to my relief, I learned that neither of us carries markers for cystic fibrosis and that I lack either of the two main mutations related to breast cancer. Those basic findings out of the way, things then got more complex and more interesting.

How it works

First, a bit of background. These tests read specific sites in the DNA, looking for very small changes, down to a single nucleotide. If a study has linked a specific mutation to a change in the average risk for a specific disorder or trait, the company notes that. The more data there are supporting that link, the stronger the company indicates the finding is. Thus, four gold stars in its nomenclature means, “This is pretty well supported,” while three or fewer slate stars means, “There are some data for this but not a lot” or “The findings so far are inconsistent.”
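
To make that reporting scheme concrete, here’s a minimal sketch, in Python, of how a report entry could be structured. Everything in it, including the rs numbers, traits, risk figures, and star counts, is invented for illustration and not drawn from any actual report.

```python
# Hypothetical report entries: each ties a genotyped variant to a trait,
# a relative risk from published studies, and a star rating for evidence strength.
from dataclasses import dataclass

@dataclass
class Finding:
    rsid: str             # variant identifier (invented for this example)
    trait: str
    relative_risk: float  # risk relative to the general population
    stars: int            # 4 = well supported; 3 or fewer = preliminary

findings = [
    Finding("rs0000001", "lupus", 2.68, 4),
    Finding("rs0000002", "type 2 diabetes", 0.80, 4),
    Finding("rs0000003", "gallstones", 1.20, 2),
]

# Separate the well-supported findings from the preliminary ones.
well_supported = [f for f in findings if f.stars == 4]
```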

Vikings and Ireland

The Viking is a private person, so I can’t elaborate on his findings here except to say that (a) he is extraordinarily healthy in general and (b) what we thought was a German Y chromosome seems instead to be strictly Irish and associated with an Irish king known as Niall of the Nine Hostages. Why hostages and why nine, I do not know. But it did sort of rearrange our entire perception of his Y chromosome, and those of our three sons, to find this out. For the record, it matches exactly what we learned from participating in the National Geographic Genographic project. I’d ask the Viking if he were feeling a wee bit o’ the leprechaun, but given his somewhat daunting height and still-Viking-ish overall demeanor (thanks to his Scandinavian mother), I’m thinking he doesn’t. Lord of the Dance, he is not.

Markers that indicate an increased risk for me

[Chart: I have an increased risk of…duh]

Looking at the chart to the left (it’s clickable), you can see where I earned myself quite a few four-gold-star findings, but the ones that seem most relevant are those with a 2x or greater increased risk: lupus, celiac disease, and glaucoma. The first two do not surprise me, given my family’s history of autoimmune disorders.

If you focus on a list like this too long, you can start to get a serious case of hypochondria, worrying that you’re gonna get all of these things thanks to those glaring golden stars. But to put it into context: for the lupus–for which my risk is 2.68 times that of a regular gal’s–that still leaves me in a population in which 0.66 persons out of every 100 will develop the disorder. Compare that to the 0.25 out of every 100 in the regular-gal population, and it doesn’t strike me as that daunting.
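
For the curious, the arithmetic behind that comparison is just the baseline risk scaled by the reported relative risk. A quick sketch in Python:

```python
# Back-of-the-envelope check of the lupus numbers quoted above.
baseline_risk = 0.25 / 100          # ~0.25 per 100 in the general population
relative_risk = 2.68                # reported multiplier for my genotype
my_risk = baseline_risk * relative_risk
print(f"{my_risk * 100:.2f} per 100")  # -> 0.67 per 100
```

The small gap between this 0.67 and the 0.66 quoted above presumably comes from rounding in the company’s unrounded inputs.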

Some of those other things on there? Well, let’s just say they’re close. My risk of thyroid cancer might be raised…but I no longer have a thyroid. Hypertension risk is increased–and I have stage 2 hypertension. Gallstones, gout, alcoholism, asthma…based on family history, it’s no surprise to me to see some mixed or clear risk involved with these, although I have none of them. Does that mean that someone else with these increased risks will have related real-life findings? No. It only means being at a bit more risk. It’s like riding a motorcycle vs. driving a car. The former carries more risk of a fatal wreck, but that doesn’t mean you’re absolutely gonna die if you ride one.

Disorders for which my risk is allegedly decreased

[Chart: I have a decreased risk of...]

None of my decreased risk findings are very eye catching in terms of actual drop in risk except for Type II diabetes (now where is my bag of sugar?). As I have been under evaluation for multiple sclerosis and have a family member with it, it’s interesting to see that my risk for it, based on existing studies and known polymorphisms, is decreased. And even though I know that much of this is largely speculative and based on little firm data, it’s still sort of comforting to see “decreased risk” and things like “melanoma” in the same group.

Don’t make my brown eyes blue!

And they didn’t. They nailed the eye color and the other trait-related analyses, such as level of curl to the hair, earwax type, alcohol flush reaction, lactose intolerance (unlikely), and muscle performance (I am not, nor have I ever been, a sprinter). And even though I do not have red hair, they reported that I had a good chance of having it, which is also true given family history. I am not resistant to malaria but am allegedly resistant to norovirus. I wish someone had informed my genes of that in 2003, when I was stricken with a horrible case of it.

Ancestral homeland

Yep. They nailed this one. One hundred percent European mutt. Mitochondria similar to…Jesse James…part of a haplogroup that originated in the Near East about 45,000 years ago, then traveled to Ethiopia and Egypt and from there, presumably, into Europe. It’s a pretty well-traveled haplotype and happens to match exactly the one identified by the National Geographic Genographic project. When it comes to haplotypes, we’re batting a thousand.

In summary

Some of these findings are reliable, such as the absence of the standard breast cancer mutations or the presence of certain mutations related to autoimmune disorders, while other findings are iffy. The company duly notes their iffiness in the reports, along with the associated citations, polymorphisms, and level of risk identified in each study. They don’t promise to tell you that your ancestors lived in a castle 400 years ago or hailed from Ghana. From this company, at any rate, the results are precise and precisely documented, and as I noted, pretty damned accurate. And they’re careful to be as clear as possible about what “increased risk” or “decreased risk” really means.

It’s fascinating to me that a little bit of my spit can be so informative, even down to my eye color, hair curl, and tendency to hypertension, and I’ve noticed that just in the days since we received our results, the company has continually updated them as new data have come in. Would I be so excited had I paid $1100 for this instead of $200? As with any consideration of the changes in risk these analyses identified, that answer would require context. Am I a millionaire? Or just a poor science writer? Perhaps my genes will tell.

The piggish origins of civilization

Follow the pig

Researchers interested in tracing the path of human civilization from its birthplace in the Fertile Crescent to the rest of the world need only follow the path of the pig.

Pig toting

Until this research was reported, scientists agreed that pigs had fallen under our magical domestication powers only twice, about 9,000 years ago: once in what is called the Near East (Turkey) and a second time in what is called the Far East (China). Morphological and some genetic evidence seemed to point to these two events only. That led human geographers to conclude that humans must have toted domesticated pigs from the Near or Far East to other parts of the world, like Europe or Africa, rather than domesticating the wild boars they encountered in every new locale.

Occam’s Razor violation

As it turns out, those ideas—which enjoyed the support even of Charles Darwin—were wrong. And they provide a nice example of a violation of Occam’s Razor, the rule that scientists should select the explanation that requires the fewest assumptions. In the case of the pig, two domestication events certainly required fewer assumptions than the many we now believe to have occurred.

Research published in the journal Science in 2005 identified at least seven occurrences of the domestication of wild boars. Two events occurred in Turkey and China, as previously thought, but the other five took place in Italy, Central Europe, India, southeast Asia, and on islands off southeast Asia, such as Indonesia. Apparently, people arrived in these areas, corralled some wild boars, and ultimately domesticated them, establishing genetic lines that researchers have now traced to the present day.

As usual, molecular biology overrules everything else

The scientists uncovered the pig domestication pattern using modern molecular biology tools. They relied on a genetic tool known as the mitochondrial clock. Mitochondria have their own DNA, which they use as the code for their own, specialized mitochondrial proteins. Because mitochondria are essential to cell function and survival, selection pressure against changes in their coding sequences is strong, and such changes are rare. For this reason, the changes that do persist are usually random changes in noncoding regions, and these accumulate slowly and at a fairly predictable rate over time. This rate of accumulation is the mitochondrial clock, which researchers use to tick off the amount of time that has passed between mutations.

Tick-tock, mitochondrial clock

Very closely related individuals will have almost identical mitochondrial sequences; for example, your mitochondria are probably identical in sequence to your mother’s. You inherited those mitochondria only from your mother, whose egg provided these essential organelles to the zygote that ultimately became you. Were someone to sample the mitochondria of one of your relatives thousands of years from now, they would probably find only a few changes; but if they compared this sample with one from someone unrelated to you, they would find different changes and a different number of changes, indicating less of a relationship.
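
The arithmetic behind the clock is straightforward: differences accumulate along both lineages since their common ancestor, so the observed difference rate gets divided by twice the per-lineage rate. Here is a minimal sketch in Python, with a purely illustrative substitution rate:

```python
# The mitochondrial clock in miniature. The rate below is an assumed,
# illustrative figure, not a measured value for any real organism.
ILLUSTRATIVE_RATE = 2e-8  # substitutions per site per year, per lineage

def estimate_divergence_years(seq_a: str, seq_b: str,
                              rate: float = ILLUSTRATIVE_RATE) -> float:
    """Estimate years since two aligned mitochondrial sequences diverged."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to the same length")
    differences = sum(a != b for a, b in zip(seq_a, seq_b))
    per_site = differences / len(seq_a)
    # Both lineages accumulate changes, hence the factor of 2.
    return per_site / (2 * rate)

# e.g., 4 differences across 1,000 aligned bases:
# 0.004 / (2 * 2e-8) = 100,000 years since the common ancestor.
```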

That’s how the researchers figured out the mystery of the pigs. They sampled wild boars from each of the areas and sampled domestic pigs from the same locales. After comparing the mitochondrial DNA sequences among these groups, they found that pigs in Italy had sequences very like those of wild boars in Italy, while pigs in India had sequences very like those of wild boars there.

Approachable teenage-like pigs

How did we domesticate the pigs? Researchers speculate that adult boars (males) who still behaved like teenagers were most likely to approach human settlements to forage. They were also less aggressive than males who behaved like full adults, and thus, easier to domesticate. They fed better on human food scraps than did their more-mature—and more-skittish—brethren, and enjoyed better survival and more opportunities to pass on their juvenile characteristics, which also included shorter snouts, smaller tusks, and squealing, to their offspring. Domestication was just the next step.

Polio vaccine-related polio

Polio virus bits in vaccine rarely join forces with other viruses, become infectious

[Note: some of the links in this piece are to New England Journal of Medicine papers. NEJM does not make its content freely available, so unfortunately, unless you have academic or other access, you’d have to pay per view to read the information. I fervently support a world in which scientific data and information are freely available, but…money is money.]

Worldwide, billions of polio vaccine doses have been administered, stopping a disease scourge that, before the vaccine, killed people–mostly children–by the thousands in a horrible, suffocating death (see “A brief history of polio and its effects,” below). The polio vaccination campaign has been enormously successful, coming to the very edge of eradicating wild-type polio.

But, as with any huge success, there have been clear negatives. In a few countries–15, to be exact–there have been 14 outbreaks of polio that researchers have traced to the vaccines themselves. The total number of such cases as of 2009 was 383. The weakened virus in the oral vaccine–designed to provoke an immune response without causing disease–occasionally recombines with other viruses to form an active version of the pathogen. Some kinds of viruses–flu viruses come to mind–can be notoriously tricky and agile that way.

Existing vaccine can prevent vaccine-related polio

Odd as it sounds, the existing vaccines can help prevent the spread of this vaccine-related form of polio. The recombined vaccine-related version tends to break out in populations that are underimmunized against the wild virus, as happened in Nigeria. Nigeria suspended its polio vaccination program in 2003 because rumors began to circulate that the vaccine was an anti-Muslim tactic intended to cause infertility. In 2009, the country experienced an outbreak of vaccine-derived virus, with at least 278 children affected. Experts have found that the existing vaccine can act against either the wild virus or the vaccine-derived form, both of which have equally severe effects. In other words, vaccinated children won’t get either.

Goal is eradication of virus and need for vaccine

The eradication campaign came tantalizingly close before wild-type cases plateaued at between 1,000 and 2,000 annually in the 21st century, and global health officials still hold out hope for two primary goals. They hope to eradicate wild-type polio transmission through a complete vaccination program, which, in turn, will keep vaccine-derived forms from spreading. Once that goal is achieved, they will have reached the final target: no more need for a polio vaccine.

As Dr. Bruce Aylward, Director of the Global Polio Eradication Initiative at WHO, noted: “These new findings suggest that if (vaccine-derived polio viruses) are allowed to circulate for a long enough time, eventually they can regain a similar capacity to spread and paralyse as wild polioviruses. This means that they should be subject to the same outbreak response measures as wild polioviruses. These results also underscore the need to eventually stop all (oral polio vaccine) use in routine immunization programmes after wild polioviruses have been eradicated, to ensure that all children are protected from all possible risks of polio in future.”

If that sounds nutty, it’s been done. Until the early 1970s, the smallpox vaccination was considered a routine vaccination. But smallpox was eradicated, and most people born after the early ’70s have never had to have the vaccine.

A brief history of polio and its effects

I bring you the following history of polio, paraphrased from information I received from a physician friend of mine who works in critical care:

The original polio virus outbreaks occurred before the modern intensive care unit had been invented and before mechanical ventilators were widely available. In 1947–1948, the polio epidemic raged through Europe and the United States, with many thousands of patients dying a horrible death due to respiratory paralysis. Slow asphyxiation is one of the worst ways to die, which is precisely why torture methods such as waterboarding simulate suffocation. The sensation is unendurable.

In the early twentieth-century polio epidemics, physicians put breathing tubes down the throats of patients who were asphyxiating from the respiratory paralysis the virus caused. Because ventilators were unavailable, armies of medical students provided mechanical respiratory assistance by hand-squeezing a bag connected to the breathing tube, over and over and over, 16 times a minute, 24 hours a day, driving air in and out of the patients’ lungs. Eventually the iron lung was developed and became widely implemented to manage polio outbreaks. The iron lung subsequently gave way to the modern ventilator, which is another story.

A drug for Fragile X syndrome?

Hopeful news but not peer-reviewed

A new report describes success in a very small trial with a new drug that targets behavioral signs of Fragile X syndrome. This syndrome, which affects about one in every five thousand children, mostly boys, usually involves some form of intellectual disability along with a suite of typical physical characteristics, including large jaws and ears and elongated faces. It is the most common known heritable cause of intellectual disability and has also been associated with autism.

Novartis has been working on an experimental drug targeting some of the behavioral manifestations of Fragile X and has just reported, via interview, positive results from a small trial. Because the results are not public and have not been peer reviewed, the nature of the improvements is unknown, as is the nature of the drug itself. All that is known is that a parameter improved in some, but not all, treated participants with Fragile X, and that the drug targets reduction of the synaptic noise that people with Fragile X experience. This reduction in neural background noise, it is thought, may pave the way for more typical neurological development.

Why is the X fragile?

The X chromosome carries many, many genes. Some of their sequences contain repeats of the same three nucleotides, or letters of the DNA alphabet; for example, a gene section might have 50 repeats of the sequence C-A-G. These trinucleotide repeats, as they are known, are associated with a few well-known disease states when they occur in large enough numbers. At a low number of repeats, they may have no effect, but when the number of repeats increases, a phenomenon known as trinucleotide expansion, the result can be disease. Huntington’s disease is one well-known disorder associated with trinucleotide expansion, and the general rule is that the more repeats there are, the more severe and/or the earlier the onset of the disorder.
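
To make the repeat idea concrete, here is a minimal sketch in Python that counts the longest uninterrupted run of a given trinucleotide in a sequence; the example sequence and the CAG unit are illustrative, not taken from any real gene.

```python
# Count the longest uninterrupted run of a trinucleotide repeat.
import re

def longest_repeat_run(sequence: str, unit: str = "CAG") -> int:
    """Return the longest run of `unit` repeated back-to-back in `sequence`."""
    runs = re.findall(f"(?:{unit})+", sequence.upper())
    return max((len(run) // len(unit) for run in runs), default=0)

# An invented sequence with a 50-repeat CAG tract:
print(longest_repeat_run("TT" + "CAG" * 50 + "AA"))  # -> 50
```

Clinically, what matters is whether that count sits below, within, or above the disease thresholds established for a given gene.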

On the X chromosome, where these repeats reach numbers sufficient to cause Fragile X syndrome, the chromosome itself looks like it is literally at a breaking point. This visual fragility is what gave the disorder its name when the chromosomal characteristic was discovered in 1969. A parent who carries an X chromosome with an expanded but not yet disease-causing number of repeats does not have Fragile X; the gene is in a state known as a premutation. Thanks to various rearrangements and events during cell division, this premutation can expand, even in a single generation, to a number of repeats sufficient to cause the disorder in an offspring.

Because the relevant gene is on the X chromosome, Fragile X is an X-linked disorder. It’s more prevalent among males than among females because males receive only one X chromosome. Without the second X chromosome backup that females have, males are stuck with whatever genes–and mutations–are present on the single X chromosome they receive.

What is the autism link?

Fragile X underlies a small percentage of diagnosed cases of autism, between 2 and 6%. Yet because autism usually arises from complex, multigene genetics, Fragile X is still the most common known single-gene cause of autism.

These prematurely reported results have also yielded some speculation that a drug that is effective in reducing background noise and improving behaviors for people with Fragile X might do the same for autistic people, even if their autism isn’t related to Fragile X. With nothing in the way of peer-reviewed findings to consider and results available only via interview, such hopes remain in the purely speculative realm.

For your consideration

Males are born with a single X chromosome. Females have two. The X chromosome has hundreds of genes on it. How is it that women can walk around with a double dose of these genes, or conversely, men can be healthy with a half dose?

Trinucleotide expansion occurs when a trinucleotide repeat sequence expands in numbers of repeats, potentially evolving from a premutation to a full-blown disruption of a gene. What are some possible mechanisms by which this expansion might occur?

In the article related to this report, there is reference to “synaptic noise” and to the idea that a drug might reduce this noise and allow more space for typical development. What do you think “synaptic noise” is, physiologically, and how might a drug target this noise?
