Mitochondrial dysfunction and autism

Beautiful pic of mitochondria

Mitochondria are the powerhouses of the cell, as biology teachers will tell you. These organelles also happen to be likely former bacteria that once were independently living cells, capable of dividing on their own to make new mitochondria. Indeed, they continue to divide by a kind of binary fission as our cells divide, ensuring that a double dose is available for partitioning into the two new cells that result from cell division.

To achieve these feats, mitochondria have their own DNA, their own proteins, and their own protein-making machinery. That means that they also have the potential to undergo genetic mutations that affect the sequence of the proteins their genes encode. Because most of the proteins in mitochondria are mission critical and must function exactly right, the persistence of such mutations is relatively rare. But they do happen, causing disease. One question that has arisen in the study of the causes of autism is whether or not such changes might underlie at least a portion of the cases of this developmental difference.

The high-profile Hannah Poling case

Certainly lending a high profile to this question was the case of Hannah Poling, whose mitochondrial disorder appeared to be linked to her autism symptoms and may have interacted with a bolus of vaccine doses she received, followed by a high fever. Fevers can tax our cellular powerhouses, and if mitochondrial function is already compromised, the high temperatures and extra burden may result in chronic negative outcomes.

Poling’s case brought to the forefront the question of whether or not people with autism might have mitochondrial dysfunction at greater rates. A recent study in the Journal of the American Medical Association (which steadfastly keeps its articles behind a paywall) has sought to address that question by measuring markers of mitochondrial dysfunction in children with autism and comparing these endpoints with outcomes in children without autism.

Study specifics: “Full-syndrome autism”

The autistic group in the study had what the researchers called “full syndrome autism,” which I take to mean intense symptoms of autism. They used the Autism Diagnostic Interview-Revised (ADI-R) and the Autism Diagnostic Observation Schedule (ADOS) to confirm this diagnosis and to ensure as uniform a population among their autistic group as possible. Ultimately, the study included 10 children in this group, recruited consecutively in the clinic based on their fulfillment of the selection criteria. This study was essentially case control, meaning that the control group consisted of 10 non-autistic children, selected to match as closely as possible the demographic characteristics of the autistic group.

The authors report that while only one child among the 10 who were autistic fulfilled the definitive criteria for a mitochondrial respiratory chain disorder, the children with autism were more likely to have indicators of mitochondrial dysfunction.

A problem with pyruvate dehydrogenase (break out your Krebs notes, folks)

Specifically, six out of ten showed lowered activity for one parameter, eight out of ten showed higher levels than controls for a second metabolic endpoint, and two out of ten showed higher levels than controls for a third. Overall, the results indicated low activity of a mitochondria-specific enzyme, pyruvate dehydrogenase, which acts in one of the first steps of carbohydrate metabolism that takes place in the mitochondria. Reduced activity of an enzyme anywhere in this process will change the levels of the enzyme’s own products and of products further down the pathway, throwing off mitochondrial function. Further, half of the autistic group showed elevated mitochondrial DNA replication, an indicator of cellular stress, and the autistic group also had more deletions in their mitochondrial DNA than controls did. Statistical analysis suggested that all of these differences were significant.
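
To build intuition for that ripple effect, here is a toy two-step pathway, a minimal sketch with invented rate constants rather than anything from the study itself. Throttling the first enzyme, standing in for pyruvate dehydrogenase, lowers both the intermediate pool and the downstream product:

```python
# Toy two-step pathway: substrate --PDH--> intermediate --E2--> product.
# All rate constants are invented for illustration; this is intuition only,
# not the JAMA study's analysis. (In a real cell, the unused substrate,
# pyruvate/lactate, also piles up upstream, a marker clinicians can measure.)

def run(k_pdh, k_e2, substrate=1.0, steps=5000, dt=0.01):
    """Euler-integrate simple mass-action kinetics with a constant substrate supply."""
    intermediate, product = 0.0, 0.0
    for _ in range(steps):
        v1 = k_pdh * substrate        # flux through pyruvate dehydrogenase
        v2 = k_e2 * intermediate      # flux through the next enzyme in the pathway
        intermediate += (v1 - v2) * dt
        product += v2 * dt
    return intermediate, product

for label, k in (("normal PDH", 1.0), ("impaired PDH", 0.3)):
    inter, prod = run(k_pdh=k, k_e2=1.0)
    print(f"{label:>13}: intermediate={inter:.2f}, product accumulated={prod:.2f}")
```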

What does it mean for autism?

Do these findings mean that all or most people with autism have mitochondrial dysfunction? No. The study results do not support that conclusion. Further, the authors themselves list six limitations of the study. These include the possibility that some findings of statistical significance could be in error because of the small sample size or confounders within the sample, and the fact that some endpoints changed in both directions within the autistic group. In other words, some autistic children had much higher values than controls, while others had lower values, muddying the meaning of the statistics. The authors note that a study like this one does not allow anyone to draw conclusions about a cause-and-effect association between autism and mitochondria, and they urge caution with regard to generalizing the findings to a larger population.
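
To put a number on the sample-size worry, here is some back-of-the-envelope arithmetic of my own, not the authors’: even if no real differences existed at all, checking several independent endpoints at the usual p < 0.05 threshold makes at least one spurious “significant” hit fairly likely.

```python
# Chance of at least one false-positive "significant" result when testing
# k independent endpoints at alpha = 0.05. Illustrative arithmetic only;
# real endpoints in a study like this are correlated, not independent.

alpha = 0.05
for k in (1, 3, 6, 10):
    p_any = 1 - (1 - alpha) ** k
    print(f"{k:>2} endpoints: P(at least one false positive) = {p_any:.2f}")
```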

If there is an association, questions arise from that conclusion. Does mitochondrial dysfunction underlie autism, producing autistic-like symptoms, as some argued in the Hannah Poling case? Or, do autistic manifestations such as anxiety or high stress or some other autism-related factor influence the mitochondria?

Chickens, eggs, MRI, mitochondria, autism

As interesting as both of these recent autism-related studies are, we still have the “Which came first?” question to deal with. Did autism cause the brain or mitochondrial differences, or did the brain or mitochondrial differences trigger the autism? Right now, these chicken-and-egg questions may not matter as much as the findings do for helping to identify autism more specifically and addressing some of its negative aspects. Regardless of your stance on neurodiversity or vaccines or acceptance or cure or the in-betweens where most of us fall, it would be difficult to argue that a mitochondrial dysfunction shouldn’t be identified and ameliorated or that an awareness of brain structure differences won’t lead to useful information about what drives autism behaviors.

——————————————–

Note: More lay-accessible versions of this post and the previous post are available at BlogHer.

MRI, brain differences, and autism

MRI: Sagittal view of the brain. Photo courtesy of Wikimedia Commons.

You may have read the news reports blaring the finding of an “autism test” that could lead to early and definitive diagnosis of autism. The new evaluation, which has proved worthy of its own name, the Lange-Lainhart test, uses magnetic resonance imaging (MRI) to image brain areas and detect changes associated with autism.

I’ve been unable to find the complete paper, reported to have been published in Autism Research on Nov. 29; the journal has only papers through October available on its website as of this writing. According to reports, however, the authors state that the new test was 94% accurate in identifying who was autistic and who wasn’t among 60 males tested. The participants in the study were ages 8 to 26; 30 were diagnosed with what the researchers call “high-functioning autism,” and 30 were typically developing.

The imaging technique the authors used involves tracing water diffusion along axons, the long connectors that link neurons to other neurons or tissues. This diffusion tensor imaging process yields an image that can highlight variations in the patterns of these connective pathways in different areas of the brain. This study focused on six brain areas associated with language, social, and emotional functioning, all of which are traditionally considered to be problematic among people with autism.
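
For readers who want the quantitative core: diffusion tensor imaging typically summarizes each voxel with fractional anisotropy (FA), a 0-to-1 score of how directional the water diffusion is. The function below implements the standard FA formula; the example tensors are invented for illustration.

```python
import numpy as np

def fractional_anisotropy(tensor):
    """Compute FA from a 3x3 diffusion tensor using the standard formula."""
    eigvals = np.linalg.eigvalsh(tensor)  # diffusivities along the three principal axes
    mean_d = eigvals.mean()
    num = np.sqrt(((eigvals - mean_d) ** 2).sum())
    den = np.sqrt((eigvals ** 2).sum())
    return np.sqrt(1.5) * num / den

# Invented example tensors (diagonal for simplicity):
isotropic = np.diag([0.7, 0.7, 0.7])  # diffusion equal in all directions, FA ~ 0
fiber = np.diag([1.7, 0.3, 0.3])      # diffusion channeled along one axis, FA ~ 0.8
print(f"isotropic voxel:  FA = {fractional_anisotropy(isotropic):.2f}")
print(f"fiber-like voxel: FA = {fractional_anisotropy(fiber):.2f}")
```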

In the brains of non-autistic participants, the flow patterns were organized in a typical way that indicated connectivity among the brain regions. In the participants diagnosed with high-functioning autism, the flow was disorganized in a pattern common to the autistic group, indicating less connectivity and interaction and thus less exchange of information in the network. The researchers repeated the test on another, smaller set of participants, 12 with autism and 7 without, and produced similar results.

These findings imply that autistic brains may operate like a set of computer hardware components that cannot communicate very well with each other while still functioning perfectly well separately. There may be a camera that captures a visual image without trouble or a microphone that captures a voice clearly, but the system lacks the network necessary to integrate the two inputs into a unified perception.

The news reports I’ve read on the study make a big deal out of the prospect that this imaging breakthrough could lead to earlier diagnosis of autism, something that most experts believe is key to ameliorating some of its negative manifestations. But experts also urge the standard cautious optimism, and rightfully so.

For one thing, the participants in this study were ages 8 to 26, not within the time frame for early diagnosis of autism, and all of them were male. The study findings can’t tell us whether their brains present with these differences as a result of developing with autism, or whether they have autism because their brains are built this way. Before there can be talk of “early diagnosis” and linking these changes to manifestations of autism, we’d need studies showing these differences in much younger children. Further, given the frequent findings of differences between males and females on the spectrum, investigations involving autistic girls and women are necessary.

This study is not the first to use imaging to identify distinctions between autistic and non-autistic people. Other studies have also done so, finding pattern variations in the neuronal tracts of children with autism compared to children without it, in critical areas relevant to the clinical symptoms of autism.

While I find these results intriguing, I note one thing that no one seems to have commented on. In the reports I’ve read about this study, the researchers observe that currently, the only way to diagnose autism is based on a symptom checklist, questionnaires, screenings, and so on—any autism parents reading this will know that drill—and the ultimate call relies on the expertise of the medical professional conducting the evaluations. The implication of these comments is that we need some better, more unequivocal, less-subjective methods of identifying autistic people.

Yet, presumably the autistic boys and men they used for this study were diagnosed using just such subjective evaluation, and their autism diagnoses appear to have been confirmed in 94% of cases by similarities of MRI findings. In my mind, this outcome suggests that the process of subjective evaluation is working pretty well. Of course, we’re a visual species and like our decisions to be given literally in black and white. Such MRI results may fulfill a need that has less to do with correct outcomes than it does with a dose of visual confirmation–and satisfaction.

Did humans and their fire kill off Australia’s megafauna?

Genyornis. Courtesy of Michael Ströck & Wikimedia Commons.

Timeline, 2005: For those of us who do not live in Australia (and live instead in, say, boring old Texas), the animals that live on that continent can seem like some of the most exotic species in the world. The kangaroo, wombat, and Tasmanian devil, and most of all, the platypus, are high on the list of the unusual and bizarre in the animal kingdom.

But modern-day Australia has nothing on the Australia of 50,000 years ago when humans first arrived from Java. They encountered huge kangaroos, marsupial lions, 25-foot lizards, and tortoises the size of a subcompact car. Yet, within 5000 years, many of these animals had disappeared permanently. And since the dawn of the study of paleontology, researchers have wondered why.

Of course, it’s our fault

Of course, humans feature as the culprits in most scenarios. Just as the first people in the Americas are usually blamed at least in part for the disappearance of the American megafauna, like mammoths or giant sloths, the first people in Australia have also been suspected of hunting these animals to extinction or exposing them to diseases that decimated the populations.

As it turns out, humans may be to blame, but not through direct destruction or disease transmission. Instead, it may be the mastery of fire, the turning point in our cultural history, that led to the extinction of many species larger than 100 pounds on the Australian continent.

Fire!

Australia’s first people probably set huge fires to signal to one another, flush animals for hunting, clear paths through what was once a mosaic of trees, shrubs, and grasses, or to encourage the growth of specific plants. The byproduct of all of this burning was catastrophic to the larger species on the continent.

The fires, according to one study, wiped out the drought-adapted plants that covered the continent’s interior, leaving behind a desert of scrub brush. The change in plant cover may have resulted in a decrease in water vapor exchange between the earth and the atmosphere with the ultimate effect of ending the yearly monsoon rains that would quench the region. Without the rains, only the hardiest, desert-ready plants survived.

You are what you eat…or ate

How could researchers possibly have elucidated these events of 45,000 years ago? By looking at fossilized bird eggs and wombat teeth. Using isotopic techniques, they assessed the types of carbon present in bird eggs and teeth dating from 150,000 to 45,000 years ago. These animals genuinely were what they ate in some ways, with some isotopic markers of their diet accumulating in these tissues. Because plants metabolize different forms of carbon in different ways, the researchers could link the type of carbon isotopes they found in the egg and teeth fossils to the diet of these animals.
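
The logic here is essentially a two-endmember mixing model: grasses using C4 photosynthesis and trees and shrubs using C3 photosynthesis imprint distinctly different carbon-13 signatures on consumer tissues, so a fossil’s δ13C value yields a rough fraction of grass in the diet. Below is a sketch using broad textbook endmember values; these numbers are my assumptions, not the study’s calibration.

```python
# Two-endmember delta-13C mixing model. The endmember values are rough
# textbook averages for C3 and C4 plants (per mil vs. the PDB standard),
# assumed here for illustration; real studies calibrate for tissue offsets.

DELTA_C3 = -27.0  # typical C3 trees and shrubs
DELTA_C4 = -13.0  # typical C4 grasses

def fraction_c4(delta_sample):
    """Estimate the fraction of diet derived from C4 grasses."""
    f = (delta_sample - DELTA_C3) / (DELTA_C4 - DELTA_C3)
    return min(max(f, 0.0), 1.0)  # clamp to the physically possible range

# Hypothetical fossil values: a grass-heavy diet vs. a scrub-heavy one.
for delta in (-16.0, -25.0):
    print(f"delta13C = {delta:+.1f} per mil -> ~{fraction_c4(delta):.0%} C4 grasses in diet")
```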

They found that the diet of a now-extinct species of bird, the Genyornis, consisted of the nutritious grasses of the pre-human Australian landscape. Emu eggs from before 50,000 years ago pointed to a similar diet, but eggs from 45,000 years ago indicated a shift in emu diet from nutritious grasses to the desert trees and shrubs of the current Australian interior. The vegetarian wombats also appear to have made a similar change in their diets around the same time.

Or, maybe not

And the species that are still here today, like the emu and the wombat, are the species that were general enough in their dietary needs to make the shift. The Genyornis went the way of the mammoth, possibly because its needs were too specialized for it to shift easily to a different diet. Its eggshells showed no change in diet over the time period.

The researchers analyzed 1500 fossilized eggshell specimens from Genyornis and emu to solve this mystery and to pinpoint human burning practices as the culprits in the disappearance of these megafauna in a few thousand brief years. Today’s aboriginal Australians still use burning as part of traditional practices, but by now the ecosystems have had thousands of years to adapt to burns. Thus, we don’t expect to see further dramatic disappearances of Australian fauna as a result of these practices. Indeed, some later researchers have taken issue with the idea that fire drove these changes in the first place, with some blaming hunting again, and as with many things paleontological, the precise facts of the situation remain…lost in the smoky haze of deep history.

Roll over eggs…it’s time for (unrolled) tobacco leaves

Tobacco leaf infected with Tobacco Mosaic Virus. Courtesy of Clemson University - USDA Cooperative Extension Slide Series

Timeline, 2008: If you’ve ever been asked about allergy to egg products before receiving a flu vaccine, you have had a little encounter with the facts of vaccine making. Flu viruses for vaccine production are painstakingly grown in chicken eggs because eggs make perfect little incubators for the bugs.

So…many…eggs

There are problems—in addition to the allergy issue—that arise with this approach. First of all, growing viruses for a million vaccine doses usually means using a million fertilized, 11-day-old eggs. For the entire population of the United States, 300 million eggs would be required. Second, the process requires months of preparation, meaning a slow turnaround time for vaccines against a fast-moving, fast-changing disease. Last, if there is anything wrong with the eggs themselves, such as contamination, the whole process is a waste and crucial vaccines are lost.
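
The scale problem is plain arithmetic; here is a quick sketch using the post’s roughly one-egg-per-dose rule of thumb:

```python
# Rough egg arithmetic using the post's approximately one-egg-per-dose figure.
EGGS_PER_DOSE = 1
for doses in (1_000_000, 300_000_000):  # a small run vs. the whole US population
    print(f"{doses:>11,} doses -> about {doses * EGGS_PER_DOSE:,} fertilized 11-day-old eggs")
```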

The day may come when we can forget about eggs and turn to leaves. Plants can contract viral disease just like animals do. In fact, an oft-used virus in some research fields is the tobacco mosaic virus, which, as its name implies, infects tobacco plants. It gives a patchy look to the leaves of infected plants, and researchers use this feature to determine whether the virus has taken hold.

Bitter little avatars of evil used for good?

Tobacco plants themselves, bitter little avatars of evil for their role in the health-related effects of smoking, serve a useful purpose in genetic research and have now enhanced their approval ratings for their potential in vaccine production. Plants have caught the eye of vaccine researchers for quite a while because they’re cheaper and easier to work with than animal incubators. Using plants for quick-turnaround vaccine production has been a goal, but a few problems have hindered progress.

To use a plant to make a protein to make a vaccine, researchers must first get the gene for the protein into the plant. Previous techniques involved tedious and time-consuming processes for inserting the gene into the plant genome. Then, clock ticking, there was the wait for the plant to grow and make the protein. Add in the Byzantine process of obtaining federal approval to use a genetically modified plant, and you’ve got the opposite of “rapid” on your hands.

One solution to this problem would simply be to get the gene into the plant cell cytoplasm for immediate use. It’s possible but involves meticulously injecting a solution with the gene sequence into each leaf. Once the gene solution is in, the plant will transcribe it—copy it into mRNA—in the cell cytoplasm and then build the desired protein based on the mRNA code. But there has been no way to take hand injection to the large-scale production of proteins, including for vaccines.

Age-old vacuum suction = high-tech high-throughput

To solve this problem, researchers turned to one of our oldest technologies: vacuum suction. They grew tobacco plants to maturity and then clipped off the leaves, which they submerged in a solution. The solution was spiked with a nasty bug, Agrobacterium tumefaciens, a pathogen responsible for the growth of galls, or tumors, on plants. Anyone working in agriculture fears this bacterium, a known destroyer of grapes, pitted fruit trees, and nut trees. But it does have one useful feature for this kind of work: It can insert bits of its DNA into plant cells. The researchers tricked A. tumefaciens into inserting another bit of DNA instead, the code for the protein they wanted to make.

To get the solution close to the cells, the investigators had to get past air bubbles, and that’s where the vacuum came in. They placed the submerged leaves into a vacuum chamber and flipped a switch, and the activated chamber sucked all the air out of the leaves. When the vacuum was turned off, the solution flowed into the now-empty chambers of the leaf, allowing the A. tumefaciens-spiked solution to bathe the plant cells. After 4 days and a few basic protein-extraction steps, the research team had its protein batch. According to the team lead, “any protein” could be made using this process, opening up almost unlimited possibilities and applications for this approach.

Vaccines…or combating bioterrorism?

The technology has come far enough that a US company has taken steps toward manufacturing vaccines using tobacco leaves. And it appears that the applications go beyond vaccines, as one news story has noted…the tobacco plants might also be used to produce antidotes to common agents of bioterrorism.

Complex amphibian responses to past climate change

Eastern tiger salamander: Ambystoma tigrinum, courtesy of Wikimedia Commons

We were like gophers, but now we’re like voles

Timeline, 2005: There is a cave in Yellowstone packed with fossils from the late Holocene, from about 3000 years ago. We can glean from this trove of stony bone how different taxa respond to climate change at the morphological and genetic levels, and we can use that record to make predictions about how today’s world may respond to such changes.

The cave, which is in a sensitive area of the park off limits to visitors, houses the fossilized bones of rodents, wolves, amphibians, bears, coyotes, beavers, and elk, among others. This fossil cornucopia has yielded so much in the way of stony evidence that sorting it all is in itself a mammoth task. But two climatic stories have emerged from the samples.

A global warming story…from the Middle Ages

The first story is about salamanders and climate change. No, it’s not a 21st-century story about global warming, but a Middle Ages story about a hotter planet. From about 1150 to 650 years ago, the earth underwent a brief warming period known as the Medieval Warm Period. During this time, the sea surface temperature was about a degree warmer and, overall, the planet was much drier. This climatic anomaly was followed by what many climatologists call the Little Ice Age, a period that ended around 1900.

During the warm and dry period, animals in what would become Yellowstone National Park responded in ways that left clues about how animals may respond today to our warming planet. Amphibians make particularly sensitive sentinels of environmental change, alerting us to the presence of pollutants or other alterations that affect them before larger manifestations are detectable. And they even provide us with evidence in their fossils.

Hot times, smaller paedomorphic salamanders

A group from Stanford excavated the fossils of Ambystoma tigrinum (the tiger salamander) from 15 layers at the Yellowstone site and divided them into five time periods based on their estimated age. They then divided the fossils again based on whether they represented the tiger salamander in its larval, paedomorphic, early adult, or later adult stages. The tiger salamander exhibits paedomorphism, in which the animal achieves reproductive capacity or adulthood while still retaining juvenile characteristics. In the case of the tiger salamander, this translates into remaining in the water, rather than becoming a terrestrial adult, and into retaining characteristics like frilly gills. The molecular determinant of whether or not an amphibian undergoes complete metamorphosis from juvenile to adult is thyroid hormone; when levels of this internal signal are low, the animal will remain juvenile.

The researchers found that during the Medieval Warm Period, the paedomorphic salamanders became smaller than they were during cooler times. This outcome would be expected: when water is cooler, thyroid hormone levels stay lower and the animal keeps growing as a juvenile, whereas in warmer water, rising thyroid hormone cuts that juvenile growth short.

Hot times, larger adult salamanders

On the other hand, the terrestrial adult salamanders were much larger during the warm period than during cooler periods. Again, this outcome would be expected because the heat on land would encourage faster metabolism, which would result in faster growth. The researchers found no difference in actual numbers between groups at cool vs. warm periods, but they expressed concern that drying in Yellowstone today as a result of global warming might reduce the number of aquatic paedomorphs, affecting aquatic food webs.

From amphibians to gopher teeth

The same group also studied DNA from fossilized teeth of gophers and voles discovered in the cave. They found that during the dry period, gophers, who were stuck underground and isolated, experienced genetic bottlenecking, a reduction in diversity that persists today. However, the mobile, above-ground voles sought mates far and wide during the dry, warm period and actually experienced an increase in diversity. The lead researcher in the group compared early groups of isolated humans to the gophers, saying that they would have experienced a loss of diversity. But today’s population, with our ability to travel the globe with ease, is probably undergoing an increase in diversity since we’re able to mate with people a hemisphere away.

The cooperative, semi-suicidal blob lurking beneath the cow pasture

Aggregating Dictyostelium discoideum; photo from a series available at Wikimedia Commons

Amoebae gone to pasture

Timeline, 2009: Lurking beneath the soil and cow patties of a southeast Texas cow pasture, the cellular slime mold cloned itself and cloned itself again until it was about 40 feet across, billions of cells dense. The thing is, this enormous creature was really neither a slime nor a mold, but a group of social microbes, amoebae that were all exactly identical to one another.

It’s a cell, it’s a colony, it’s…a multicellular organism!

No one is quite sure why some microbes are social. But the cellular slime mold, Dictyostelium discoideum, makes a great model for figuring out why. This organism, a eukaryote, can start out life as a single cell, one that hangs out waiting for a hapless bacterium to wander by so it can have dinner. But if bacteria are in short supply or the soil dries up, slime molds can do some marvelous things. They can start socializing with each other, forming colonies—apparently quite large colonies. But they also can shift from being single-celled organisms or single-celled colonial organisms to becoming a multicellular organism. In the really bad times, these cells can signal to one another to differentiate into two different tissues, officially becoming multicellular.

In addition, these brainless blobs also exhibit altruism, a kind known as suicidal altruism in which some cells in a colony commit cell suicide so that cells elsewhere in the colony can thrive. And as the discovery in the Texas cow pasture shows, the organisms can exist in enormous clonal colonies, a quivering gelatinous mass just under the soil, waiting for conditions to improve.

An affinity for dung

How did the researchers even find this slime mold? They had to get themselves into the “mind” of the slime mold and figure out where this species might be most prone to going clonal, a response to an uninviting environment. Given that slime molds are fond of deep, humid forests, the researchers opted to search for the clonal variety in a dry-ish, open pasture at the edge of such forests. Why would they even expect D. discoideum to be there? Because the amoebae also happen to have an affinity for dung, as well as soil.

After careful sampling from 18 local fields, which involved inserting straws into cow patties and surrounding soil, the research team tested their samples for the presence of amoebae. Once the dishes showed evidence of the slime mold, the investigators then sampled each group and sequenced the genomes. That’s when they realized that 40 feet of one cow pasture was one solid mass of amoebae clones.

The winner clones it all?

These findings raise intriguing questions for microbial ecologists. One problem they’re tackling is what determines which clone gets to reproduce so wildly at the edges of acceptable amoeba territory. Is it just one clone that arrives and gets to work, or do several show up, have a competition, and the winner clones all? They tested this idea in the lab and found that none of the clones they tested seemed to have any particular competitive advantage, so the selection process for which amoeba gets to clone at the edges remains a mystery.

Another question that intrigues the ecologists is why these single-celled organisms cooperate with one another in the first place. Such clonal existence is not unknown in nature—aspen trees do it, and so do sea anemones. And researchers have identified as many as 100 genes that seem to be associated with cooperative behavior in the slime mold. One explanation for cooperating even to the extreme of individual suicide is the clonal nature of the colony: the more identical the genes, the more an individual gains, genetically speaking, by sacrificing for other members of the colony. The dying cells ensure that the same genes survive, just in different individuals.
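
This is the classic kin-selection logic captured by Hamilton’s rule: an altruistic act is favored when r * b > c, where r is the genetic relatedness between altruist and beneficiary, b the reproductive benefit conferred, and c the cost paid. For clonemates, r = 1, so even suicide can pay. A quick sketch with invented benefit and cost numbers:

```python
# Hamilton's rule: altruism is favored by selection when r * b > c.
# The benefit and cost values below are invented for illustration.

def altruism_favored(r, benefit, cost):
    """Return True if Hamilton's rule says the altruistic act is favored."""
    return r * benefit > cost

cost = 1.0     # the altruist forfeits its own reproduction
benefit = 1.5  # extra reproduction gained by the beneficiaries
for r, relation in ((1.0, "clonemate"), (0.5, "full sibling"), (0.125, "cousin")):
    print(f"{relation:>12} (r={r}): favored = {altruism_favored(r, benefit, cost)}")
```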

Machiavellian amoebae: cheaters among the altruists

Not all amoebae are altruistic, and some apparently can be quite Machiavellian. Some individual amoebae carry cooperation genes that have undergone mutations, leading them to be the cheaters among the altruists and to take advantage of the altruistic suicide to further their own survival. What remains unclear is why cooperation continues to be the advantageous route, while the mutant cheats keep getting winnowed out.

Microbes redirect our best-laid plans

Greeks at War (pottery from the British Museum; photo courtesy of Wikimedia Commons)

The madness of King George

Timeline, 2006: It’s not that unusual for disease to alter the course of history, or at least history as humans intended it to be. Some scholars believe, for example, that the intransigence of King George III of England arose from his affliction with porphyria, a heritable metabolic disorder that can manifest with mental symptoms, including irrational intractability. But it’s rarer for a disease to shift the balance of power so entirely that one nation gains the upper hand over another. Yet that appears to be what happened to the city-state of Athens early in the Peloponnesian War, around 430 B.C. The upshot of the wave of disease and the war was that Sparta conquered Athens in 404 B.C.

Spartans had a little microbial assistance

Sparta may have owed its big win to a small bacterium, Salmonella enterica serovar Typhi, the microbe responsible for typhoid fever. A plague swept across Athens from 430 to 426 B.C., having traveled from Ethiopia to Egypt and Libya before alighting in the Greek city and destroying up to a third of its population. In addition, it brought to a close what became known as the Golden Age of Pericles, a time when Athens produced some of its most amazing and widely recognized art, artists, and philosophers, including the Parthenon and figures such as Aeschylus, Socrates, and Sophocles. Pericles was a statesman who oversaw the rebuilding of Athens following the Greek win in the Persian Wars, and he guided the city-state to a more democratic form of rule and away from the dictatorships of the previous regimes. In the process, the city flourished in art and architecture.

And then along came the plague. The Greek historian Thucydides, who chronicled the Peloponnesian War, left behind such a detailed account of the plague, its symptoms, and what happened to its victims that intrigued medical detectives have ever since debated what might have caused it. Thucydides, himself a plague survivor, vividly described the sudden fever, redness of the eyes, hemorrhaging, painful chest and cough, stomach distress, and diarrhea that ultimately led to death in so many cases. He also mentioned pustules and ulcers of the skin and the loss of toes and fingers in survivors. This litany of symptoms produced many candidate causes, including bubonic plague, anthrax, smallpox, measles, and typhoid fever.

Construction crews yet again uncover something interesting

In the mid-1980s, a construction crew was busy digging a hole for a subway station in the Kerameikos area of Athens when they uncovered a mass burial site. Unlike other Greek burial sites, this one bore marks of hasty and haphazard burials, and the few artefacts that accompanied the bones dated it to the time of the plague that destroyed Athens. Researchers were able to harvest some teeth from the site and analyze the tooth pulp, which retains a history of the infections a person has suffered. They examined DNA sequences from the pulp for matches with suggested microbial agents of the plague, and finally found a match with the typhoid bacterium.

Typhoid Mary: Intent on cooking, ended up killing

One discrepancy between the disease pattern of typhoid fever and that described by Thucydides is the rapidity of onset the Greek historian detailed. Today, typhoid fever, which still infects millions of people worldwide, takes longer to develop in an infected individual, and sometimes never develops at all. People who harbor the bacterium but don’t become ill themselves are “carriers.”

Perhaps the most famous carrier was Typhoid Mary, Mary Mallon, a cook in New York City at the beginning of the 20th century. It is believed that she infected hundreds of people, with about 50 known cases and a handful of deaths being directly associated with her. Typhoid Mary was told not to work as a cook any longer or she would be quarantined, but she simply disappeared for a while and then turned up under a different name, still working as a cook. After another outbreak was traced to her, she was kept in quarantine for 23 years until she died.
