Mitochondrial dysfunction and autism

Beautiful pic of mitochondria

Mitochondria are the powerhouses of the cell, as the biology teachers will tell you. These organelles are also likely former bacteria, once independently living cells capable of dividing on their own to make new mitochondria. Indeed, they continue to divide by a kind of binary fission as our cells divide, ensuring that a double dose is available for partitioning into the two new cells that result from cell division.

To achieve these feats, mitochondria have their own DNA, their own proteins, and their own protein-making machinery. That means that they also have the potential to undergo genetic mutations that affect the sequence of the proteins their genes encode. Because most of the proteins in mitochondria are mission critical and must function exactly right, the persistence of such mutations is relatively rare. But they do happen, causing disease. One question that has arisen in the study of the causes of autism is whether or not such changes might underlie at least a portion of the cases of this developmental difference.

The high-profile Hannah Poling case

Certainly lending a high profile to this question was the case of Hannah Poling, whose mitochondrial disorder appeared to be linked to her autism symptoms and may have interacted with a bolus of vaccine doses she received, followed by a high fever. Fevers can tax our cellular powerhouses, and if mitochondrial function is already compromised, the high temperatures and extra burden may result in chronic negative outcomes.

Poling’s case brought to the forefront the question of whether or not people with autism might have mitochondrial dysfunction at greater rates. A recent study in the Journal of the American Medical Association (which steadfastly keeps its articles unavailable behind a paywall) has sought to address that question by measuring markers of mitochondrial dysfunction in children with autism and comparing these endpoints with outcomes in children without autism.

Study specifics: “Full-syndrome autism”

The autistic group in the study had what the researchers called “full syndrome autism,” which I take to mean intense symptoms of autism. They used the Autism Diagnostic Interview-Revised (ADI-R) and the Autism Diagnostic Observation Schedule (ADOS) to confirm this diagnosis and to ensure as uniform a population among their autistic group as possible. Ultimately, the study included 10 children in this group, recruited consecutively in the clinic based on their fulfillment of the selection criteria. This study was essentially case-control, meaning that the control group consisted of 10 non-autistic children, selected to match as closely as possible the demographic characteristics of the autistic group.

The authors report that while only one child among the 10 who were autistic fulfilled the definitive criteria for a mitochondrial respiratory chain disorder, the children with autism were more likely to have indicators of mitochondrial dysfunction.

A problem with pyruvate dehydrogenase (break out your Krebs notes, folks)

Specifically, six out of ten showed lowered activity for one parameter, while eight out of ten showed higher levels than controls for another metabolic endpoint, and two of ten showed higher levels than controls for a third. Overall, the results indicated low activity of a mitochondria-specific enzyme, pyruvate dehydrogenase, which is involved in one of the first steps of carbohydrate metabolism that takes place in the mitochondria. Reduced activity of an enzyme anywhere in this process changes the levels of that enzyme’s own products and of products farther down the pathway, throwing off mitochondrial function. Further, half of the autistic group exhibited higher levels of DNA replication, an indicator of cellular stress (a more frequent finding than among controls), and also had more deletions in their DNA than controls did. Statistical analysis suggested that all of these differences were significant.

What does it mean for autism?

Do these findings mean that all or most people with autism have mitochondrial dysfunction? No. The study results do not support that conclusion. Further, the authors themselves list six limitations of the study. These include the possibility that some findings of statistical significance could be in error because of sample size or confounders within the sample and that there were changes in some of the endpoints in the autistic group in both directions. In other words, some autistic children had much higher values than controls, while some had lower values, muddying the meaning of the statistics. The authors note that a study like this one does not allow anyone to draw conclusions about a cause-and-effect association between autism and mitochondria, and they urge caution with regard to generalizing the findings to a larger population.

If there is an association, further questions arise. Does mitochondrial dysfunction underlie autism, producing autistic-like symptoms, as some argued in the Hannah Poling case? Or do autistic manifestations such as anxiety or high stress or some other autism-related factor influence the mitochondria?

Chickens, eggs, MRI, mitochondria, autism

As interesting as both of these recent autism-related studies are, we still have the “Which came first?” question to deal with. Did autism cause the brain or mitochondrial differences, or did the brain or mitochondrial differences trigger the autism? Right now, these chicken-and-egg questions may not matter as much as the findings do for helping to identify autism more specifically and addressing some of its negative aspects. Regardless of your stance on neurodiversity or vaccines or acceptance or cure or the in-betweens where most of us fall, it would be difficult to argue that a mitochondrial dysfunction shouldn’t be identified and ameliorated, or that an awareness of brain structure differences won’t lead to useful information about what drives autism behaviors.

——————————————–

Note: More lay-accessible versions of this post and the previous post are available at BlogHer.

Roll over eggs…it’s time for (unrolled) tobacco leaves

Tobacco leaf infected with Tobacco Mosaic Virus. Courtesy of Clemson University - USDA Cooperative Extension Slide Series

Timeline, 2008: If you’ve ever been asked about allergy to egg products before receiving a flu vaccine, you have had a little encounter with the facts of vaccine making. The flu viruses used to produce the vaccine are painstakingly grown in chicken eggs because eggs make perfect little incubators for the bugs.

So…many…eggs

There are problems—in addition to the allergy issue—that arise with this approach. First of all, growing viruses for a million vaccine doses usually means using a million fertilized, 11-day-old eggs. For the entire population of the United States, 300 million eggs would be required. Second, the process requires months of preparation, meaning a slow turnaround time for vaccines against a fast-moving, fast-changing disease. Last, if there is anything wrong with the eggs themselves, such as contamination, the whole process is a waste and crucial vaccines are lost.

The day may come when we can forget about eggs and turn to leaves. Plants can contract viral disease just like animals do. In fact, an oft-used virus in some research fields is the tobacco mosaic virus, which, as its name implies, infects tobacco plants. It gives a patchy look to the leaves of infected plants, and researchers use this feature to determine whether the virus has taken hold.

Bitter little avatars of evil used for good?

Tobacco plants themselves, bitter little avatars of evil for their role in the health-related effects of smoking, serve a useful purpose in genetic research and have now enhanced their approval ratings for their potential in vaccine production. Plants have caught the eye of vaccine researchers for quite a while because they’re cheaper and easier to work with than animal incubators. Using plants for quick-turnaround vaccine production has been a goal, but a few problems have hindered progress.

To use a plant to make a protein to make a vaccine, researchers must first get the gene for the protein into the plant. Previous techniques involved tedious and time-consuming processes for inserting the gene into the plant genome. Then, clock ticking, there was the wait for the plant to grow and make the protein. Add in the Byzantine process of obtaining federal approval to use a genetically modified plant, and you’ve got the opposite of “rapid” on your hands.

One solution to this problem would simply be to get the gene into the plant cell cytoplasm for immediate use. It’s possible but involves meticulously injecting a solution carrying the gene sequence into each leaf. Once the gene solution is in, the plant will transcribe it—copy it into mRNA—in the cell cytoplasm and then build the desired protein based on the mRNA code. But there has been no way to scale up hand injection to the large-scale production of proteins, including proteins for vaccines.

Age-old vacuum suction = high-tech high-throughput

To solve this problem, researchers turned to one of our oldest technologies: vacuum suction. They grew tobacco plants to maturity and then clipped off the leaves, which they submerged in a solution. The solution was spiked with a nasty bug, Agrobacterium tumefaciens, a pathogen responsible for the growth of galls, or tumors, on plants. Anyone working in agriculture fears this bacterium, a known destroyer of grapes, pitted fruit trees, and nut trees. But it does have one useful feature for this kind of work: It can insert bits of its DNA into plant cells. The researchers tricked A. tumefaciens into inserting another bit of DNA instead, the code for the protein they wanted to make.

To get the solution close to the cells, the investigators had to get past air bubbles, and that’s where the vacuum came in. They placed the submerged leaves into a vacuum chamber and flipped a switch, and the activated chamber sucked all the air out of the leaves. When the vacuum was turned off, the solution flowed into the now-empty chambers of the leaf, allowing the A. tumefaciens-spiked solution to bathe the plant cells. After 4 days and a few basic protein-extraction steps, the research team had its protein batch. According to the team lead, “any protein” could be made using this process, opening up almost unlimited possibilities and applications for this approach.

Vaccines…or combating bioterrorism?

The technology has come far enough that a US company has taken steps toward manufacturing vaccines using tobacco leaves.  And it appears that the applications go beyond vaccines, as one news story has noted…the tobacco plants might also be used to produce antidotes to common agents of bioterrorism.

Your mother *is* always with you

Mother and child, microchimeras

When you’re in utero, you’re protected from the outside world, connected to it only via the placenta, which is supposed to keep you and your mother separated. Separation is generally a good thing because you are foreign to your mother, and she is foreign to you. In spite of the generally good defenses, however, a little bit of you and a little bit of her cross the barrier. Scientists have recently found that when that happens, you often end up toting a bit of mom around for decades, maybe for life.

The presence of cells from someone else in another individual is called microchimerism. A chimera in mythology was a beast consisting of the parts of many animals, including lion, goat, and snake. In genetics, a chimera carries the genes of some other individual along with its own, perhaps even the genes of another species. In microchimerism, we carry a few cells from someone else around with us. Most women who have been pregnant have not only their own cells but some cells from their offspring, as well. I’m probably carrying around cells from each of my children.

Risks and benefits of sharing

Microchimerism can be useful but also carries risks. Researchers have identified maternal cells in the hearts of infants who died from infantile lupus and determined that the babies had died from heart block, caused in part by these maternal cells, which had differentiated into excess heart muscle. On the other hand, in children with type 1 diabetes, maternal cells found in the pancreatic islets appear to be responding to damage and working to fix it.

The same good/bad outcomes exist for mothers who carry cells from their children. There has long been an association between past pregnancy and a reduced risk of breast cancer, but the reason has been unclear. Researchers studying microchimerism in women who had been pregnant found that those without breast cancer had fetal microchimerism at a rate three times that of women with the cancer.

Microchimerism and autoimmunity

Autoimmune diseases develop when the body attacks itself, and several researchers have turned to microchimerism as one mechanism for this process. One fact that led them to investigate fetal microchimerism is the heavily female bias in autoimmune illness, suggesting a female-based event, like pregnancy. On the one hand, pregnancy appears to reduce the effects of rheumatoid arthritis, an autoimmune disorder affecting the joints and connective tissues. On the other hand, women who have been pregnant are more likely to develop an autoimmune disorder of the skin and organs called scleroderma (“hard skin”) that involves excess collagen deposition. There is also a suspected association between microchimerism and pre-eclampsia, a condition in pregnancy that can lead to dangerously high blood pressure and other complications that threaten the lives of mother and baby.

Human leukocyte antigen (HLA)

The autoimmune response may be based on a similarity between mother and child in HLA, immune-related proteins encoded on chromosome 6. This similarity may play a role in the immune imbalances that lead to autoimmune diseases: possibly because the HLAs of the mother and child are so similar, the body tips out of balance with a possible HLA excess. If mother and child were more different, the mother’s immune system might simply attack and destroy fetal HLAs, but with the strong similarity, fetal HLAs may be like an unexpected guest that behaves like one of the family.

Understanding the links between microchimerism and disease is the initial step in exploiting that knowledge for therapies or preventative approaches. Researchers have already used this information to predict the development of a complication of stem cell transplantation called “graft-versus-host disease” (GVH). In stem cell transplants, female donors who have been pregnant are associated with a higher rate of GVH in recipients because those donors are microchimeric. Researchers have exploited this fact to try to predict early rejection of kidney and pancreas transplants.

(Photo courtesy of Wikimedia Commons and photographer Ferdinand Reus).

Think the eye defies evolutionary theory? Think again

The compound lens of the insect eye

Win for Darwin

When Darwin proposed his theory of evolution by natural selection, he recognized at the time that the eye might be a problem. In fact, he conceded that supposing the complex eye could have evolved through random mutations and natural selection seemed “absurd in the highest degree,” although he went on to argue that it could. Although evolution remains a fact, and natural selection remains a theory, the human eye now has some solid evolutionary precedent. A group of scientists that has established a primitive marine worm, Platynereis dumerilii, as a developmental biology model has found that it provides the key to the evolution of the human—and insect—eye.

Multiple events in eye evolution, or only one?

The divide over the eye occurred because insects have the familiar compound eye—think of how fly eyesight is depicted—while vertebrates have a single-lens eye. Additionally, insects use rhabdomeric photoreceptors, while vertebrates have a type known as ciliary photoreceptors. The rhabdomeric receptors increase surface area in the manner of our small intestine—by having finger-like extensions of the cell. The ciliary cells have a hairy appearance because of cilia that pop outward from the cell. A burning question in evolutionary biology was how these two very different kinds of eyes, with different types of photoreceptors, evolved. Were there multiple events of eye evolution, or just one?

Just once?

P. dumerilii work indicates a single evolutionary event, although the usual scientific caveats in the absence of an eyewitness still apply. This little polychaete worm, a living fossil, hasn’t changed in about 600 million years, and part of its prototypical insect brain responds to light. In this system is a complex of cells that forms three pairs of eyes and has two types of photoreceptor cells. Yep, those two types are the ciliary and the rhabdomeric. This little marine worm has both kinds of receptors, using the rhabdomeric receptors in its little eyes and the ciliary receptors in its brain. Researchers speculate that the light receptors in the brain serve to regulate the animal’s circadian rhythm.

How could the existence of these two types of receptors simultaneously lead to the evolution of two very different kinds of eyes? An ancestral form could have had duplicate copies of one or both genes present. Ultimately, if the second copy of the rhabdomeric receptor gene were recruited to an eye-like structure, evolution continued down the insect path. But, if the second copy of a ciliary cell’s photoreceiving gene were co-opted for another function, and the cells were ultimately recruited from the brain for use in the eye, then evolution marched in the vertebrate direction.

All of the above is, admittedly, speculation, although this worm’s light-sensitive molecule, or opsin, is very much like the opsin our own rods and cones make, and the molecular biology strongly indicates a relationship. The work doesn’t completely rule out multiple eye-evolution events, but it certainly provides some nice evidence for a common eye ancestor for insects and vertebrates.

Note: This work appeared in 2004 and got a detailed writeup at Pharyngula.

Is the tree of life really a ring?

A proposed ring of life

The tree of life is really a ring

When Darwin proposed his ideas about how new species arise, he produced a metaphor that we still adhere to today to explain the branching patterns of speciation: The Tree of Life. This metaphor for the way one species may branch from another through changes in allele frequencies over time is so powerful and of such long standing that many large studies of the speciation process and of life’s origins carry its name.

It may be time for a name change. In 2004, an astrobiologist and a molecular biologist from UCLA found that a ring metaphor may better describe the advent of the earliest eukaryotes. Astrobiologists study the origins of life on our planet because of the potential links between these earthly findings and life on other planets. Molecular biologists can be involved in studying the evolutionary patterns and relationships that our molecules—such as DNA or proteins—reveal. Molecular biologist James Lake and astrobiologist Mary Rivera of UCLA teamed up to examine how genomic studies might reveal some clues about the origins of eukaryotes on Earth.

Vertical transfer is so 20th century

We’ve heard of the tree of life, in which one organism begets another, passing on its genes in a vertical fashion; this vertical transfer of genes produces a tree, with each new lineage becoming a new branch. The method of gene transfer that would produce a genuine circle, or ring, is horizontal transfer, in which two organisms fuse genomes to produce a new organism. The ends of the branches in this scenario fuse together via their genomes to close the circle. It is this fusion of two genomes that may have produced the eukaryotes.

Here, have some genes

Eukaryotes are cells with true nuclei, like the cells of our bodies. The simplest eukaryotes are the single-celled variety, like yeasts. Before eukaryotes arose, single-celled organisms without nuclei—called prokaryotes—ruled the Earth. We lumped them together in a single kingdom until comparatively recently, when taxonomists broke them into two separate domains, the Archaebacteria and the Eubacteria, with the eukaryotes making up a third. Archaebacteria are prokaryotes with a penchant for difficult living conditions, such as boiling-hot water. Eubacteria include familiar representatives such as Escherichia coli.

Genomic fusion

According to the findings of Lake and Rivera, the two prokaryotic domains may have fused genomes to produce the first representatives of the Eukarya domain. By applying complex algorithms to the genomic relationships among 30 organisms—hailing from each of the three domains—Lake and Rivera produced various family “trees” of life on Earth, and found that the “trees” with the highest cumulative probabilities of having actually occurred really joined in a ring, or a fusion of two prokaryotic branches to form the eukaryotes. If we did that, the equivalent would be something like walking up to a grizzly bear and handing over some of your genes for it to incorporate. Being eukaryotes, that’s not something we do.

Our bacterial parentage: the union of Archaea and Eubacteria

Although not everyone buys into the “ring of life” concept, these findings help resolve some confusion over the origins of eukaryotes. When we first began analyzing the relationship of nucleated cells to prokaryotes, we identified a number of genes—what we call “informational” genes—that seemed to be directly inherited from the Archaea branch of the Tree of Life. Informational genes are involved in processes like transcription and translation, and indeed, recent “ring of life” research suggests a greater role for Archaea. But we also found that many eukaryotic genes traced back to the Eubacteria domain, and that these genes were more operational in nature, being involved in cell metabolism or lipid synthesis.

Applying the tree metaphor did not help resolve this confusion. If eukaryotes vertically inherited these genes from their prokaryotic ancestors, we would expect to see only genes representative of one domain or the other in eukaryotes. But we see both domains represented in the genes, and the best explanation is that organisms from each domain fused entire genomes—horizontally transferring genes—to produce a brand new organism, the progenitor of all eukaryotes: yeasts, trees, giraffes, killer whales, mice, … and us.

How the genetic code became degenerate

Our genetic code consists of 64 different combinations of four RNA nucleotides—adenine, guanine, cytosine, and uracil. These four molecules can be arranged in groups of three in 64 different ways; mathematically, that is 4 x 4 x 4 = 64 possible combinations.
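As a quick check of that arithmetic, here is a minimal Python sketch (mine, not part of the original post) that enumerates every ordered triplet of the four RNA bases and counts them:

from itertools import product

BASES = "ACGU"  # the four RNA nucleotides

# Every ordered triplet of the four bases: 4 x 4 x 4 = 64 codons
codons = ["".join(triplet) for triplet in product(BASES, repeat=3)]

print(len(codons))   # 64
print(codons[:5])    # ['AAA', 'AAC', 'AAG', 'AAU', 'ACA']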

Shorthand for the language of proteins

This code is cellular shorthand for the language of proteins. A group of three nucleotides—called a codon—is a code word for an amino acid. A protein is, at its simplest level, a string of amino acids, which are its building blocks. So a string of codons provides the language that the cell can “read” to build a protein. When the code is copied from the DNA, the process is called transcription, and the resulting string of nucleotides is messenger RNA. This messenger takes the code from the nucleus to the cytoplasm in eukaryotes, where it is decoded in a process called translation. During translation, the code is “read,” and amino acids assembled in the sequence the code indicates.

The puzzling degeneracy of genetics

So given that there are 64 possible triplet combinations for these codons, you might think that there are 64 amino acids, one per codon. But that’s not the case. Instead, our code is “degenerate”: in some cases, more than one triplet of nucleotides provides a code word for an amino acid. Thus, these redundant codons are all synonyms for the same protein building block. For example, six different codons indicate the amino acid leucine: UUA, UUG, CUA, CUG, CUC, and CUU. When any one of these codons turns up in the message, the cellular protein-building machinery inserts a leucine into the growing amino acid chain.
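To make the degeneracy concrete, here is a small Python sketch (my illustration, using only the leucine codons listed above) showing that any of the six synonymous codons decodes to the same building block:

# Partial codon table: just the six synonymous codons for leucine (Leu)
LEUCINE_CODONS = {"UUA", "UUG", "CUA", "CUG", "CUC", "CUU"}

def translate_codon(codon):
    # Toy lookup: returns 'Leu' for any leucine codon, '?' otherwise
    return "Leu" if codon in LEUCINE_CODONS else "?"

print([translate_codon(c) for c in ["UUA", "CUU", "CUG"]])  # ['Leu', 'Leu', 'Leu']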

This degeneracy of the genetic code has puzzled biologists since the code was cracked. Why would Nature produce redundancies like this? One suggestion is that Nature did not use a triplet code originally, but a doublet code. Francis Crick, of double-helix fame, posited that a two-letter code probably preceded the three-letter code. But he did not devise a theory to explain how Nature made the universal shift from two to three letters.

A two-letter code?

There are some intriguing bits of evidence for a two-letter code. One of the players in translation is transfer RNA (tRNA), a special sequence of nucleotides that carries triplet codes complementary to those in the messenger RNA. In addition to this complementary triplet, called an anticodon, each tRNA also carries a single amino acid that matches the codon it complements. Thus, when a codon for leucine—UUA for example—is “read” during translation, a tRNA with the anticodon AAU will donate the leucine it carries to the growing amino acid chain.

Aminoacyl tRNA synthetases are enzymes that link an amino acid with the tRNA bearing the appropriate anticodon. Each type of tRNA has its specific synthetase, and some of these synthetases use only the first two nucleotide bases of the anticodon to decide which amino acid to attach. If you look at the code words for leucine, for example, you’ll see that four of them begin with “CU.” The only difference among these four is the third position in the codon—A, U, G, or C. Thus, for these codons, the synthetases need to rely only on the doublets to be correct.
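A small sketch of that doublet idea (again my illustration, not from the paper): group the leucine codons by their first two bases, and the CU family is fully determined by its prefix, whatever the third base is.

from collections import defaultdict

leucine_codons = ["UUA", "UUG", "CUA", "CUG", "CUC", "CUU"]

# Group codons by their first two bases (the "prefix" doublet)
by_prefix = defaultdict(list)
for codon in leucine_codons:
    by_prefix[codon[:2]].append(codon)

print(dict(by_prefix))
# {'UU': ['UUA', 'UUG'], 'CU': ['CUA', 'CUG', 'CUC', 'CUU']}
# For the CU family, the prefix alone pins down the amino acid; a synthetase
# reading only those two bases would still attach leucine correctly.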

Math and doublets

Scientists at Harvard believe that they have solved the evolutionary mystery of how the triplet form arose from the doublet. They suggest that the doublet code was actually read in groups of three bases, with only the first two (“prefix”) or the last two (“suffix”) bases of each group actually being read. Using mathematical modeling, these researchers have shown that all but two amino acids can be coded for using two, four, or six doublet codons.

Too hot in the early Earth kitchen for some

The two exceptions are glutamine and asparagine, which at high temperatures break down into the amino acids glutamic acid and aspartic acid. The inability of glutamine and asparagine to retain their structure in hot environments suggests that in the early days of life on Earth, when doublet codes were in use, the primordial soup must have been too hot for stable synthesis of these heat-intolerant, triplet-coded amino acids.

Think you're eating snapper? Think again

Grad students learn PCR, uncover fish fraud

It’s a great thing if you get your name published in the journal Nature, the pinnacle of publishing achievement for a biologist, while you’re still in school. Such was the fortune of six graduate students participating in a course designed to teach them DNA extraction, amplification, and sequencing. They identified a real question to answer in the course of applying their techniques, and their results earned them a brief communication in Nature and national recognition. Not bad; I hope everyone also earned an “A.”

The group, led by professors Peter Marko and Amy Moran at the University of North Carolina-Chapel Hill, suspected that fish being sold as red snapper in markets in the U.S. were actually mislabeled, in violation of federal law. This kind of fraud is nothing new; marketers have in the past created “scallops” by cutting scallop-shaped chunks from the wings of skates (part of the cartilaginous fish group), and have labeled the Patagonian toothfish as Chilean sea bass.

Protections can drive fraud

Such mislabeling has far-reaching implications, well beyond concerns about defrauding consumers of the fish they want. If fisheries and fish dealers are reporting their catches as red snapper or scallops or sea bass when they are, in fact, other marine species, then data on the abundance and distribution of all of these species will be misleading. Red snapper, Lutjanus campechanus, was placed under strict management in 1996, a move that gave incentive to the fishing industry and retailers to mislabel fish. Some experts suspect that many fish under heavy restriction end up with their names on a different species for market.

Who is responsible for the mislabeling? Fishermen pull in their catches and identify them on the boat or at the dock. The catch goes to a fish dealer, who is also responsible for reporting what species and how many of each species were caught. This report becomes the official number for the species. The dealer then sends the fish on to the retail market, where it is sold in stores and restaurants. Misidentification on the boat or dock is one reasonable possibility because some of the species identified in the North Carolina study frequent the same types of habitat, primarily offshore waters around coral reefs. These species, which include vermillion snapper and silk snapper, do look very much like red snapper, although there are some identifiable morphological differences.

One filet is just like the other?

So misidentification could be an honest mistake or purposeful change at the boat or dock, or it could be a willful relabeling at the restaurant or market. By the time a fish is processed, it consists essentially of a filet that is indistinguishable from that of other, similar fish. Hapless consumers end up paying twice as much for silk snapper, thinking they’re getting the pricier red snapper, instead.

But the DNA sequencing the North Carolina group performed not only turned up species closely related and very similar to red snapper, but also uncovered some sequences that match no known species in gene databanks. In other words, fish of unknown identity are being caught, sold, and eaten as red snapper before we even have a chance to document what they are, their habitats, or their numbers.
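For a sense of how this kind of sequence comparison works in principle, here is a minimal, hypothetical Python sketch (the sequences, names, and numbers are invented for illustration; the real work compares sequences against curated databases): score a sample against a few reference sequences by percent identity and report the best match.

def percent_identity(a, b):
    # Position-by-position identity for two aligned, equal-length sequences
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return 100.0 * matches / len(a)

# Hypothetical reference sequences standing in for database entries
references = {
    "red snapper":        "ATGCGTACGTTAGC",
    "vermillion snapper": "ATGCGAACGTTAGC",
    "silk snapper":       "ATGCGAACGTTGGC",
}

sample = "ATGCGAACGTTGGC"  # invented sequence from a market "red snapper" filet

for name, seq in references.items():
    print(f"{name}: {percent_identity(sample, seq):.1f}% identity")

best = max(references, key=lambda name: percent_identity(sample, references[name]))
print("Best match:", best)  # here: silk snapper, not red snapper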

Mislabeling is rampant

The grad students and professors also found that some of the fish being marketed as Atlantic red snapper were, in a few cases, from the other side of the planet, including the crimson snapper, which occurs in the Indo-West Pacific. All told, they found that 77% of the fish samples from stores in the eastern and midwestern U.S. were mislabeled as red snapper.

One way to prevent such mislabeling is to require identification of the country of origin of fish sold at market. The USDA has instituted such a program, although confusion will likely persist about fish caught in international waters. And the mislabeling isn’t only a U.S. phenomenon.

In the meantime, how do you know you’re getting red snapper? Some fish ecologists recommend avoiding it entirely because it still suffers from overfishing; however, one way to know your fish is to ask for it with the skin on, or completely intact. If you’ve got a smart phone, you can just look up the image and compare. Alternatively, you could just order the salad.

The piggish origins of civilization

Follow the pig

Researchers interested in tracing the path of human civilization from its birthplace in the Fertile Crescent to the rest of the world need only follow the path of the pig.

Pig toting

Until this research was reported, the consensus was that pigs had fallen under our magical domestication powers only twice, about 9,000 years ago: once in what is called the Near East (Turkey), and a second time in what is called the Far East (China). Morphological and some genetic evidence seemed to point to these two events only. That led human geographers to conclude that humans must have toted domesticated pigs around from the Far or Near East to other parts of the world like Europe or Africa, rather than domesticating the wild boars they encountered in every new locale.

Occam’s Razor violation

As it turns out, those ideas—which enjoyed the support even of Charles Darwin—were wrong. And they provide a nice example of a violation of Occam’s Razor, the rule that scientists should select the explanation that requires the fewest assumptions. In the case of the pig, two domestication events definitely required fewer assumptions than the many that we now believe to have occurred.

Research published in the journal Science in 2005 has identified at least seven occurrences of the domestication of wild boars. Two events occurred in Turkey and China, as previously thought, but the other five events took place in Italy, Central Europe, India, southeast Asia, and on islands off of southeast Asia, like Indonesia. Apparently, people arrived in these areas, corralled some wild boars, and ultimately domesticated them, establishing genetic lines that we have now traced to today.

As usual, molecular biology overrules everything else

The scientists uncovered the pig domestication pattern using modern molecular biology tools. They relied on a genetic tool known as the mitochondrial clock. Mitochondria have their own DNA, which they use as the code for their own, specialized mitochondrial proteins. Because mitochondria are essential to cell function and survival, selection pressure against changes in their coding sequences is strong, so such changes rarely persist. For this reason, the changes that do persist are usually random changes in noncoding regions, and they accumulate slowly and at a fairly predictable rate over time. This rate of accumulation is the mitochondrial clock, which we use to tick off the amount of time that has passed between mutations.
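As a rough illustration of how such a clock is read (the numbers below are invented, purely to show the arithmetic), you estimate divergence time by dividing the observed fraction of differing sites by the assumed rate at which differences accumulate:

# Hypothetical values, for illustration only -- not the study's data
differences = 12               # differing positions between two mitochondrial sequences
sites = 1000                   # length of the compared region, in bases
rate_per_site_per_myr = 0.02   # assumed combined substitution rate (per site per million years)

# Time since the two lineages shared an ancestor
divergence_myr = (differences / sites) / rate_per_site_per_myr
print(f"Estimated divergence: {divergence_myr:.2f} million years")  # 0.60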

Tick-tock, mitochondrial clock

Very closely related individuals will have almost identical mitochondrial sequences; for example, the mitochondria you have are probably identical in sequence to the mitochondria your mother has. You inherited those mitochondria only from your mother, whose egg provided these essential organelles to the zygote that ultimately became you. Were someone to sample the mitochondria from one of your relatives thousands of years from now, they would probably find only a few changes, but if they compared this sample to one from someone unrelated to you, they would find different changes and a different number of changes, indicating less of a relationship.

That’s how the researchers figured out the mystery of the pigs. They sampled wild boars from each of the areas and sampled domestic pigs from the same locales. After comparing the mitochondrial DNA sequences among these groups, they found that pigs in Italy had sequences very like those of wild boars in Italy, while pigs in India had sequences very like those of wild boars there.

Approachable teenage-like pigs

How did we domesticate the pigs? Researchers speculate that adult boars (males) who still behaved like teenagers were most likely to approach human settlements to forage. They were also less aggressive than males who behaved like full adults, and thus, easier to domesticate. They fed better on human food scraps than did their more-mature—and more-skittish—brethren, and enjoyed better survival and more opportunities to pass on their juvenile characteristics, which also included shorter snouts, smaller tusks, and squealing, to their offspring. Domestication was just the next step.

Magnetic fields and the Q

Sorry, not for Trekkies. This Q is chemical.

People have been concerned for years about magnetic fields having adverse health effects, while others have peddled magnets as beneficial to health. But although scientists have repeatedly demonstrated chemical responses to magnetic fields, no one had ever shown a magnetic field directly affecting an organism.

The earth has a weak magnetic field, generated by electric currents arising from the movement of molten iron in its core. Nature presents plenty of examples of animals that appear to use magnetic fields. Some bacteria can detect the fields and use them for movement. Birds appear to use magnetic fields to navigate, and researchers have shown that attaching magnets to birds interferes with their ability to navigate. Honey bees become confused in their dances when the earth’s magnetic fields fluctuate, and even amphibians appear to use magnetism for navigation. But no one has clearly demonstrated the mechanism by which animals sense and use magnetic fields.

Do pigeons use a compass?

Some research points to birds using tiny magnetic particles in their beaks to fly the right way. But these particles don’t tell the birds which way is north; they simply help the bird create a topographical map in its head of the earth over which it flies. The magnetic particles tell a pigeon there’s a mountain below, but not that the mountain is to the north. The conundrum has been to figure out how the pigeon knows which way is north in the absence of other pointers, such as constellations.

The answer to the conundrum may lie with bacteria. Scientists in the UK have used the purple bacterium Rhodobacter sphaeroides to examine what magnetic fields do at the molecular level. These bacteria are photosynthetic, absorbing light and converting it to energy in much the same way plants do. The absorbed light triggers a set of reactions that carry energy via electrons to the reaction center, where a pigment traps it. Under normal conditions, as the pigment traps the energy, it also almost instantaneously converts it to a stable, safe form, but sometimes the energy can instead produce an excited molecule that is biologically dangerous. As it turns out, these reactive molecules may be sensitive to magnetic fields.

A radical pair…or dangerous triplet

A chemical mechanism called the “Radical Pair Mechanism” is the route by which the potentially dangerous molecules can form. In this mechanism, light excitation can leave two molecules each carrying an unpaired, excited electron; if the two excited molecules come together, they can form what is called a “radical pair” in a singlet state. Under normal conditions, this pairing does not happen; in the photosynthetic bacterium, for example, a compound called a quinone (Q) inhibits formation of this pair or of an equally damaging triplet state.

But when a Q is not present, the singlet or triplet state results. If the triplet forms, it can interact with oxygen to produce a highly reactive, biologically damaging singlet molecule that we know as a “radical.” You have probably heard of radicals in the context of antioxidants—they are the molecules that antioxidants soak up to prevent their causing harm. You may also have heard of carotenoids, pigments that act as antioxidants. In a normal photosynthetic bacterium, the carotenoids present serve as the Q, the compound that prevents formation of the damaging radical.

A helpful effect of magnetic fields?

Where do magnetic fields come in? Previous work indicated an influence of magnetic fields on triplet formation, and thus, on radical formation. One excellent model to test the effects of fields in a biological system is to remove the Q, the molecular sponge for the triplets, and then apply magnetic fields to see whether triplets—and radicals—form.

That’s exactly what the researchers did, using a mutated form of R. sphaeroides that did not make carotenoids—the Q. The result? The stronger the field, the less radical product was made. They have demonstrated a magnetic field effect in an organism for the first time, and the effect was helpful, not damaging. Their next step, which they are working on, is examining whether or not the bacteria grow better in the presence of the fields.

Platypus spur you? Grab a scorpion

The most painful egg-laying mammal: the platypus

The duckbill platypus is an impossible-looking, risible creature that we don’t typically associate with horrific pain. In fact, besides its odd looks, its greatest claim to fame is that it’s a mammal that lays eggs. But that’s just because you’re not paying close enough attention. On the hind legs of the male platypus are two spurs that inject a venom so painful, the recipient human writhes for weeks after the encounter. In spite of the fact that platypuses (platypi?) and humans don’t hang out together much, platypus venom contains a specific peptide–a short protein strand–that can directly bind to receptors on our nerve cells that then send signals of screeching pain to our brains. Ouch.

Hurting? Reach for a scorpion

If you’ve ever experienced platypus-level pain and taken pain killers for it, you know that they have…well…side effects. It’s because they affect more than the pain pathways of the body. The search for pharmaceuticals that target only the pain pathway–and, unlike platypus venom, inhibit it–forms a large part of the “rational design” approach to drug development. In other words, you rationally try to design things that target only the pathway of interest. In this case, researchers reached for the scorpion.

Their decision has precedent. In ancient Chinese medical practice, scorpion venom has been used as a pain reliever, or analgesic. But as developed as the culture was, the ancient Chinese didn’t have modern protein analysis techniques to identify the very proteins that bind only to the pain receptors and inhibit their activity. Now, a team from Israel is doing exactly that: teasing apart the various proteins in scorpion venom and testing their ability to bind pain receptors in human nerve cells.

The next step? Mimicry

With proteins in-hand, the next step will be to create a synthetic mimic that influences only the receptors of interest. It’s a brave new world out there, one where we wrestle proteins from scorpion venom and then make copycat molecules to ease our pain.

For your consideration

Why do you think the platypus makes proteins in its venom that human pain receptors can recognize if humans generally haven’t targeted platypuses (platypi?) as prey over its evolution?

In the human body, a receptor may be able to bind each of two closely related molecules–as a hormone receptor does with closely related hormones–but one of the molecules activates the receptor, while the other molecule inhibits it. Taking this as a starting point, why do you think some proteins in scorpion venom–which often causes intense pain–have the potential effect of alleviating pain?
