Mitochondrial dysfunction and autism

Beautiful pic of mitochondria

Mitochondria are the powerhouses of the cell, as the biology teachers will tell you. These organelles are also likely former bacteria, once independently living cells capable of dividing on their own to make new mitochondria. Indeed, they continue to divide by a kind of binary fission as our cells divide, ensuring that a double dose is available for partitioning into the two new cells that result from cell division.

To achieve these feats, mitochondria have their own DNA, their own proteins, and their own protein-making machinery. That means they also have the potential to undergo genetic mutations that change the sequence of the proteins their genes encode. Because most mitochondrial proteins are mission-critical and must function exactly right, the persistence of such mutations is relatively rare. But they do happen, and they cause disease. One question that has arisen in the study of the causes of autism is whether such changes might underlie at least a portion of the cases of this developmental difference.

The high-profile Hannah Poling case

Certainly lending a high profile to this question was the case of Hannah Poling, whose mitochondrial disorder appeared to be linked to her autism symptoms and may have interacted with a bolus of vaccine doses she received, followed by a high fever. Fevers can tax our cellular powerhouses, and if mitochondrial function is already compromised, the high temperatures and extra burden may result in chronic negative outcomes.

Poling’s case brought to the forefront the question of whether people with autism might have mitochondrial dysfunction at greater rates. A recent study in the Journal of the American Medical Association (which steadfastly keeps its articles behind a paywall) sought to address that question by measuring markers of mitochondrial dysfunction in children with autism and comparing these endpoints with outcomes in children without autism.

Study specifics: “Full-syndrome autism”

The autistic group in the study had what the researchers called “full syndrome autism,” which I take to mean intense symptoms of autism. They used the Autism Diagnostic Interview-Revised (ADI-R) and the Autism Diagnostic Observation Schedule (ADOS) to confirm this diagnosis and to ensure as uniform a population among their autistic group as possible. Ultimately, the study included 10 children in this group, recruited consecutively in the clinic based on their fulfillment of the selection criteria. The study was essentially case-control, meaning that the control group consisted of 10 non-autistic children, selected to match the demographic characteristics of the autistic group as closely as possible.

The authors report that while only one child among the 10 who were autistic fulfilled the definitive criteria for a mitochondrial respiratory chain disorder, the children with autism were more likely to have indicators of mitochondrial dysfunction.

A problem with pyruvate dehydrogenase (break out your Krebs notes, folks)

Specifically, six of the ten children with autism showed lowered activity for one parameter, eight of ten showed higher levels than controls for another metabolic endpoint, and two of ten showed higher levels than controls for a third. Overall, the results indicated low activity of a mitochondria-specific enzyme, pyruvate dehydrogenase, which acts in one of the first steps of the carbohydrate metabolism that takes place in the mitochondria. Reduced activity of an enzyme anywhere in this process changes the levels of that enzyme’s own products and of products further down the pathway, throwing off mitochondrial function. Further, half of the autistic group exhibited higher levels of mitochondrial DNA replication, an indicator of cellular stress, more often than controls, and they also had more deletions in their mitochondrial DNA than controls. Statistical analysis suggested that all of these differences were significant.
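
To see why such small-sample comparisons call for careful statistics, here is a minimal sketch in Python of the kind of test that can be applied to a 10-versus-10 case-control comparison. The counts are hypothetical, chosen only to mirror a “six of ten” finding; this is not the study’s actual analysis.

```python
# Hypothetical 2x2 table for one endpoint: how many children in each group
# showed reduced enzyme activity. These counts are illustrative only.
from scipy.stats import fisher_exact

table = [[6, 4],   # autism group: 6 with the finding, 4 without
         [1, 9]]   # control group: 1 with the finding, 9 without

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")

# With only 10 children per group, even a large apparent difference can sit
# close to the conventional 0.05 threshold, which is one reason the authors
# urge caution about generalizing their findings.
```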

What does it mean for autism?

Do these findings mean that all or most people with autism have mitochondrial dysfunction? No. The study results do not support that conclusion. Further, the authors themselves list six limitations of the study. These include the possibility that some findings of statistical significance could be in error because of the small sample size or confounders within the sample, and the fact that some endpoints changed in both directions within the autistic group. In other words, some autistic children had much higher values than controls while others had lower values, muddying the meaning of the statistics. The authors note that a study like this one does not allow anyone to draw conclusions about a cause-and-effect relationship between autism and mitochondrial dysfunction, and they urge caution about generalizing the findings to a larger population.

If there is an association, further questions arise. Does mitochondrial dysfunction underlie autism, producing autistic-like symptoms, as some argued in the Hannah Poling case? Or do manifestations of autism, such as anxiety or high stress, or some other autism-related factor influence the mitochondria?

Chickens, eggs, MRI, mitochondria, autism

As interesting as both of these recent autism-related studies are, we still have the “which came first” question to deal with. Did autism cause the brain or mitochondrial differences, or did the brain or mitochondrial differences trigger the autism? Right now, these chicken-and-egg questions may not matter as much as the findings do for helping to identify autism more specifically and addressing some of its negative aspects. Regardless of your stance on neurodiversity or vaccines or acceptance or cure or the in-betweens where most of us fall, it would be difficult to argue that a mitochondrial dysfunction shouldn’t be identified and ameliorated, or that an awareness of brain structure differences won’t lead to useful information about what drives autism behaviors.

——————————————–

Note: More lay-accessible versions of this post and the previous post are available at BlogHer.

Did humans and their fire kill off Australia’s megafauna?

Genyornis. Courtesy of Michael Ströck & Wikimedia Commons.

Timeline, 2005: For those of us who do not live in Australia (and live instead in, say, boring old Texas), the animals that live on that continent can seem like some of the most exotic species in the world. The kangaroo, wombat, and Tasmanian devil, and most of all, the platypus, are high on the list of the unusual and bizarre in the animal kingdom.

But modern-day Australia has nothing on the Australia of 50,000 years ago when humans first arrived from Java. They encountered huge kangaroos, marsupial lions, 25-foot lizards, and tortoises the size of a subcompact car. Yet, within 5000 years, many of these animals had disappeared permanently. And since the dawn of the study of paleontology, researchers have wondered why.

Of course, it’s our fault

Of course, humans feature as the culprits in most scenarios. Just as the first people in the Americas are usually blamed at least in part for the disappearance of the American megafauna, like mammoths or giant sloths, the first people in Australia have also been suspected of hunting these animals to extinction or exposing them to diseases that decimated the populations.

As it turns out, humans may be to blame, but not through direct destruction or disease transmission. Instead, it may be the mastery of fire, a turning point in our cultural history, that led to the extinction of many species larger than 100 pounds on the Australian continent.

Fire!

Australia’s first people probably set huge fires to signal to one another, flush animals for hunting, clear paths through what was once a mosaic of trees, shrubs, and grasses, or encourage the growth of specific plants. The byproduct of all of this burning was catastrophic for the larger species on the continent.

The fires, according to one study, wiped out the drought-adapted plants that covered the continent’s interior, leaving behind a desert of scrub brush. The change in plant cover may have resulted in a decrease in water vapor exchange between the earth and the atmosphere with the ultimate effect of ending the yearly monsoon rains that would quench the region. Without the rains, only the hardiest, desert-ready plants survived.

You are what you eat…or ate

How could researchers possibly have elucidated these events of 45,000 years ago? By looking at fossilized bird eggs and wombat teeth. Using isotopic techniques, they assessed the types of carbon present in bird eggs and wombat teeth dating from 150,000 to 45,000 years ago. These animals genuinely were what they ate, in some ways, with isotopic markers of their diet accumulating in these tissues. Because different plants take up carbon isotopes in different proportions, the researchers could link the carbon isotopes they found in the egg and tooth fossils to the diets of these animals.
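
As a rough sketch of the logic behind this kind of inference, here is a standard two-endmember carbon-isotope mixing calculation in Python. The δ13C endmember values are typical textbook figures for C3 trees and shrubs versus C4 grasses, assumed here for illustration rather than taken from the study, and diet-to-tissue offsets are ignored.

```python
# Two-endmember mixing model: estimate what fraction of an animal's diet came
# from C4 grasses, given the delta-13C measured in eggshell or tooth material.
# Endmember values below are typical approximations, not the study's numbers.
DELTA_C3 = -26.5   # per mil, typical of C3 trees and shrubs
DELTA_C4 = -12.5   # per mil, typical of C4 grasses

def fraction_c4(delta_sample: float) -> float:
    """Fraction of diet from C4 grasses implied by a measured delta-13C value."""
    frac = (delta_sample - DELTA_C3) / (DELTA_C4 - DELTA_C3)
    return min(max(frac, 0.0), 1.0)   # clamp to the physically meaningful range

# Hypothetical eggshell measurements: one grass-heavy diet, one shrub-heavy diet.
print(round(fraction_c4(-16.0), 2))   # 0.75 -> mostly C4 grasses
print(round(fraction_c4(-24.0), 2))   # 0.18 -> mostly C3 trees and shrubs
```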

They found that the diet of a now-extinct species of bird, the Genyornis, consisted of the nutritious grasses of the pre-human Australian landscape. Emu eggs from before 50,000 years ago pointed to a similar diet, but eggs from 45,000 years ago indicated a shift in emu diet from nutritious grasses to the desert trees and shrubs of the current Australian interior. The vegetarian wombats also appear to have made a similar change in their diets around the same time.

Or, maybe not

And the species that are still here today, like the emu and the wombat, are the species whose dietary needs were general enough to make the shift. The Genyornis went the way of the mammoth, possibly because its needs were too specialized for it to shift easily to a different diet: its eggshells showed no change in diet over the time period.

The researchers analyzed 1500 fossilized eggshell specimens from Genyornis and emu to solve this mystery and to pinpoint human burning practices as the culprit in the disappearance of these megafauna over a few brief thousand years. Today’s Aboriginal Australians still use burning in traditional practices, but by now the ecosystems have had thousands of years to adapt to the burns, so we don’t expect to see further dramatic disappearances of Australian fauna as a result. Indeed, some later researchers have taken issue with the idea that fire drove these changes in the first place, with some blaming hunting again; as with many things paleontological, the precise facts of the situation remain…lost in the smoky haze of deep history.

Roll over eggs…it’s time for (unrolled) tobacco leaves

Tobacco leaf infected with Tobacco Mosaic Virus. Courtesy of Clemson University - USDA Cooperative Extension Slide Series

Timeline, 2008: If you’ve ever been asked about egg allergies before receiving a flu vaccine, you have had a little encounter with the facts of vaccine making. The flu viruses used to produce the vaccine are painstakingly grown in chicken eggs, because eggs make perfect little incubators for the bugs.

So…many…eggs

There are problems—in addition to the allergy issue—that arise with this approach. First of all, growing viruses for a million vaccine doses usually means using a million fertilized, 11-day-old eggs. For the entire population of the United States, 300 million eggs would be required. Second, the process requires months of preparation, meaning a slow turnaround time for vaccines against a fast-moving, fast-changing disease. Last, if there is anything wrong with the eggs themselves, such as contamination, the whole process is a waste and crucial vaccines are lost.

The day may come when we can forget about eggs and turn to leaves. Plants can contract viral disease just like animals do. In fact, an oft-used virus in some research fields is the tobacco mosaic virus, which, as its name implies, infects tobacco plants. It gives a patchy look to the leaves of infected plants, and researchers use this feature to determine whether the virus has taken hold.

Bitter little avatars of evil used for good?

Tobacco plants themselves, bitter little avatars of evil for their role in the health-related effects of smoking, serve a useful purpose in genetic research and have now enhanced their approval ratings for their potential in vaccine production. Plants have caught the eye of vaccine researchers for quite a while because they’re cheaper and easier to work with than animal incubators. Using plants for quick-turnaround vaccine production has been a goal, but a few problems have hindered progress.

To use a plant to make a protein to make a vaccine, researchers must first get the gene for the protein into the plant. Previous techniques involved tedious and time-consuming processes for inserting the gene into the plant genome. Then, clock ticking, there was the wait for the plant to grow and make the protein. Add in the Byzantine process of obtaining federal approval to use a genetically modified plant, and you’ve got the opposite of “rapid” on your hands.

One solution to this problem would be simply to get the gene into the plant cell for immediate use. It’s possible, but it involves meticulously injecting a solution carrying the gene sequence into each leaf. Once the gene solution is in, the plant will transcribe the gene—copy it into mRNA—and then build the desired protein in the cell cytoplasm based on the mRNA code. But there has been no way to scale hand injection up to large-scale production of proteins, including proteins for vaccines.

Age-old vacuum suction = high-tech high-throughput

To solve this problem, researchers turned to one of our oldest technologies: vacuum suction. They grew tobacco plants to maturity and then clipped off the leaves, which they submerged in a solution. The solution was spiked with a nasty bug, Agrobacterium tumefaciens, a pathogen responsible for the growth of galls, or tumors, on plants. Anyone working in agriculture fears this bacterium, a known destroyer of grapes, pitted fruit trees, and nut trees. But it does have one useful feature for this kind of work: It can insert bits of its DNA into plant cells. The researchers tricked A. tumefaciens into inserting another bit of DNA instead, the code for the protein they wanted to make.

To get the solution close to the cells, the investigators had to get past the air that fills the internal spaces of a leaf, and that’s where the vacuum came in. They placed the submerged leaves into a vacuum chamber and flipped a switch, and the activated chamber sucked the air out of the leaves. When the vacuum was turned off, the A. tumefaciens-spiked solution flowed into the now-empty spaces of the leaf and bathed the plant cells. After four days and a few basic protein-extraction steps, the research team had its protein batch. According to the team lead, “any protein” could be made using this process, opening up almost unlimited applications for the approach.

Vaccines…or combating bioterrorism?

The technology has come far enough that a US company has taken steps toward manufacturing vaccines using tobacco leaves. And the applications appear to go beyond vaccines: as one news story has noted, the tobacco plants might also be used to produce antidotes to common agents of bioterrorism.

Is the tree of life really a ring?

A proposed ring of life

When Darwin proposed his ideas about how new species arise, he produced a metaphor that we still adhere to today to explain the branching patterns of speciation: The Tree of Life. This metaphor for the way one species may branch from another through changes in allele frequencies over time is so powerful and of such long standing that many large studies of the speciation process and of life’s origins carry its name.

It may be time for a name change. In 2004, an astrobiologist and a molecular biologist from UCLA found that a ring metaphor may better describe the advent of the earliest eukaryotes. Astrobiologists study the origins of life on our planet because of the potential links between these earthly findings and life on other planets. Molecular biologists can study the evolutionary patterns and relationships that our molecules—such as DNA or proteins—reveal. Molecular biologist James Lake and astrobiologist Mary Rivera of UCLA teamed up to examine how genomic studies might reveal some clues about the origins of eukaryotes on Earth.

Vertical transfer is so 20th century

We’ve heard of the tree of life, in which one organism begets another, passing on its genes in a vertical fashion; this vertical transfer of genes produces a tree, with each new lineage becoming a new branch. The method of gene transfer that would produce a genuine circle, or ring, is horizontal transfer, in which two organisms fuse genomes to produce a new organism. In this scenario, the ends of two branches fuse via their genomes to close the circle. It is this fusion of two genomes that may have produced the eukaryotes.

Here, have some genes

Eukaryotes are cells with true nuclei, like the cells of our bodies. The simplest eukaryotes are the single-celled variety, like yeasts. Before eukaryotes arose, single-celled organisms without nuclei—called prokaryotes—ruled the Earth. We lumped them together in a single kingdom until comparatively recently, when taxonomists broke them into two separate domains, the Archaebacteria and the Eubacteria, with the eukaryotes making up a third. Archaebacteria are prokaryotes with a penchant for difficult living conditions, such as boiling-hot water. Eubacteria include familiar representatives such as Escherichia coli.

Genomic fusion

According to the findings of Lake and Rivera, the two prokaryotic domains may have fused genomes to produce the first representatives of the Eukarya domain. Using complex algorithms to analyze genomic relationships among 30 organisms—hailing from all three domains—Lake and Rivera produced various family “trees” of life on Earth and found that the “trees” with the highest cumulative probabilities of having actually occurred joined in a ring: a fusion of two prokaryotic branches to form the eukaryotes. If we eukaryotes did that, it would be something like walking up to a grizzly bear and handing over some of your genes for it to incorporate. Being eukaryotes, that’s not something we do.

Our bacterial parentage: the union of Archaea and Eubacteria

Although not everyone buys into the “ring of life” concept, Lake and Rivera’s findings help resolve some confusion over the origins of eukaryotes. When we first began analyzing the relationship of nucleated cells to prokaryotes, we identified a number of genes—so-called “informational” genes—that seemed to be directly inherited from the Archaea branch of the Tree of Life. Informational genes are involved in processes like transcription and translation, and indeed, recent “ring of life” research suggests a greater role for the Archaea. But we also found that many eukaryotic genes traced back to the Eubacteria domain, and that these genes were more operational in nature, being involved in cell metabolism or lipid synthesis.

Applying the tree metaphor did not help resolve this confusion. If eukaryotes vertically inherited these genes from their prokaryotic ancestors, we would expect to see only genes representative of one domain or the other in eukaryotes. But we see both domains represented in the genes, and the best explanation is that organisms from each domain fused entire genomes—horizontally transferring genes—to produce a brand new organism, the progenitor of all eukaryotes: yeasts, trees, giraffes, killer whales, mice, … and us.
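
To make the “both parents” signature concrete, here is a toy Python tally of the pattern described above. The gene assignments are entirely hypothetical and stand in for the kind of best-match data a real genomic comparison would produce.

```python
# Hypothetical eukaryotic genes, each labeled with its functional class and the
# prokaryotic domain its closest relatives come from. Values are made up.
from collections import Counter

genes = [
    ("informational", "Archaebacteria"),  # e.g., a transcription-machinery gene
    ("informational", "Archaebacteria"),
    ("informational", "Archaebacteria"),
    ("informational", "Eubacteria"),
    ("operational", "Eubacteria"),        # e.g., a lipid-synthesis gene
    ("operational", "Eubacteria"),
    ("operational", "Eubacteria"),
    ("operational", "Archaebacteria"),
]

for (category, domain), count in sorted(Counter(genes).items()):
    print(f"{category:<13} genes closest to {domain:<14}: {count}")

# A purely vertical (tree-like) origin predicts one dominant parent domain.
# The mixed pattern above is the signature that the genome-fusion, or ring,
# hypothesis was proposed to explain.
```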

Magnetic fields and the Q

Sorry, not for Trekkies. This Q is chemical.

People have worried for years that magnetic fields have adverse health effects, and some have even peddled magnets as beneficial to health. But although scientists have repeatedly demonstrated chemical responses to magnetic fields, no one had ever shown a magnetic field directly affecting an organism.

The earth has a weak magnetic field, the result of electric currents moving in its molten core. Nature presents plenty of examples of animals that appear to use magnetic fields. Some bacteria can detect the fields and use them for movement. Birds appear to use magnetic fields to navigate, and researchers have shown that attaching magnets to birds interferes with their ability to navigate. Honey bees become confused in their dances when the earth’s magnetic field fluctuates, and even amphibians appear to use magnetism for navigation. But no one has clearly demonstrated the mechanism by which animals sense and use magnetic fields.

Do pigeons use a compass?

Some research points to birds using tiny magnetic particles in their beaks to fly the right way. But these particles don’t tell the birds which way is north; they simply help the bird create a topographical map in its head of the earth over which it flies. The magnetic particles tell a pigeon there’s a mountain below, but not that the mountain is to the north. The conundrum has been to figure out how the pigeon knows which way is north in the absence of other pointers, such as constellations.

The answer to the conundrum may lie with bacteria. Scientists in the UK have used the purple bacterium Rhodobacter sphaeroides to examine what magnetic fields do at the molecular level. These bacteria are photosynthetic, absorbing light and converting it to energy much as plants do. The absorbed light triggers a set of reactions that carry energy via electrons to the reaction center, where a pigment traps it. Under normal conditions, as the pigment traps the energy, it almost instantaneously converts it to a stable, safe form, but sometimes the energy can instead produce an excited molecule that is biologically dangerous. As it turns out, the formation of these reactive molecules may be sensitive to magnetic fields.

A radical pair…or dangerous triplet

A chemical mechanism called the radical pair mechanism describes how the potentially dangerous molecules can form. In this mechanism, an excited molecule hands an electron to a partner, leaving two molecules that each carry an unpaired electron, a so-called radical pair. The spins of those two unpaired electrons can be correlated in what is called a singlet state, or they can interconvert to a triplet state. Under normal conditions, the pair does not linger; in the photosynthetic bacterium, for example, a compound called a quinone (Q) quickly accepts the electron and prevents the pair from persisting long enough to form the damaging triplet.

But when a Q is not present, the pair persists in its singlet or triplet state. If the triplet forms, it can interact with oxygen to produce a highly reactive, biologically damaging species of the sort we loosely call a “radical.” You have probably heard of radicals in the context of antioxidants—they are the molecules that antioxidants soak up to prevent them from causing harm. You may also have heard of carotenoids, pigments that act as antioxidants. In a normal photosynthetic bacterium, the carotenoids present play that protective, Q-like role, preventing formation of the damaging radical.

A helpful effect of magnetic fields?

Where do magnetic fields come in? Previous work indicated an influence of magnetic fields on triplet formation, and thus, on radical formation. One excellent model to test the effects of fields in a biological system is to remove the Q, the molecular sponge for the triplets, and then apply magnetic fields to see whether triplets—and radicals—form.
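
For readers who want the underlying picture in symbols, here is a brief sketch using standard radical-pair notation; this is textbook spin chemistry, not equations taken from the study itself. The two unpaired electron spins of a radical pair can be correlated as one singlet state or three triplet states:

```latex
% Standard singlet and triplet spin states of a radical pair
\[
|S\rangle   = \tfrac{1}{\sqrt{2}}\bigl(|\!\uparrow\downarrow\rangle - |\!\downarrow\uparrow\rangle\bigr), \qquad
|T_{0}\rangle = \tfrac{1}{\sqrt{2}}\bigl(|\!\uparrow\downarrow\rangle + |\!\downarrow\uparrow\rangle\bigr), \qquad
|T_{+}\rangle = |\!\uparrow\uparrow\rangle, \qquad
|T_{-}\rangle = |\!\downarrow\downarrow\rangle .
\]
% An applied magnetic field B shifts the T+ and T- levels by the Zeeman energy
\[
\Delta E_{T_{\pm}} = \pm\, g \mu_{B} B ,
\]
% which changes how readily the pair interconverts between singlet and triplet,
% and therefore how much of the oxygen-reactive triplet product can form.
```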

That’s exactly what the researchers did, using a mutated form of R. sphaeroides that did not make carotenoids—the Q. The result? The stronger the field, the less radical product was made. The researchers had demonstrated a magnetic field effect in an organism for the first time, and the effect was helpful, not damaging. Their next step, which they are working on, is examining whether the bacteria grow better in the presence of the fields.

Identical twins grow less identical

DNA sequence is just a starting point

Identical twins are identical only in that their DNA is the same. In what might be an argument against cloning yourself or a pet in hopes of getting an identical copy, scientists have found that having an identical genetic code does not translate into being exactly alike. We have long known that identical twins do not always share the same health fate. One twin can have schizophrenia while the other never develops it, or one twin might develop cancer or diabetes while the other remains disease-free, even though all of these disorders can have a strong genetic component.

So a burning question in the field of genetics and disease has been what distinguishes a twin who gets a disease from one who does not. A strong candidate mechanism has been genomic modification, in which molecules attached to the DNA can silence a gene or turn it on. Typically, methyl groups attached to DNA make the code unavailable, while acetyl groups attached to the histone proteins that support the DNA ensure that the code is used.

Chemical tags modify DNA sequences

This process of genomic regulation is involved in some interesting aspects of biology. For example, methylation is the hallmark of genomic imprinting, in which each set of genes we inherit from our parents comes with its own special pattern of methylation. The way some genetic disorders manifest can be traced to genomic imprinting. In Prader-Willi syndrome, a person inherits a mutant allele from the father and manifests characteristic symptoms of the disorder, which include obesity and intellectual disability. But people who inherit the same mutant allele from their mothers will instead have Angelman syndrome, in which they are small and gracile, have a characteristic elfin face, and also have intellectual disability. Modification by methyl or acetyl groups, also called epigenetic modification, plays a role in dosage compensation for the X chromosome as well: women, who have two X chromosomes, shut most of one down through methylation to produce an X chromosome gene dosage like that of men, who have a single X.

Twinning: Nature’s clones

Identical twins have identical DNA because they arise from a single fertilized egg. The egg divides mitotically into two identical cells, and then each cell, for reasons we don’t understand well, resets the developmental process to the beginning and develops as a new individual. The process of twinning carries interesting implications for bioethics, cloning discussions, and questions about when life begins, but it has also helped us tease apart the influences of genetics and environment. A recent study examining life-history differences and differences in epigenetic modification in 80 pairs of twins ranging in age from 3 to 74 has revealed some fascinating results, with implications for our understanding of nature vs. nurture and for our investigations into the role of epigenetics in the development of disease.

You are what you do to yourself

The older the twins were, the more differences researchers found in methylation or acetylation of their DNA and histones. For twins raised apart, these differences were even more extreme. Researchers also concluded that environmental influences, such as smoking, diet, and lifestyle, may have contributed to the differences in the twins’ epigenetic modifications. The three-year-old twins were almost identical in their methylation patterns, but for twins older than 28 years, the patterns were significantly different for 60 percent of the pairs.
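
As a minimal sketch of the kind of comparison behind those numbers, here is a short Python example that summarizes how different two twins’ methylation patterns are, expressed as the fraction of sites that disagree. The methylation calls are invented for illustration; the study’s actual methods were far more involved.

```python
# Hypothetical methylation calls (1 = methylated, 0 = unmethylated) at ten CpG
# sites for two twin pairs. Values are invented for illustration only.
def epigenetic_distance(twin_a: list[int], twin_b: list[int]) -> float:
    """Fraction of sites at which the twins' methylation status differs."""
    if len(twin_a) != len(twin_b):
        raise ValueError("profiles must cover the same sites")
    differing = sum(a != b for a, b in zip(twin_a, twin_b))
    return differing / len(twin_a)

young_pair = ([1, 0, 1, 1, 0, 0, 1, 0, 1, 1],
              [1, 0, 1, 1, 0, 0, 1, 0, 1, 0])   # nearly identical patterns
older_pair = ([1, 0, 1, 1, 0, 0, 1, 0, 1, 1],
              [0, 1, 1, 0, 0, 1, 1, 1, 0, 1])   # patterns that have drifted

print(epigenetic_distance(*young_pair))   # 0.1
print(epigenetic_distance(*older_pair))   # 0.6
```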

These results have major implications for our understanding of disease. For example, we can use this knowledge to identify genes that are differentially methylated in people with and without a disorder and use that as a lead in identifying the genes involved in that disease state. We may also be able to pinpoint which environmental triggers result in differential methylation and find ways to avoid this mechanism of disease.

Just eat the broccoli, preferably steamed

A presidential/vegetable debacle

When the first President Bush indicated a distaste for broccoli during his presidency, he set off a firestorm of vegetable-based controversy. Broccoli haters sympathized, but nutritionists were horrified. Turns out, the data are all on the side of the nutritionists.

Research has shown that people who eat cruciferous vegetables—for example, broccoli, cabbage, or cauliflower—have lower rates of cancer. Armed with this information, researchers around the country have sought the cancer-fighting ingredient in broccoli and tried different ways to maximize the benefits we can get from it.

Yes, it’s got healthy stuff in it

They pinpointed the compound of interest in 1992. It is sulforaphane, a phytochemical in a group called the isothiocyanates. These compounds are known to induce apoptosis, or programmed cell death, of cancer cells. Epidemiologists—people who trace and track diseases and their causes—have found lower rates of prostate cancer in men who eat diets high in isothiocyanates. Scientists found that the way we release these cancer-fighting compounds from our veggies is to cut or chew them.

Sulforaphane, one of the most powerful anticarcinogens—anti-cancer compounds—in our food, works through the liver. Our livers are responsible for detoxifying our bodies, a process that happens in phases. In some cases, the phase I liver enzymes can actually turn compounds into carcinogens. But the phase II liver enzymes break down toxins before they can damage our DNA, the molecule that houses our genetic code. Sulforaphane boosts these phase II enzymes, and that is how it wields its anti-cancer power.

Broccoli pill–or just broccoli?

When the world found out broccoli’s secrets, there was a craze for broccoli-derived supplements that contained sulforaphane. Broccoli-based pills, powders, and teas hit the shelves, and people everywhere crunched into broccoli, possibly wincing as our former president might have as they chewed it up and forced it down. But what they may not have realized is that mature broccoli also contains compounds that boost the liver enzymes that turn other compounds into carcinogens. The goal was to find a way to deliver sulforaphane without too much of these less-benign accompaniments, and without too much sulforaphane itself, which can be toxic in large amounts.

One approach was to synthesize a sulforaphane analog that retained the anti-cancer powers but was less toxic at high doses. This compound, called oxomate, has been successful in the lab in reducing breast tumors in rats. Another approach was to find a broccoli growth stage that made sulforaphane in reasonable amounts but did not produce the compounds that elicit the liver’s carcinogen-producing enzymes. Scientists following that line of research discovered that broccoli sprouts, tiny alfalfa-like sprouts grown from broccoli seeds, contain abundant and safe levels of sulforaphane but do not synthesize the unwelcome compounds that boost carcinogenesis.

When proteins attack

But one thing food scientists had found was that even when broccoli was consumed, its sulforaphane could be immediately disabled by a broccoli protein called the epithiospecifier protein. Sulforaphane in the plant is attached to a sugar via a sulfur bond. When we cut or chew the broccoli, we activate the enzyme that breaks the sulforaphane off the sugar. But then there’s that sulfur flapping in the breeze, and the epithiospecifier protein comes along, yanks off the sulfur, and inactivates the sulforaphane. Food researchers have found that cooking broccoli for 10 minutes at 140 degrees Fahrenheit knocks out the epithiospecifier protein, leaving behind intact sulforaphane and the enzyme that releases it from the sugar to do its good work in our bodies.

So if you’ve been choking down overcooked broccoli all in the name of health, you may have been reaping the benefits of fiber or other broccoli-derived nutrients, but you weren’t getting much sulforaphane out of it. Your best bet, according to researchers, is to steam fresh broccoli for about three or four minutes and eat it, cheese sauce optional.
