Monday, September 13, 2021

Recommended reading for the remaining fans of Richard Dawkins

 

Pseudogenes Aren’t Nonfunctional Relics that Refute Intelligent Design

Casey Luskin

September 9, 2021, 6:36 AM


https://evolutionnews.org/2021/09/pseudogenes-arent-nonfunctional-relics-that-refute-intelligent-design/


[[If you want a sense of the tone and accuracy of the debate about evolution, this is a good example. I highly recommend Dr. Luskin’s work in general.]]



We’ve been discussing a video in which Richard Dawkins claims that the evidence for common ancestry refutes intelligent design (see here, here, and here). We first saw that contrary to Dawkins, the genetic data does not yield “a perfect hierarchy” or “perfect family tree.” Then we saw that a treelike data structure does not necessarily refute intelligent design. But Dawkins isn’t done. At the end of his answer in the video, Dawkins raises the issue of “pseudogenes,” which he claims “don’t do anything but are vestigial relicts of genes that once did something.” Dawkins says elsewhere that pseudogenes “are never transcribed or translated. They might as well not exist, as far as the animal’s welfare is concerned.” These claims represent a classic but false “junk DNA” argument against intelligent design. 

Functions of Pseudogenes 

Pseudogenes can yield functional RNA transcripts, functional proteins, or perform a function without producing any transcript. A 2012 paper in Science Signaling noted that although “pseudogenes have long been dismissed as junk DNA,” recent advances have established that “the DNA of a pseudogene, the RNA transcribed from a pseudogene, or the protein translated from a pseudogene can have multiple, diverse functions and that these functions can affect not only their parental genes but also unrelated genes.” The paper concludes that “pseudogenes have emerged as a previously unappreciated class of sophisticated modulators of gene expression.” 

A 2011 paper in the journal RNA concurs:

Pseudogenes have long been labeled as ‘junk’ DNA, failed copies of genes that arise during the evolution of genomes. However, recent results are challenging this moniker; indeed, some pseudogenes appear to harbor the potential to regulate their protein-coding cousins. 

Likewise, a 2012 paper in RNA Biology states that “pseudogenes were long considered as junk genomic DNA” but “pseudogene regulation is widespread in eukaryotes.” Because pseudogenes may only function in specific tissues and/or only during particular stages of development, their true functions may be difficult to detect. The RNA Biology paper concludes that “the study of functional pseudogenes is just at the beginning” and predicts “more and more functional pseudogenes will be discovered as novel biological technologies are developed in the future.” 

When we do carefully study pseudogenes, we often find function. One paper in Annual Review of Genetics observed: “pseudogenes that have been suitably investigated often exhibit functional roles.” A 2020 paper in Nature Reviews Genetics warned that pseudogene function is “Prematurely Dismissed” due to “dogma,” noting many instances where DNA dismissed as pseudogene junk was later found to be functional: “with a growing number of instances of pseudogene-annotated regions later found to exhibit biological function, there is an emerging risk that these regions of the genome are prematurely dismissed as pseudogenic and therefore regarded as void of function.” Indeed, the literature is full of papers reporting function in what have been wrongly labeled “pseudogenes.”

Fingers in Ears?

At the end of the video, Dawkins says: “I find it extremely hard to imagine how any creationist who actually bothered to listen to that could possibly doubt the fact of evolution. But they don’t listen…they simply stick their fingers in their ear and say la la la.” It’s safe to say that Dawkins was wrong about many things in this video, but I’m not here to make any accusations about fingers and ears. I will say that the best resolution to these kinds of questions is to listen to the data, keep an open mind, and think critically. When we’re willing to do this, a lot of exciting new scientific possibilities open up — ones that don’t necessarily include traditional neo-Darwinian views of common ancestry or a “perfect hierarchy” in the tree of life, and ones that readily point toward intelligent design.


Sunday, September 5, 2021

More evidence that more DNA is useful

 

The Complex Truth About ‘Junk DNA’

Genomes hold immense quantities of noncoding DNA. Some of it is essential for life, some seems useless, and some has its own agenda.

 

The 98% of the human genome that does not encode proteins is sometimes called junk DNA, but the reality is more complicated than that name implies.


Jake Buehler

Contributing Writer


September 1, 2021

 

https://www.quantamagazine.org/the-complex-truth-about-junk-dna-20210901/

Imagine the human genome as a string stretching out for the length of a football field, with all the genes that encode proteins clustered at the end near your feet. Take two big steps forward; all the protein information is now behind you.

The human genome has three billion base pairs in its DNA, but only about 2% of them encode proteins. The rest seems like pointless bloat, a profusion of sequence duplications and genomic dead ends often labeled “junk DNA.” This stunningly thriftless allocation of genetic material isn’t limited to humans: Even many bacteria seem to devote 20% of their genome to noncoding filler.
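To see how the football-field picture follows from the numbers, here is a quick back-of-the-envelope check in Python (the 2% figure is from the article; the field length and step size are rough assumptions of mine):

```python
genome_bp = 3_000_000_000   # base pairs in the human genome
coding_fraction = 0.02      # ~2% of the genome encodes proteins

field_m = 91.4              # a 100-yard football field, in meters
step_m = 0.9                # a rough "big step" (assumption)

coding_m = coding_fraction * field_m
print(f"coding DNA spans ~{coding_m:.1f} m of the field")       # ~1.8 m
print(f"that is ~{coding_m / step_m:.0f} big steps")            # ~2 steps
print(f"~{genome_bp * coding_fraction:.1e} coding base pairs")  # ~6.0e+07
```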

Many mysteries still surround the issue of what noncoding DNA is, and whether it really is worthless junk or something more. Portions of it, at least, have turned out to be vitally important biologically. But even beyond the question of its functionality (or lack of it), researchers are beginning to appreciate how noncoding DNA can be a genetic resource for cells and a nursery where new genes can evolve.

“Slowly, slowly, slowly, the terminology of ‘junk DNA’ [has] started to die,” said Cristina Sisu, a geneticist at Brunel University London.

Scientists casually referred to “junk DNA” as far back as the 1960s, but they took up the term more formally in 1972, when the geneticist and evolutionary biologist Susumu Ohno used it to argue that large genomes would inevitably harbor sequences, passively accumulated over many millennia, that did not encode any proteins. Soon thereafter, researchers acquired hard evidence of how plentiful this junk is in genomes, how varied its origins are, and how much of it is transcribed into RNA despite lacking the blueprints for proteins.

Technological advances in sequencing, particularly in the past two decades, have done a lot to shift how scientists think about noncoding DNA and RNA, Sisu said. Although these noncoding sequences don’t carry protein information, they are sometimes shaped by evolution to different ends. As a result, the functions of the various classes of “junk” — insofar as they have functions — are getting clearer.

Cells use some of their noncoding DNA to create a diverse menagerie of RNA molecules that regulate or assist with protein production in various ways. The catalog of these molecules keeps expanding, with small nuclear RNAs, microRNAs, small interfering RNAs and many more. Some are short segments, typically less than two dozen base pairs long, while others are an order of magnitude longer. Some exist as double strands or fold back on themselves in hairpin loops. But all of them can bind selectively to a target, such as a messenger RNA transcript, to either promote or inhibit its translation into protein.

These RNAs can have substantial effects on an organism’s well-being. Experimental shutdowns of certain microRNAs in mice, for instance, have induced disorders ranging from tremors to liver dysfunction.

By far the biggest category of noncoding DNA in the genomes of humans and many other organisms consists of transposons, segments of DNA that can change their location within a genome. These “jumping genes” have a propensity to make many copies of themselves — sometimes hundreds of thousands — throughout the genome, said Seth Cheetham, a geneticist at the University of Queensland in Australia. Most prolific are the retrotransposons, which spread efficiently by making RNA copies of themselves that convert back into DNA at another place in the genome. About half of the human genome is made up of transposons; in some maize plants, that figure climbs to about 90%.

Noncoding DNA also shows up within the genes of humans and other eukaryotes (organisms with complex cells) in the intron sequences that interrupt the protein-encoding exon sequences. When genes are transcribed, the exon RNA gets spliced together into mRNAs, while much of the intron RNA is discarded. But some of the intron RNA can get turned into small RNAs that are involved in protein production. Why eukaryotes have introns is an open question, but researchers suspect that introns help accelerate gene evolution by making it easier for exons to be reshuffled into new combinations.

A large and variable portion of the noncoding DNA in genomes consists of highly repeated sequences of assorted lengths. The telomeres capping the ends of chromosomes, for example, consist largely of these. It seems likely that the repeats help to maintain the integrity of chromosomes (the shortening of telomeres through the loss of repeats is linked to aging). But many of the repeats in cells serve no known purpose, and they can be gained and lost during evolution, seemingly without ill effects.


One category of noncoding DNA that intrigues many scientists these days is the pseudogenes, which are usually viewed as the remnants of working genes that were accidentally duplicated and then degraded through mutation. As long as one copy of the original gene works, natural selection may exert little pressure to keep the redundant copy intact.

Akin to broken genes, pseudogenes might seem like quintessential genomic junk. But Cheetham warns that some pseudogenes may not be “pseudo” at all. Many of them, he says, were presumed to be defective copies of recognized genes and labeled as pseudogenes without experimental evidence that they weren’t functional.

Pseudogenes can also evolve new functions. “Sometimes they can actually control the activity of the gene from which they were copied,” Cheetham said, if their RNA is similar enough to that of the working gene to interact with it. Sisu notes that the discovery in 2010 that the PTENP1 pseudogene had found a second life as an RNA regulating tumor growth convinced many researchers to look more closely at pseudogene junk.

Because dynamic noncoding sequences can produce so many genomic changes, the sequences can be both the engine for the evolution of new genes and the raw material for it. Researchers have found an example of this in the ERVW-1 gene, which encodes a protein essential to the development of the placenta in Old World monkeys, apes and humans. The gene arose from a retroviral infection in an ancestral primate about 25 million years ago, hitching a ride on a retrotransposon into the animal’s genome. The retrotransposon “basically co-opted this element, jumping around the genome, and actually turned that into something that’s really crucial for the way that humans develop,” Cheetham said.

But how much of this DNA therefore qualifies as true “junk” in the sense that it serves no useful purpose for a cell? This is hotly debated. In 2012, the Encyclopedia of DNA Elements (Encode) research project announced its findings that about 80% of the human genome seemed to be transcribed or otherwise biochemically active and might therefore be functional. However, this conclusion was widely disputed by scientists who pointed out that DNA can be transcribed for many reasons that have nothing to do with biological utility.

Alexander Palazzo of the University of Toronto and T. Ryan Gregory of the University of Guelph have described several lines of evidence — including evolutionary considerations and genome size — that strongly suggest “eukaryotic genomes are filled with junk DNA that is transcribed at a low level.” Dan Graur of the University of Houston has argued that because of mutations, less than a quarter of the human genome can have an evolutionarily preserved function. Those ideas are still consistent with the evidence that the “selfish” activities of transposons, for example, can be consequential for the evolution of their hosts.

Cheetham thinks that dogma about “junk DNA” has weighed down inquiry into the question of how much of it deserves that description. “It’s basically discouraged people from even finding out whether there is a function or not,” he said. On the other hand, because of improved sequencing and other methods, “we’re in a golden age of understanding noncoding DNA and noncoding RNA,” said Zhaolei Zhang, a geneticist at the University of Toronto who studies the role of the sequences in some diseases.

In the future, researchers may be less and less inclined to describe any of the noncoding sequences as junk because there are so many other more precise ways of labeling them now. For Sisu, the field’s best way forward is to keep an open mind when assessing the eccentricities of noncoding DNA and RNA and their biological importance. People should “take a step back and realize that one person’s trash is another person’s treasure,” she said.


The Cosmological Principle seems to be false

 

New Evidence against the Standard Model of Cosmology

Sabine Hossenfelder


http://backreaction.blogspot.com/2021/09/new-evidence-against-standard-model-of.html

 

[[I remember when I read Steven Weinberg’s The First Three Minutes in the ’70s, where he was careful to say that everything we know about the universe as a whole depends on the Cosmological Principle, and Stephen Hawking said we believe it on the grounds of humility [!]. And now there is significant evidence that it is false! Keep that in mind when you read the next “discovery” reported in the NYT.]]

 

Physicists believe they understand quite well how the universe works on large scales. There’s dark matter and there’s dark energy, and there’s the expansion of the universe that allows matter to cool and clump and form galaxies. The key assumption of this model for the universe is the cosmological principle, according to which the universe is approximately the same everywhere. But more and more observations show that the universe just isn’t the same everywhere. What are those observations? Why are they a problem? And what does it mean? That’s what we’ll talk about today.

 

Let’s begin with the cosmological principle, the idea that the universe looks the same everywhere. Well. Of course the universe does not look the same everywhere. There’s more matter under your feet than above your head and more matter in the Milky Way than in intergalactic space, and so on. Physicists have noticed that too, so the cosmological principle more precisely says that matter in the universe is equally distributed when you average over sufficiently large distances.

 

To see what this means, forget about matter for a moment and suppose you have a row of detectors that measure, say, temperature. Each detector gives you a somewhat different temperature, but you can average over those detectors by taking a few of them at a time (say, five), calculating the average value from the readings of those five detectors, and replacing the values of the individual detectors with that average. You can then ask how far away this averaged distribution is from one that’s the same everywhere. In this example it’s pretty close.

 

But suppose you have a different distribution, one with clumps larger than a few detectors. If you average over sets of 5 detectors again, the result still does not look the same everywhere. Now, if you average over all detectors, then of course the average is the same everywhere. So if you want to know how close a distribution is to being uniform, you average it over increasingly large distances and ask from what distance on it’s very similar to just being the same everywhere.
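Here is a minimal numerical sketch of this averaging procedure in Python (the detector array, clump scale, and noise level are invented for illustration; none of these numbers come from the original post):

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.arange(10_000)

# Toy detector readings: a uniform background, small-scale noise,
# and clumps with a characteristic size of ~200 detectors.
readings = (1.0
            + 0.5 * np.sin(2 * np.pi * x / 200)
            + 0.1 * rng.standard_normal(x.size))

def relative_deviation(values, block):
    """Average over consecutive blocks of `block` detectors, then
    measure how far the coarse-grained profile is from uniform."""
    n = values.size // block * block
    coarse = values[:n].reshape(-1, block).mean(axis=1)
    return np.std(coarse) / np.mean(coarse)

for block in (5, 25, 100, 400):
    print(block, round(relative_deviation(readings, block), 4))
# Small blocks average away the noise but not the clumps; only once
# the block size exceeds the clump scale does the distribution
# look the same everywhere.
```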

 

In cosmology we don’t want to average over temperatures; we want to average over the density of matter. On short scales, which for cosmologists means something like the size of the Milky Way, matter clearly is not uniformly distributed. If we average over the whole universe, then the average is uniform, but that’s uninteresting. What we want to know is: if we average over increasingly large distances, at what distance does the distribution of matter become uniform to good accuracy?

 

Yes, good question. One can calculate this distance using the concordance model, which is the currently accepted standard model of cosmology. It’s also often called ΛCDM, where Λ is the cosmological constant and CDM stands for cold dark matter. The distance at which the cosmological principle should be a good approximation to the real distribution of matter was calculated from the concordance model in a 2010 paper by Hunt and Sarkar.

 

They found that the deviations from a uniform distribution fall below one part in a hundred once the averaging distance reaches about 200-300 Mpc. 300 Megaparsecs is about 1 billion light years. And just to give you a sense of scale, our distance to the next closest galaxy, Andromeda, is about two and a half million light years. A billion light years is huge. But from that distance on at the latest, the cosmological principle should be fulfilled to good accuracy – if the concordance model is correct.
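A quick unit check of that conversion (the parsec-to-light-year factor is standard; the Andromeda distance is the one quoted above):

```python
LY_PER_PC = 3.26   # light years per parsec

def mpc_to_ly(mpc):
    """Convert megaparsecs to light years."""
    return mpc * 1e6 * LY_PER_PC

print(f"{mpc_to_ly(300):.2e} light years")         # ~9.78e+08: about a billion
print(f"{mpc_to_ly(300) / 2.5e6:.0f} Andromedas")  # ~391 Andromeda distances
```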

 

One problem with the cosmological principle is that astrophysicists have on occasion assumed it is already valid at shorter distances, down to about 100 Megaparsecs. This is an unjustified assumption, but it has for example entered the analysis of supernovae data from which the existence of dark energy was inferred. And yes, that’s what the Nobel Prize in Physics was awarded for in 2011.

 

Two years ago, I told you about a paper by Subir Sarkar and his colleagues which showed that if one analyses the supernovae data correctly, without assuming that the cosmological principle holds at too-short distances, then the evidence for dark energy disappears. That paper has been almost entirely ignored by other scientists. Check out my earlier video for more about that.

 

Today I want to tell you about another problem with the cosmological principle. As I said, one can calculate, from the standard model of cosmology, the scale beyond which it should be valid. Beyond that scale, the universe should look pretty much the same everywhere. This means in particular that there shouldn’t be any clumps of matter on scales larger than about a billion light years. But astrophysicists keep on finding them.

 

Already in 1991 they found the Clowes-Campusano quasar group, a collection of 34 quasars about 9.5 billion light years away from us that extends over 2 billion light years, clearly too large to be compatible with the prediction from the concordance model.

 

Since 2003, astrophysicists have known about the “Great Wall,” a collection of galaxies about a billion light years away from us that extends over 1.5 billion light years. That too is larger than it should be.

 

Then there’s the “Huge Quasar Group,” which is… huge. It spans a whopping 4 billion light years. And just in July, Alexia Lopez discovered the “Giant Arc,” a collection of galaxies, galaxy clusters, gas and dust that spans 3 billion light years.

 

Theoretically, these structures shouldn’t exist. It can happen that such clumps appear coincidentally in the concordance model. That’s because this model uses an initial distribution of matter in the early universe with random fluctuations, so you could end up with a big clump somewhere just by chance. But you can calculate the probability for that to happen. The Giant Arc alone has a probability of less than one in a hundred thousand of having come about by chance. And that doesn’t factor in all the other big structures.

 

What does it mean? It means the evidence is mounting that the cosmological principle is a bad assumption on which to build a model for the entire universe, and it probably has to go. It increasingly looks like we live in a region of the universe that happens to have a significantly lower density than the average in the visible universe. This underdense region we live in has been called the “local hole,” and it has a diameter of at least 600 million light years. This is the finding of a recent paper by a group of astrophysicists from Durham in the UK.

 

They also point out that if we live in a local hole, then the local value of the Hubble rate must be corrected down: matter streaming out of an underdense region adds to the locally measured expansion, biasing the local value high. This would be good news, because current measurements of the local value of the Hubble rate are in conflict with the value inferred from the early universe, and that discrepancy has been one of the biggest headaches in cosmology in recent years. Giving up the cosmological principle could solve that problem.

 

However, the finding in that paper from the Durham group is only a mild tension with the concordance model, at about 3 sigma, which is not highly statistically significant. But Sarkar and his group had another paper recently in which they do a consistency check on the concordance model and find a conflict at 4.9 sigma, that is, a less than one in a million chance of it being a coincidence.
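To connect those sigmas to probabilities: assuming a Gaussian null hypothesis (and, as a further assumption on my part, the two-sided convention), the tail probability works out like this:

```python
from scipy.stats import norm

def p_value(sigma, two_sided=True):
    """Probability of a Gaussian fluctuation at least `sigma`
    standard deviations away from the mean."""
    p = norm.sf(sigma)          # one-sided tail probability
    return 2 * p if two_sided else p

print(f"{p_value(3.0):.1e}")    # ~2.7e-03: a mild tension
print(f"{p_value(4.9):.1e}")    # ~9.6e-07: less than one in a million
```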

 

The consistency check works as follows. If we measure the temperature of the cosmic microwave background, it appears hotter in the direction we are moving toward. This gives rise to the so-called CMB dipole. You can measure this dipole directly. You can also measure the dipole by inferring our motion from observations of quasars. If the concordance model were right, the direction and magnitude of the two dipoles should be the same. But they are not. In the figure from Sarkar’s paper, the star marks the location of the CMB dipole and the triangle that of the quasar dipole, which shows how far away from the CMB expectation the quasar result is.
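Comparing the two dipole directions comes down to computing an angular separation on the sky. A minimal sketch (the CMB dipole coordinates are roughly the standard ones; the quasar-dipole coordinates are placeholders for illustration, not the values from Sarkar’s paper):

```python
import numpy as np

def ang_sep_deg(l1, b1, l2, b2):
    """Angular separation (degrees) between two sky directions given
    in galactic longitude l and latitude b, in degrees."""
    l1, b1, l2, b2 = map(np.radians, (l1, b1, l2, b2))
    cos_sep = (np.sin(b1) * np.sin(b2)
               + np.cos(b1) * np.cos(b2) * np.cos(l1 - l2))
    return np.degrees(np.arccos(np.clip(cos_sep, -1.0, 1.0)))

cmb_dipole = (264.0, 48.3)     # galactic (l, b) of the CMB dipole
quasar_dipole = (240.0, 30.0)  # placeholder values, for illustration only
print(f"{ang_sep_deg(*cmb_dipole, *quasar_dipole):.1f} degrees apart")
```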

 

These recent developments make me think that in the next ten years or so we will see a major paradigm shift in cosmology, in which the current standard model is replaced with another one. Just what the new model will be, and whether it will still have dark energy, I don’t know.