Thursday, January 16, 2014

Hands Off! This May Be Love


I am excited to announce the publication of Hands Off! This May Be Love, my first book written for a general audience.

Comprising exciting new material along with parts of my previous books, Hands Off! This May Be Love presents a Jewish, down-to-earth perspective on the hows and whys of refraining from physical contact before marriage, a practice that resonates with many religious Christians and others troubled by today’s societal norms. We are hoping that it will find its way into the hands not only of young adults, but also parents, youth pastors and other leaders. This book can make a real difference in people's lives and will also, God willing, achieve tremendous kiddush Hashem.

If you know anyone who may be open to the message of Hands Off! This May Be Love, please pass this email along to them. And if you have any ideas or connections for getting this book out there, please let me know!


And here's a link for the ebook:


Thanks so much!

With blessings,

Gila Manolson

Wednesday, January 15, 2014

Salty surprise - ordinary table salt turns into "forbidden" forms


[Suggested reading for a student who recently told me about what peer-reviewed science has proved.]



High-pressure X-ray experiments violate textbook rules of chemistry

High-pressure experiments with ordinary table salt have produced new chemical compounds that should not exist according to the textbook rules of chemistry. The study at DESY's X-ray source PETRA III and at other research centres could pave the way to a more universal understanding of chemistry and to novel applications, as the international research team, led by Prof. Artem Oganov of Stony Brook University (State University of New York) and Prof. Alexander Goncharov of Carnegie Institution, reports in the scientific journal Science.
The electron localization function in the cubic NaCl3 structure. Credit: Artem Oganov/Stony Brook University 
Table salt, also known as sodium chloride or NaCl, is one of the best-known and most studied chemical compounds. It crystallises in a cubic unit cell and is very stable. Its chemical composition is simple: one sodium atom (Na) and one chlorine atom (Cl). Or at least that’s true under ambient conditions. Other compounds of the two elements are forbidden by the classical rules of chemistry. For instance, according to the octet rule, all chemical elements strive to fill their outermost shell with eight electrons, the most stable configuration, found in noble gases. Sodium has one extra electron and chlorine is missing one, so sodium donates one electron to chlorine, leaving both atoms with an outer shell of eight electrons and forming a strong ionic bond.
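[[The octet-rule arithmetic above is simple enough to sketch in a few lines of Python. This is only my own illustration of the electron bookkeeping, not anything from the article; the valence-electron counts are standard chemistry, but the helper function is invented. D. G.]]

# Octet-rule bookkeeping for ordinary NaCl at ambient conditions.
# Outer-shell (valence) electron counts of the neutral atoms:
valence = {"Na": 1, "Cl": 7}

def donate_electron(donor, acceptor):
    """Move one electron from donor to acceptor; return the new outer shells."""
    # Na drops to the full shell beneath (like neon); Cl completes its octet.
    return valence[donor] - 1, valence[acceptor] + 1

na_outer, cl_outer = donate_electron("Na", "Cl")
print(na_outer, cl_outer)  # 0 and 8: both ions now sit at noble-gas shells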
But when the scientists put table salt under high pressures of 200,000 atmospheres and more at PETRA III and added an extra dash of either sodium or chlorine, "forbidden" compounds like Na3Cl and NaCl3 turned up. "Following the theoretical prediction, we heated the samples under pressure with lasers for a while," explains co-author Dr. Zuzana Konôpková of DESY, who supported the experiments at DESY's Extreme Conditions Beamline P02 (ECB). "We found other stable compounds of Na and Cl, which came as a surprise." This is not supposed to happen, as these compounds require a completely different form of chemical bonding with higher energy, and nature always favours the lowest state of energy.
But Oganov's team had calculated before that exotic compounds might form under extreme conditions and remain stable under these conditions. “We have predicted and made crazy compounds that violate textbook rules: NaCl3, NaCl7, Na3Cl2, Na2Cl, and Na3Cl,” says Dr. Weiwei Zhang, the lead author of the paper and a visiting scholar at Oganov's lab at Stony Brook. At PETRA III and at Carnegie Institution the scientists tested the predictions in what they call "cook and look" experiments, targeting Na3Cl and NaCl3, the two compounds that were predicted to be more easily made than others, and indeed found them. “These compounds are thermodynamically stable and once made, remain so indefinitely," says Zhang. "Classical chemistry forbids their very existence. Classical chemistry also says atoms try to fulfil the octet rule - elements gain or lose electrons to attain an electron configuration of the nearest noble gas, with complete outer electron shells that make them very stable. Well, here that rule is not satisfied.”
The experiments help to explore a broader view of chemistry. “I think this work is the beginning of a revolution in chemistry,” Oganov says. “We found, at low pressures achievable in the lab, perfectly stable compounds that contradict the classical rules of chemistry. If you apply rather modest pressure, 200,000 atmospheres – for comparison purposes, the pressure at the centre of the Earth is 3.6 million atmospheres – much of what we know from chemistry textbooks falls apart.”
One reason for the surprising discovery is that textbook chemistry usually applies to what we call ambient conditions. "Here on the surface of the earth, these conditions might be default, but they are rather special if you look at the universe as a whole," Konôpková explains. What may be "forbidden" under ambient conditions on earth can become possible under more extreme conditions. "'Impossible' really means that the energy is going to be high," Oganov says. "The rules of chemistry are not like mathematical theorems, which cannot be broken. The rules of chemistry can be broken, because impossible means softly impossible. You just need to find the conditions where the energy balance shifts and the rules hold no more."
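[[Oganov's "energy balance shifts" remark can be made concrete with a toy enthalpy calculation. Under pressure, the stable phase is the one that minimizes the enthalpy H = E + PV, so a denser phase can win at high pressure even if its internal energy is higher. The numbers below are invented purely for illustration and are not from the paper. D. G.]]

# Toy model: H = E + P*V decides which phase is stable at pressure P.
# A compact "forbidden" phase with higher internal energy E can still win
# once the P*V term dominates. All numbers are made up for illustration.
phases = {
    "ordinary NaCl": {"E": 0.0, "V": 1.00},  # reference energy, larger volume
    "exotic NaCl3":  {"E": 0.8, "V": 0.85},  # costs energy, but packs tighter
}

def enthalpy(name, pressure):
    p = phases[name]
    return p["E"] + pressure * p["V"]

for pressure in (0.0, 2.0, 8.0):  # arbitrary pressure units
    stable = min(phases, key=lambda name: enthalpy(name, pressure))
    print(f"P = {pressure}: {stable} is favoured")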
Apart from its fundamental meaning, the discovery may also lead to new practical applications. “When you change the theoretical underpinnings of chemistry, that’s a big deal,” Goncharov says. “But what it also means is that we can make new materials with exotic properties.” Among the compounds Oganov and his team created are two-dimensional metals, where electricity is conducted along the layers of the structure. “One of these materials – Na3Cl – has a fascinating structure,” Oganov says. “It is comprised of layers of NaCl and layers of pure sodium. The NaCl layers act as insulators; the pure sodium layers conduct electricity. Systems with two-dimensional electrical conductivity have attracted a lot of interest.”
The experiments with table salt might only be the beginning of the discovery of completely new compounds. “If this simple system is capable of turning into such a diverse array of compounds under high-pressure conditions, then others likely are, too,” Goncharov explains. “This could help answer outstanding questions about early planetary cores, as well as to create new materials with practical uses.”

Reference
“Unexpected stable stoichiometries of sodium chlorides”; Weiwei Zhang, Artem R. Oganov, Alexander F. Goncharov, Qiang Zhu, Salah Eddine Boulfelfel, Andriy O. Lyakhov, Elissaios Stavrou, Maddury Somayazulu, Vitali B. Prakapenka, Zuzana Konôpková; Science, 2013; DOI: 10.1126/science.1244989

Monday, January 6, 2014

Scientists discover double meaning in genetic code




A team of researchers has made the discovery that the genetic code used by DNA to store information actually contains a double meaning, with the second set of coded information having major implications for how scientists read and interpret the instructions within it. According to the authors of the study, this fascinating find could help them better understand both disease and health.
DNA, or deoxyribonucleic acid, is present in the cells of all humans, as well as almost every other living organism. It contains the information for building and maintaining an organism in the form of a chemical code. Four basic chemical bases – adenine, guanine, cytosine and thymine – are strung together in various sequences, and the sequencing of these bases is what determines what information is coded within them, much like letters of the alphabet can be put together to create many different words and sentences.
The overall structure of DNA is what is known as a double helix.  The bases of DNA pair up with each other, with adenine pairing with thymine and cytosine pairing with guanine.  Each base pair then attaches to a sugar molecule as well as a phosphate molecule and this complete package is called a nucleotide.  Sequences of nucleotides arrange themselves in two long strands, somewhat like a ladder joined by base pair rungs, and the DNA molecule takes on a spiraling, helical shape.
DNA is able to make copies of itself by “unzipping” its two strands, allowing each strand to serve as a template for new DNA to be formed.  This process is how new DNA is created whenever cells divide and multiply.
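[[The pairing rules are simple enough to state as code. Here is a minimal Python sketch of one strand acting as a template for its complement; this is my own standard illustration, not tied to the study, and strand direction is ignored for simplicity. D. G.]]

# Watson-Crick pairing: A with T, C with G.
COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def template_copy(strand):
    """Return the complementary strand that would form on this template."""
    return "".join(COMPLEMENT[base] for base in strand)

strand = "ATGCGTA"
print(template_copy(strand))  # TACGCAT
# Copying the copy regenerates the original, which is why each unzipped
# strand can serve as a template when cells divide.
assert template_copy(template_copy(strand)) == strand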
Since 1962, when James Watson and Francis Crick received the Nobel Prize in Physiology or Medicine for discovering the structure of DNA, it has been thought that this was all there was to know about how DNA worked. However, a research team led by Dr. John Stamatoyannopoulos of the University of Washington has made a startling new discovery: DNA is actually used to write in two different languages, giving a double meaning to the genetic code.
One of these languages, the one that was discovered by Watson and Crick, is used to code information about proteins.   The second one, which was just discovered, codes information which tells the cell how to control genes.  Genes are sections of DNA molecules which, when taken by themselves, code for specific proteins.  Humans have thousands of genes, all of them controlling different traits, such as eye color or height.
It took scientists a long time to locate this second language because one language is superimposed over the other one.
Speaking about this new find, Stamatoyannopoulos notes that “[t]hese new findings highlight that DNA is an incredibly powerful information storage device, which nature has fully exploited in unexpected ways.”
According to the UW team, the genetic code uses a 64-letter “alphabet” made up of codons. What they discovered in their work was that some of these codons actually have two different meanings, one affecting protein sequencing and one affecting gene control. These codons, which they call duons, seem to have evolved their double meanings in order to help stabilize certain beneficial features of proteins and their manufacture.
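[[The "64-letter alphabet" is just every three-base combination of the four bases: 4 x 4 x 4 = 64. The sketch below counts them and shows the idea of a duon; the regulatory annotation is a made-up placeholder, not data from the study. D. G.]]

from itertools import product

BASES = "ACGT"
codons = ["".join(triplet) for triplet in product(BASES, repeat=3)]
print(len(codons))  # 64 = 4**3 three-base "letters"

# A duon is one codon carrying two readings at once. The protein reading
# below follows the standard genetic code; the regulatory reading is a
# hypothetical placeholder, only to show "one codon, two meanings".
duon = {
    "codon": "CTG",
    "protein_meaning": "leucine",                 # standard genetic code
    "regulatory_meaning": "factor binding site",  # invented for illustration
}
print(duon)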
Their findings have important implications for how scientists interpret a person’s genome, they say, opening up new ways to diagnose and treat disease.  Because the genetic code is communicating two different types of information at the same time, diseases which appear to be the result of alterations in protein sequencing might actually be caused by changes in gene control programs, or even both factors.
The findings from the study were published in the December 13, 2013 issue of Science.
By Nancy Schimelpfening


If Chemistry Can Be Wrong, How Much More Evolutionary Theory?


Along with astronomy, chemistry is one of the ancient sciences. Progressing from alchemy to the rational chemistry, physical chemistry and quantum mechanics of our day, its status as "hard science" seems secure. Its theories have been refined for centuries. Moreover, its experiments (unlike macroevolution) are observable and repeatable. How, then, could researchers at Stony Brook University (academic home, by the way, of ENV contributor Dr. Michael Egnor, Vice-Chairman, Department of Neurological Surgery) say that a discovery has challenged the foundation of chemistry?
Experiments at the lab are causing a stir -- if not a revolution -- in this hallowed science:
All good research breaks new ground, but rarely does the research unearth truths that challenge the foundation of a science. That's what Artem R. Oganov has done, and the professor of theoretical crystallography in the Department of Geosciences will have his work published in the Dec. 20 issue of the journal Science....
"I think this work is the beginning of a revolution in chemistry," Oganov says. "We found, at low pressures achievable in the lab, perfectly stable compounds that contradict the classical rules of chemistry. If you apply the rather modest pressure of 200,000 atmospheres -- for comparison purposes, the pressure at the center of the earth is 3.6 million atmospheres -- everything we know from chemistry textbooks falls apart." [Emphasis added.]
We all know NaCl, table salt. Ever heard of NaCl3? How about Na3Cl? Those are some of the novel compounds Oganov's lab has produced. They were stunned to find them to be stable, real compounds.
"We found crazy compounds that violate textbook rules -- NaCl3, NaCl7, Na3Cl2, Na2Cl, and Na3Cl," says Weiwei Zhang, the lead author and visiting scholar at the Oganov lab and Stony Brook's Center for Materials by Design, directed by Oganov. "These compounds are thermodynamically stable and, once made, remain indefinitely; nothing will make them fall apart. Classical chemistry forbids their very existence. Classical chemistry also says atoms try to fulfill the octet rule -- elements gain or lose electrons to attain an electron configuration of the nearest noble gas, with complete outer electron shells that make them very stable. Well, here that rule is not satisfied."
The discoveries open up new possibilities for the supposedly mature science. "When you change the theoretical underpinnings of chemistry, that's a big deal," one team member says. "But what it also means is that we can make new materials with exotic properties."
How could this happen in a "hard" science? For one, Oganov, described as curious and obstinate, was driven by the word "impossible" to explore its limits.
To Oganov, impossible didn't mean something absolute. "The rules of chemistry are not like mathematical theorems, which cannot be broken," he says. "The rules of chemistry can be broken, because impossible only means 'softly' impossible! You just need to find conditions where these rules no longer hold."
Oganov compares the team's findings to discovering a new continent. Understanding and predicting high-pressure compounds can lead to new theories. And he envisions applications for astrophysics and planetary sciences, where high pressures abound.
If an unexpected foundation-shaking paradigm shift can occur in a "hard" science like chemistry, where findings can be checked by observation and experiment, how confident can evolutionists be in their theories about the unobservable past?
In recent years, major problems have surfaced in evolutionary theory: the overthrow of "junk DNA," the discovery of codes within codes, and the intransigence of the Cambrian enigma, to name a few. Yet its advocates continue to bully anyone who doesn't toe the line. Darwinism acts like a religion, not science. If Darwinists were proper scientists, they would embrace the new discoveries that break their rules. They would gladly follow the mounting evidence that points in a new direction for the biology of the 21st century -- intelligent design.
Image credit: Electron localization function in the cubic NaCl3 structure/Stony Brook University.
- See more at: http://www.evolutionnews.org/2014/01/if_chemistry_ca080711.html

Wednesday, December 11, 2013

Randy Wayne Schekman (born December 30, 1948) is an American cell biologist at the University of California, Berkeley, and former editor-in-chief of Proceedings of the National Academy of Sciences. In 2011 he was announced as the editor of eLife, a new high-profile open-access journal published by the Howard Hughes Medical Institute, the Max Planck Society and the Wellcome Trust, launching in 2012. He was elected to the National Academy of Sciences in 1992. Schekman shared the 2013 Nobel Prize in Physiology or Medicine with James Rothman and Thomas C. Südhof.

How journals like Nature, Cell and Science are damaging science | Randy Schekman
http://www.theguardian.com/commentisfree/2013/dec/09/how-journals-nature-science-cell-damage-science

I am a scientist. Mine is a professional world that achieves great things for humanity. But it is disfigured by inappropriate incentives. The prevailing structures of personal reputation and career advancement mean the biggest rewards often follow the flashiest work, not the best. Those of us who follow these incentives are being entirely rational – I have followed them myself – but we do not always best serve our profession's interests, let alone those of humanity and society.
We all know what distorting incentives have done to finance and banking. The incentives my colleagues face are not huge bonuses, but the professional rewards that accompany publication in prestigious journals – chiefly Nature, Cell and Science.
These luxury journals are supposed to be the epitome of quality, publishing only the best research. Because funding and appointment panels often use place of publication as a proxy for quality of science, appearing in these titles often leads to grants and professorships. But the big journals' reputations are only partly warranted. While they publish many outstanding papers, they do not publish only outstanding papers. Neither are they the only publishers of outstanding research.
These journals aggressively curate their brands, in ways more conducive to selling subscriptions than to stimulating the most important research. Like fashion designers who create limited-edition handbags or suits, they know scarcity stokes demand, so they artificially restrict the number of papers they accept. The exclusive brands are then marketed with a gimmick called "impact factor" – a score for each journal, measuring the number of times its papers are cited by subsequent research. Better papers, the theory goes, are cited more often, so better journals boast higher scores. Yet it is a deeply flawed measure, pursuing which has become an end in itself – and is as damaging to science as the bonus culture is to banking.
It is common, and encouraged by many journals, for research to be judged by the impact factor of the journal that publishes it. But as a journal's score is an average, it says little about the quality of any individual piece of research. What is more, citation is sometimes, but not always, linked to quality. A paper can become highly cited because it is good science – or because it is eye-catching, provocative or wrong. Luxury-journal editors know this, so they accept papers that will make waves because they explore sexy subjects or make challenging claims. This influences the science that scientists do. It builds bubbles in fashionable fields where researchers can make the bold claims these journals want, while discouraging other important work, such as replication studies.
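[[Schekman's point that "a journal's score is an average" is easy to demonstrate: citation counts are heavily skewed, so a few blockbusters drag the mean far above what a typical paper achieves. The counts below are invented purely for illustration. D. G.]]

# Invented citation counts for ten papers in a hypothetical journal.
citations = [0, 1, 1, 2, 2, 3, 4, 5, 120, 260]

impact_style_mean = sum(citations) / len(citations)     # journal-level average
typical_paper = sorted(citations)[len(citations) // 2]  # the median paper

print(f"mean (impact-factor style): {impact_style_mean:.1f}")  # 39.8
print(f"median paper:               {typical_paper}")           # 3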
In extreme cases, the lure of the luxury journal can encourage the cutting of corners, and contribute to the escalating number of papers that are retracted as flawed or fraudulent. Science alone has recently retracted high-profile papers reporting cloned human embryos, links between littering and violence, and the genetic profiles of centenarians. Perhaps worse, it has not retracted claims that a microbe is able to use arsenic in its DNA instead of phosphorus, despite overwhelming scientific criticism.
There is a better way, through the new breed of open-access journals that are free for anybody to read, and have no expensive subscriptions to promote. Born on the web, they can accept all papers that meet quality standards, with no artificial caps. Many are edited by working scientists, who can assess the worth of papers without regard for citations. As I know from my editorship of eLife, an open access journal funded by the Wellcome Trust, the Howard Hughes Medical Institute and the Max Planck Society, they are publishing world-class science every week.
Funders and universities, too, have a role to play. They must tell the committees that decide on grants and positions not to judge papers by where they are published. It is the quality of the science, not the journal's brand, that matters. Most importantly of all, we scientists need to take action. Like many successful researchers, I have published in the big brands, including the papers that won me the Nobel prize for medicine, which I will be honoured to collect tomorrow. But no longer. I have now committed my lab to avoiding luxury journals, and I encourage others to do likewise.
Just as Wall Street needs to break the hold of the bonus culture, which drives risk-taking that is rational for individuals but damaging to the financial system, so science must break the tyranny of the luxury journals. The result will be better research that better serves science and society.

Monday, October 28, 2013



Problems with scientific research
How science goes wrong




Scientific research has changed the world. Now it needs to change itself
Oct 19th 2013 |From the print edition

[[There are a lot of articles on this subject, but given the deserved reputation of the Economist I am putting this one on the blog. D. G.]]
A SIMPLE idea underpins science: “trust, but verify”. Results should always be subject to challenge from experiment. That simple but powerful idea has generated a vast body of knowledge. Since its birth in the 17th century, modern science has changed the world beyond recognition, and overwhelmingly for the better.
But success can breed complacency. Modern scientists are doing too much trusting and not enough verifying—to the detriment of the whole of science, and of humanity.
Too many of the findings that fill the academic ether are the result of shoddy experiments or poor analysis (see article). A rule of thumb among biotechnology venture-capitalists is that half of published research cannot be replicated. Even that may be optimistic. Last year researchers at one biotech firm, Amgen, found they could reproduce just six of 53 “landmark” studies in cancer research. Earlier, a group at Bayer, a drug company, managed to repeat just a quarter of 67 similarly important papers. A leading computer scientist frets that three-quarters of papers in his subfield are bunk. In 2000-10 roughly 80,000 patients took part in clinical trials based on research that was later retracted because of mistakes or improprieties.
What a load of rubbish
Even when flawed research does not put people’s lives at risk—and much of it is too far from the market to do so—it squanders money and the efforts of some of the world’s best minds. The opportunity costs of stymied progress are hard to quantify, but they are likely to be vast. And they could be rising.
One reason is the competitiveness of science. In the 1950s, when modern academic research took shape after its successes in the second world war, it was still a rarefied pastime. The entire club of scientists numbered a few hundred thousand. As their ranks have swelled, to 6m-7m active researchers on the latest reckoning, scientists have lost their taste for self-policing and quality control. The obligation to “publish or perish” has come to rule over academic life. Competition for jobs is cut-throat. Full professors in America earned on average $135,000 in 2012—more than judges did. Every year six freshly minted PhDs vie for every academic post. Nowadays verification (the replication of other people’s results) does little to advance a researcher’s career. And without verification, dubious findings live on to mislead.
Careerism also encourages exaggeration and the cherry-picking of results. In order to safeguard their exclusivity, the leading journals impose high rejection rates: in excess of 90% of submitted manuscripts. The most striking findings have the greatest chance of making it onto the page. Little wonder that one in three researchers knows of a colleague who has pepped up a paper by, say, excluding inconvenient data from results “based on a gut feeling”. And as more research teams around the world work on a problem, the odds shorten that at least one will fall prey to an honest confusion between the sweet signal of a genuine discovery and a freak of the statistical noise. Such spurious correlations are often recorded in journals eager for startling papers. If they touch on drinking wine, going senile or letting children play video games, they may well command the front pages of newspapers, too.
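[[The "odds shorten" remark is a textbook multiple-testing calculation: if n independent teams each test a true-null hypothesis at the 5 percent level, the chance that at least one reports a false positive is 1 - 0.95^n. A quick check in Python. D. G.]]

# Chance that at least one of n teams, each testing a true-null
# hypothesis at significance level alpha, reports a false positive.
def at_least_one_false_positive(n_teams, alpha=0.05):
    return 1 - (1 - alpha) ** n_teams

for n in (1, 5, 20, 50):
    print(f"{n:2d} teams -> {at_least_one_false_positive(n):.0%}")
# 1 team -> 5%, 5 teams -> 23%, 20 teams -> 64%, 50 teams -> 92%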
Conversely, failures to prove a hypothesis are rarely even offered for publication, let alone accepted. “Negative results” now account for only 14% of published papers, down from 30% in 1990. Yet knowing what is false is as important to science as knowing what is true. The failure to report failures means that researchers waste money and effort exploring blind alleys already investigated by other scientists.
The hallowed process of peer review is not all it is cracked up to be, either. When a prominent medical journal ran research past other experts in the field, it found that most of the reviewers failed to spot mistakes it had deliberately inserted into papers, even after being told they were being tested.
If it’s broke, fix it
All this makes a shaky foundation for an enterprise dedicated to discovering the truth about the world. What might be done to shore it up? One priority should be for all disciplines to follow the example of those that have done most to tighten standards. A start would be getting to grips with statistics, especially in the growing number of fields that sift through untold oodles of data looking for patterns. Geneticists have done this, and turned an early torrent of specious results from genome sequencing into a trickle of truly significant ones.
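[[What the geneticists did, in essence, was correct their significance thresholds for the number of hypotheses tested. A Bonferroni-style sketch; the one-million-test figure is a round illustrative number. D. G.]]

# Bonferroni correction: to keep the overall false-positive rate at alpha
# across n independent tests, each test must clear alpha / n.
def bonferroni_threshold(alpha, n_tests):
    return alpha / n_tests

# Screening a million genetic variants at an overall 5% level:
print(bonferroni_threshold(0.05, 1_000_000))  # 5e-08, the familiar
# genome-wide significance threshold used in association studies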
Ideally, research protocols should be registered in advance and monitored in virtual notebooks. This would curb the temptation to fiddle with the experiment’s design midstream so as to make the results look more substantial than they are. (It is already meant to happen in clinical trials of drugs, but compliance is patchy.) Where possible, trial data also should be open for other researchers to inspect and test.
The most enlightened journals are already becoming less averse to humdrum papers. Some government funding agencies, including America’s National Institutes of Health, which dish out $30 billion on research each year, are working out how best to encourage replication. And growing numbers of scientists, especially young ones, understand statistics. But these trends need to go much further. Journals should allocate space for “uninteresting” work, and grant-givers should set aside money to pay for it. Peer review should be tightened—or perhaps dispensed with altogether, in favour of post-publication evaluation in the form of appended comments. That system has worked well in recent years in physics and mathematics. Lastly, policymakers should ensure that institutions using public money also respect the rules.
Science still commands enormous—if sometimes bemused—respect. But its privileged status is founded on the capacity to be right most of the time and to correct its mistakes when it gets things wrong. And it is not as if the universe is short of genuine mysteries to keep generations of scientists hard at work. The false trails laid down by shoddy research are an unforgivable barrier to understanding.


Sunday, October 6, 2013





Evolution, Speeded by Computation

‘Probably Approximately Correct’ Explores Nature’s Algorithms

Our daily lives are growing ever more dependent on algorithms, those omnipresent computational procedures that run programs on our laptops, our smartphones, our GPS devices and much else. Algorithms influence our decisions, too: when we choose a movie on Netflix or a book on Amazon, we are presented with recommendations churned out by sophisticated algorithms that take into account our past choices and those of other users determined (by still other algorithms) to be similar to us.

The importance of these algorithms in the modern world is common knowledge, of course. But in his insightful new book “Probably Approximately Correct,” the Harvard computer scientist Leslie Valiant goes much further: computation, he says, is and has always been “the dominating force on earth within all its life forms.” Nature speaks in algorithms.
Dr. Valiant believes that phenomena like evolution, adaptation and learning are best understood in terms of “ecorithms,” a term he has coined for algorithms that interact with and benefit from their environment. Ecorithms are at play when children learn how to distinguish cats from dogs, when we navigate our way in a new city — but more than that, Dr. Valiant writes, when organisms evolve and when brain circuits are created and maintained.
Here is one way he illustrates this complex idea. Suppose we want to distinguish between two types of flowers by measuring their petals. For each petal we have two numbers, x for length and y for width. The task is to find a way to tell which type of flower a petal with given measurements x and y belongs to.
To achieve this, the algorithm is fed a set of examples meant to “train” it to come up with a good criterion for distinguishing the two flowers. The algorithm does not know the criterion in advance; it must “learn” it using the data that are fed to it.
So it starts with a hypothesis and tests it on the first example. (Say flower No. 1’s petals can be described by the formula 2x − 3y > 2, while for flower No. 2 it’s 2x − 3y < 2.) If the hypothesis works as applied to a new example, we proceed to the next one. If it misclassifies an example, the hypothesis is updated by a certain precise rule: the algorithm is literally learning as it goes along.
A striking mathematical theorem is that if a rule separating the two flowers exists (within the class of criteria we are considering, such as linear inequalities), then our algorithm will find it after a finite number of steps, no matter what the starting hypothesis was.
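[[The procedure described here is essentially the classic perceptron algorithm, and the theorem is the perceptron convergence theorem. A minimal Python sketch with made-up petal measurements; this is my own illustration, not Dr. Valiant's code. D. G.]]

# Perceptron-style learner: find a linear rule a*x + b*y > c separating
# the two flower types. Each sample is (length x, width y, label):
# +1 for flower No. 1, -1 for flower No. 2. Data are invented.
examples = [(3.0, 0.5, +1), (2.5, 0.4, +1), (1.0, 1.5, -1), (1.2, 2.0, -1)]

a, b, c = 0.0, 0.0, 0.0  # the starting hypothesis can be anything

for _ in range(100):  # repeat passes over the data until nothing changes
    mistakes = 0
    for x, y, label in examples:
        predicted = 1 if a * x + b * y - c > 0 else -1
        if predicted != label:  # misclassified: the "certain precise rule"
            a += label * x      # nudge the boundary toward (or away from)
            b += label * y      # the misclassified example
            c -= label
            mistakes += 1
    if mistakes == 0:  # the convergence theorem guarantees we get here
        break          # whenever a separating line exists

print(f"learned rule: {a:.1f}*x + {b:.1f}*y > {c:.1f} for flower No. 1")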
And Dr. Valiant argues that similar mechanisms are at work in nature. An organism can adapt to a new environment by starting with a hypothesis about the environment, then testing it against new data and, based on the feedback, gradually improving the hypothesis by using an ecorithm, to behave more effectively.
“Probably Approximately Correct,” Dr. Valiant’s winsome title, is his quantitative framework for understanding how these ecorithms work. In nature, there is nothing so neat as our idealized flower algorithm; we cannot really hope to get a precise rule distinguishing between two types of flowers, but can hope only to have one that gives an approximate result with high probability.
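[[The title phrase has an exact meaning in learning theory: with probability at least 1 - delta, the learned rule errs at rate at most epsilon. For a finite menu of candidate rules there is a standard bound on how many examples suffice; the numbers below are only an illustration. D. G.]]

import math

# Standard PAC sample-complexity bound for a finite hypothesis class:
# m >= (ln(num_rules) + ln(1/delta)) / epsilon examples suffice to be
# "probably (1 - delta) approximately (error <= epsilon) correct".
def pac_examples_needed(num_rules, epsilon, delta):
    return math.ceil((math.log(num_rules) + math.log(1 / delta)) / epsilon)

# A million candidate rules, 5% error tolerated, 99% confidence:
print(pac_examples_needed(1_000_000, epsilon=0.05, delta=0.01))  # 369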
The evolution of species, as Darwin taught us, relies on natural selection. But Dr. Valiant argues that if all the mutations that drive evolution were simply random and equally distributed, it would proceed at an impossibly slow and inefficient pace.
Darwin’s theory “has the gaping gap that it can make no quantitative predictions as far as the number of generations needed for the evolution of a behavior of a certain complexity,” he writes. “We need to explain how evolution is possible at all, how we got from no life, or from very simple life, to life as complex as we find it on earth today. This is the BIG question.”
[[Shades of Thomas Nagel!! - D.G.]]
Dr. Valiant proposes that natural selection is supplemented by ecorithms, which enable organisms to learn and adapt more efficiently. Not all mutations are realized with equal probability; those that are more beneficial are more likely to occur. In other words, evolution is accelerated by computation.
This is an ambitious proposal, sure to ignite controversy. But what I find so appealing about this discussion, and the book in general, is that Dr. Valiant fearlessly goes to the heart of the “BIG” questions. (I don’t know him, though we share an editor at Basic Books.) He passionately argues his case, but is also first to point out the parts of his theory that are incomplete, eager to anticipate and confront possible counterarguments. This is science at its best, driven not by dogma and blind belief, but by the desire to understand, intellectual integrity and reliance on facts.
Many other topics are discussed, from computational complexity to intelligence (artificial and not), and even an algorithmic perspective on teaching. The book is written in a lively, accessible style and is surprisingly entertaining. It’s funny how your perception of even mundane tasks can change after reading it: you start thinking algorithmically, confirming Dr. Valiant’s maxim that computer science is “more about humans than about computers.”
Edward Frenkel, a professor of mathematics at the University of California, Berkeley, is the author of the new book “Love and Math.”