Friday, December 4, 2009

Religion Beats the Throw

New Research Says: Trust Your Subconscious Wiring

Humans don't make very good decisions. This is clear from the Nobel Prize-winning work of Kahneman and Tversky, or to anyone who's spent any time with any humans (including themselves) ever. Now recent work at the University of Rochester confirms that the only sections of your skull you can trust are the subconscious ones.

It's important to remember that your brain, the embodiment of everything you are and the most amazing computation device ever constructed, is a hot-wired adaptation which makes the average MacGyver gadget look like ten years of planning with a federal budget. Your skull-meats were intended to help you club things smaller than you to death and eat them, full stop, and the fact that we've reconfigured them to do a million other things up to and including building and playing pianos is nothing short of astonishing.

All the original functions work well. Things like "what's moving left", "is that a bad thing" and "where do I move to intercept it" have been shown to work far better than the higher functions - someone who couldn't solve parabolic equations with drag can still catch a ball. Humans are very good at recognizing imminent danger (is that a hungry saber-tooth tiger?) but almost catastrophically bad at the abstract (should I take out a huge mortgage that I have no ability to pay?).

Professor Pouget has studied this reliable subconscious wiring by directly observing neurons responsible for identifying motion to the left or right while the subject observed a collection of moving dots. The firing of these neurons increases until, when it becomes continuous, the person suddenly "realises" the answer - once the brain has finished its processing, it hands the answer to the waking mind fully formed.
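This is essentially the classic evidence-accumulation (drift-diffusion) picture of decision making: noisy samples of evidence pile up until they cross a threshold, and only then does the answer surface. A minimal sketch of that idea - the parameter values here are purely illustrative, not taken from Pouget's study:

```python
import random

def accumulate_evidence(drift=0.05, noise=1.0, threshold=30.0, seed=1):
    """Sum noisy evidence samples (slightly biased toward one side)
    until the running total crosses a decision threshold. The answer
    then pops out 'fully formed', just as the article describes."""
    random.seed(seed)
    total, steps = 0.0, 0
    while abs(total) < threshold:
        total += drift + random.gauss(0, noise)  # one noisy "neural" sample
        steps += 1
    return ("right" if total > 0 else "left", steps)

decision, steps = accumulate_evidence()
print(decision, steps)  # the decision, and how long the evidence took to build
```

The point of the sketch is only that the conscious "aha" corresponds to the threshold crossing; everything before it is subconscious bookkeeping.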

This explains an awful lot about modern society - the underbrain can easily identify physically moving left or right, but once the higher brain is asked to deal with things being politically left or right it all gets messed up.

Posted by Casey Kazan.

Subconscious study http://www.physorg.com/news149345120.html

My Comment: How many different ways has this been said by how many different cultures? This is an example of one of the classic critiques of western science, that it completely disregarded ancient wisdom as if it never existed. It would have been possible simply to put ancient axioms to a more rigorous test, to at least keep the questions open. But no, the ancients couldn't have possibly known anything about life, consciousness, or the world. As a result, those waiting on science to discover essential truths have to wait hundreds of years before the ancient axioms can be said just so by just the right branch of science. I've always been partial to the phrasing, learn to trust your heart. It might take another hundred years or so for biologists involved in cardiac research to discover that the heart has "brainlike" qualities.

Tuesday, December 1, 2009

Science and Religion Both Land on Park Place

Stars Form At Record Speeds In Infant Galaxy

ScienceDaily (Feb. 7, 2009) — When galaxies are born, do their stars form everywhere at once, or only within a small core region? Recent measurements of an international team led by scientists from the Max Planck Institute for Astronomy provide the first concrete evidence that star-forming regions in infant galaxies are indeed small - but also hyperactive, producing stars at astonishingly high rates.

Galaxies, including our own Milky Way, consist of hundreds of billions of stars. How did such gigantic galactic systems come into being? Did a central region of stars form first and then grow over time? Or did the stars form at the same time throughout the entire galaxy? An international team led by researchers from the Max Planck Institute for Astronomy is now much closer to being able to answer these questions.

The researchers studied one of the most distant known galaxies, a so-called quasar with the designation J1148+5251. Light from this galaxy takes 12.8 billion years to reach Earth; accordingly, astronomical observations show the galaxy as it appeared 12.8 billion years ago, providing a glimpse of the very early stages of galactic evolution, less than a billion years after the Big Bang.

With the IRAM Interferometer, a German-French-Spanish radio telescope, the researchers were able to obtain images of a very special kind: they recorded the infrared radiation emitted by J1148+5251 at a specific frequency associated with ionized carbon atoms, which is a reliable indicator of ongoing star formation.

The resulting images show sufficient detail to allow, for the first time, the measurement of the size of a very early star-forming region. With this information, the researchers were able to conclude that, at that time, stars were forming in the core region of J1148+5251 at record rates - any faster and star formation would have been in conflict with the laws of physics.

"This galaxy's rate of star production is simply astonishing," says the article's lead author, Fabian Walter of the Max Planck Institute for Astronomy. "Every year, this galaxy's central region produces new stars with the combined mass of more than a thousand suns." By contrast, the rate of star formation within our own galaxy, the Milky Way, is roughly one solar mass per year.

Close to the physical limit

It has been known for some time that young galaxies can produce impressive amounts of new stars, but overall activity is only part of the picture. Without knowing the star-forming region's size, it is impossible to compare star formation in early galaxies with theoretical models, or with star-forming regions in our own galaxy.

With a diameter of a mere 4000 light-years (by comparison: the Milky Way galaxy's diameter amounts to 100,000 light-years), the star-forming core of J1148+5251 is extremely productive. In fact, it is close to the limits imposed by physical law. Stars are formed when cosmic clouds of gas and dust collapse under their own gravity. As the clouds collapse, temperatures rise, and internal pressure starts to build. Once that pressure has reached certain levels, all further collapse is brought to a halt, and no additional stars can form. The result is an upper limit on how many stars can form in a given volume of space in a given period of time.

Remarkably, the star-forming core of J1148+5251 reaches this absolute limit. This extreme level of activity can be found in parts of our own galaxy, but only on much smaller scales. For example, there is a region within the Orion nebula (Fig. 2) that is just as active as what we have observed. Fabian Walter: "But in J1148+5251, we are dealing with what amounts to a hundred million of these smaller regions combined!" Earlier observations of different galaxies had suggested an upper limit that amounts to a tenth of the value now observed in J1148+5251.

Growth from within

The compact star-forming region of J1148+5251 provides a highly interesting data point for researchers modelling the evolution of young galaxies. Going by this example, galaxies grow from within: in the early stages of star formation, there is a core region in which stars form very quickly. Presumably, such core regions grow over time, mainly as a result of collisions and mergers between galaxies, resulting in the significantly larger star-filled volume of mature galaxies.

The key to these results is one novel measurement: the first resolved image of an extremely distant quasar's star-forming central region, clearly showing the region's apparent diameter, and thus its size. This measurement is quite a challenge in itself. At a distance of almost 13 billion light-years (corresponding to a red-shift z = 6.42), the star-forming region, with its diameter of 4000 light-years, has an angular diameter of 0.27 seconds of arc - the size of a one euro coin, viewed at a distance of roughly 18 kilometres (or a pound coin, viewed at a distance of roughly 11 miles).
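The coin comparison is just the small-angle formula, and it is easy to check. A quick sketch - the coin diameters below are standard published values (23.25 mm for a 1-euro coin, 22.5 mm for the round pound coin), not figures from the article:

```python
import math

ARCSEC_PER_RAD = 180 / math.pi * 3600  # ≈ 206,265 arcseconds per radian

def angular_size_arcsec(diameter_m, distance_m):
    """Small-angle approximation: theta ≈ diameter / distance,
    converted from radians to seconds of arc."""
    return diameter_m / distance_m * ARCSEC_PER_RAD

euro = angular_size_arcsec(0.02325, 18_000)        # 1-euro coin at 18 km
pound = angular_size_arcsec(0.0225, 11 * 1609.34)  # pound coin at 11 miles
print(f"{euro:.2f}  {pound:.2f}")  # both land near the quoted 0.27 arcsec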

There is one further handicap: the observations rely on electromagnetic radiation with a characteristic wavelength, which is associated with ionized carbon atoms. At this wavelength, the star-forming regions of J1148+5251 outshine even the quasar's ultra-bright core. Because the universe is expanding, the radiation is shifted towards longer wavelengths as it travels towards Earth ("cosmological redshift"), reaching our planet in the form of radio waves with a wavelength of about one millimetre. But, owing to the general nature of waves, it is more than a thousand times more difficult to resolve minute details at a wavelength of one millimetre, compared with visible light.
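The redshift arithmetic behind "about one millimetre" is straightforward. The ionized-carbon line involved is the [CII] fine-structure line; its rest wavelength of roughly 158 micrometres is a standard astronomical value, not stated in the article, so treat this as an illustrative check:

```python
rest_um = 158.0   # [CII] rest wavelength in micrometres (standard value)
z = 6.42          # redshift of J1148+5251, quoted above

# Cosmological redshift stretches wavelengths by a factor (1 + z).
observed_mm = rest_um * (1 + z) / 1000
print(round(observed_mm, 2))  # ≈ 1.17 mm, the "about one millimetre" quoted

# Diffraction-limited resolution scales with wavelength, so ~1 mm radio
# waves resolve roughly 2000x less detail than ~500 nm visible light,
# matching the "more than a thousand times" figure.
ratio = observed_mm * 1e-3 / 500e-9
print(round(ratio))
```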

Observations at the required wavelength and level of detail became possible only as recently as 2006, thanks to an upgrade of the IRAM Interferometer, a compound radio telescope on the Plateau de Bure in the French Alps.

Future telescopes

Use of the characteristic radiation of ionized carbon to detect and create images of star-forming regions of extremely distant astronomical objects had been suggested some time ago. A significant portion of the observational program for ALMA, a compound radio telescope currently under construction in Northern Chile, relies on this observational approach. But up until the measurements of Fabian Walter and his colleagues, this technique had not been demonstrated in practice. Quoting Walter: "The early stages of galaxy evolution, roughly a billion years after the Big Bang, will be a major area of study for years to come. Our measurements open up a new window on star-forming regions in very young galaxies".

My Comment: This is one of those moments that is a bit uncomfortable for both scientists and religious people, when their questions about the world are identical. For Jews, all of this data is quite helpful because the passages describing creation are, for us, quite complex. How complex? Well, Nachmanides interpreted the first seven days with a description that sounds remarkably like the Big Bang. That is, what is written down in the Torah, the words, are distantly related to the actual meaning. Another question, given the above data about this particular galaxy, is whether time is the same out there. The implication is that it is not—since the Western concept of time is tied to the pace of planet and star revolution and formation. Here again religion and science occupy the same spot on the gameboard—as the Torah strongly hints at differences in time during different epochs of the earth. In other words, time is not uniform. But mostly, the ultimate questions of science and religion, at this stage of history, are pretty much the same. What kind of universe, by what universal laws are we actually governed?

Friday, November 20, 2009

The Most Important Religious Question

Magnetic Portals Connect Sun And Earth

ScienceDaily (Nov. 2, 2008) — During the time it takes you to read this article, something will happen high overhead that until recently many scientists didn't believe in. A magnetic portal will open, linking Earth to the sun 93 million miles away. Tons of high-energy particles may flow through the opening before it closes again, around the time you reach the end of the page.

"It's called a flux transfer event or 'FTE,'" says space physicist David Sibeck of the Goddard Space Flight Center. "Ten years ago I was pretty sure they didn't exist, but now the evidence is incontrovertible."

Indeed, today Sibeck is telling an international assembly of space physicists at the 2008 Plasma Workshop in Huntsville, Alabama, that FTEs are not just common, but possibly twice as common as anyone had ever imagined.

Researchers have long known that the Earth and sun must be connected. Earth's magnetosphere (the magnetic bubble that surrounds our planet) is filled with particles from the sun that arrive via the solar wind and penetrate the planet's magnetic defenses. They enter by following magnetic field lines that can be traced from terra firma all the way back to the sun's atmosphere.

"We used to think the connection was permanent and that solar wind could trickle into the near-Earth environment anytime the wind was active," says Sibeck. "We were wrong. The connections are not steady at all. They are often brief, bursty and very dynamic."

Several speakers at the Workshop have outlined how FTEs form: On the dayside of Earth (the side closest to the sun), Earth's magnetic field presses against the sun's magnetic field. Approximately every eight minutes, the two fields briefly merge or "reconnect," forming a portal through which particles can flow. The portal takes the form of a magnetic cylinder about as wide as Earth. The European Space Agency's fleet of four Cluster spacecraft and NASA's five THEMIS probes have flown through and surrounded these cylinders, measuring their dimensions and sensing the particles that shoot through. "They're real," says Sibeck.

Now that Cluster and THEMIS have directly sampled FTEs, theorists can use those measurements to simulate FTEs in their computers and predict how they might behave. Space physicist Jimmy Raeder of the University of New Hampshire presented one such simulation at the Workshop. He told his colleagues that the cylindrical portals tend to form above Earth's equator and then roll over Earth's winter pole. In December, FTEs roll over the north pole; in July they roll over the south pole.

Sibeck believes this is happening twice as often as previously thought. "I think there are two varieties of FTEs: active and passive." Active FTEs are magnetic cylinders that allow particles to flow through rather easily; they are important conduits of energy for Earth's magnetosphere. Passive FTEs are magnetic cylinders that offer more resistance; their internal structure does not admit such an easy flow of particles and fields. (For experts: Active FTEs form at equatorial latitudes when the IMF tips south; passive FTEs form at higher latitudes when the IMF tips north.) Sibeck has calculated the properties of passive FTEs and he is encouraging his colleagues to hunt for signs of them in data from THEMIS and Cluster. "Passive FTEs may not be very important, but until we know more about them we can't be sure."

There are many unanswered questions: Why do the portals form every 8 minutes? How do magnetic fields inside the cylinder twist and coil? "We're doing some heavy thinking about this at the Workshop," says Sibeck.

Meanwhile, high above your head, a new portal is opening, connecting your planet to the sun.

My Comment: This question is always overlooked, but I find it to be the most important religious question—that is, what universe do we live in? How does the universe function, what are the rules? The scientists are immensely helpful in answering this, because they always surprise themselves and us along the way. Right now, the puzzle pieces are falling on the card table, eventually some of us might see the picture. That picture, if it’s accurate, will then form the foundation of religious understanding.

Tuesday, November 17, 2009

Science Contributes Vital Information to the Religious

Skeleton Is An Endocrine Organ, Crucial To Regulating Energy Metabolism

ScienceDaily (Aug. 10, 2007) — Bones are typically thought of as calcified, inert structures, but researchers at Columbia University Medical Center have now identified a surprising and critically important novel function of the skeleton. They've shown for the first time that the skeleton is an endocrine organ that helps control our sugar metabolism and weight and, as such, is a major determinant of the development of type 2 diabetes.

The research, published in the August 10 issue of Cell, demonstrates that bone cells release a hormone called osteocalcin, which controls the regulation of blood sugar (glucose) and fat deposition through synergistic mechanisms previously not recognized. Usually, an increase in insulin secretion is accompanied by a decrease in insulin sensitivity. Osteocalcin, however, increases both the secretion and sensitivity of insulin, in addition to boosting the number of insulin-producing cells and reducing stores of fat.

In this published research, authors show that an increase in osteocalcin activity prevents the development of type 2 diabetes and obesity in mice. This discovery potentially opens the door for novel therapeutic avenues for the prevention and treatment of type 2 diabetes.

"The discovery that our bones are responsible for regulating blood sugar in ways that were not known before completely changes our understanding of the function of the skeleton and uncovers a crucial aspect of energy metabolism," said Gerard Karsenty, M.D., Ph.D., chair of the department of Genetics and Development at Columbia University Medical Center, Paul Marks Professor in the Basic Sciences, and senior author of the paper. "These results uncover an important aspect of endocrinology that was unappreciated until now."

Karsenty and his colleagues had previously shown that leptin, a hormone released by fat cells, acts upon and ultimately controls bone mass. They reasoned that bones must in turn communicate with fat, so they searched bone-forming cells for molecules that could potentially send signals back to fat cells.

The researchers found that osteocalcin, a protein made only by bone-forming cells (osteoblasts), was not a mere structural protein, but rather a hormone with totally unanticipated and crucial functions. Osteocalcin directs the pancreas' beta cells, which produce the body's supply of insulin, to produce more insulin. At the same time, osteocalcin directs fat cells to release a hormone called adiponectin, which improves insulin sensitivity.

This discovery showed for the first time that one hormone has a synergistic function in regulating insulin secretion and insulin sensitivity, and that this coordinating signal comes from the skeleton. Additionally, osteocalcin enhances the production of insulin-producing beta cells, which is considered one of the best, but currently unattainable, strategies to treat diabetes.

People with type 2 diabetes have been shown to have low osteocalcin levels, suggesting that altering the activity of this molecule could be an effective therapy. That hypothesis is supported by the Columbia research, which showed that mice with high levels of osteocalcin activity were prevented from gaining weight or becoming diabetic even when they ate a high fat diet. Analysis of mice lacking the osteocalcin protein showed that they had type 2 diabetes, increased fat mass, a decrease in insulin and adiponectin expression, and decreased beta-cell proliferation.

This research was supported by the National Institutes of Health, the American Diabetes Association, the Japan Society for the Promotion of Science, and the Pennsylvania Department of Health.

The researchers are now examining the role of osteocalcin in the regulation of blood sugar in humans and are continuing investigations into the relationship between osteocalcin and the appearance of type 2 diabetes and obesity.

My Comment: Thus far I’ve shown how religious thinking is somehow (zeitgeist) creeping into scientific thought. Now, let’s look at how scientific thinking can change and influence religious thought. Here we have a finding that can change the way that Hindus view Chakras, and how Jews view the complex correspondence between the 7 days, the 7 endocrine glands, and the 7 winds of the shel yad Tefillin. Basically, you have to rethink your idea of the endocrine system, which in fact will change the way the system works. And therefore, if you are religious, it’s got to change your fundamental intention when working with the Chakras or when davening with tefillin. (Unless in both cases you’re just going through the motions, in which case, just forget you even read this.) And of course, by working through these common and important physical ailments with spiritual technology--well, that's a good thing. Of course, it would be in an experimental stage, and would require an increasing level of skill. But it would be good if we could.

Monday, November 16, 2009

Is This the Biggest Elephant of the Herd Living in the Living Room?

DNA Found to Have "Impossible" Telepathic Properties

DNA has been found to have a bizarre ability to put itself together, even at a distance, when according to known science it shouldn't be able to. Explanation: None, at least not yet.

Scientists are reporting evidence that contrary to our current beliefs about what is possible, intact double-stranded DNA has the “amazing” ability to recognize similarities in other DNA strands from a distance. Somehow they are able to identify one another, and the tiny bits of genetic material tend to congregate with similar DNA. The recognition of similar sequences in DNA’s chemical subunits occurs in a way unrecognized by science. There is no known reason why the DNA is able to combine the way it does, and from a current theoretical standpoint this feat should be chemically impossible.

Even so, the research published in ACS’ Journal of Physical Chemistry B, shows very clearly that homology recognition between sequences of several hundred nucleotides occurs without physical contact or presence of proteins. Double helixes of DNA can recognize matching molecules from a distance and then gather together, all seemingly without help from any other molecules or chemical signals.

In the study, scientists observed the behavior of fluorescently tagged DNA strands placed in water that contained no proteins or other material that could interfere with the experiment. Strands with identical nucleotide sequences were about twice as likely to gather together as DNA strands with different sequences. No one knows how individual DNA strands could possibly be communicating in this way, yet somehow they do. The “telepathic” effect is a source of wonder and amazement for scientists.

“Amazingly, the forces responsible for the sequence recognition can reach across more than one nanometer of water separating the surfaces of the nearest neighbor DNA,” said the authors Geoff S. Baldwin, Sergey Leikin, John M. Seddon, and Alexei A. Kornyshev and colleagues.

This recognition effect may help increase the accuracy and efficiency of the homologous recombination of genes, which is a process responsible for DNA repair, evolution, and genetic diversity. The new findings may also shed light on ways to avoid recombination errors, which are factors in cancer, aging, and other health issues.

Posted by Rebecca Sato.

My Comment: Heckuva anomaly. Now, if you are a scientist do you overlook this? If you are a theologian or just plain religious, you might want to make this a cornerstone of your worldview.

Saturday, November 14, 2009

Science Really Really Really Wants to Discover Chi

A Rosetta Stone For Traditional Chinese Medicine

ScienceDaily (Oct. 30, 2007) — Scientists in the United Kingdom have "decoded" the inscrutable language of traditional Chinese medicine (TCM), revealing its strong chemical foundation in a way that may help scientists mine age-old Chinese medicines to develop tomorrow's new drugs.

David J. Barlow, Thomas M. Ehrman, and Peter J. Hylands point out that traditional Chinese medicine (TCM) - regarded by many Western experts as an archaic system doomed to extinction 50 years ago - has undergone a "remarkable renaissance" in recent years.

However, the arcane language used to describe categories of medication in TCM has hindered effective understanding of one of the most developed and mature systems of alternative medicine in existence.

To overcome that barrier, the researchers analyzed patterns among 8411 compounds from 240 Chinese herbs in relation to the categories found in traditional Chinese medicine. Organizing their findings in a kind of herbal "map," their results reveal that many categories in Chinese medicine are amenable to translation to Western terminology. TCM's "fire poison" group, for example, is comparable to today's family of anti-inflammatory medicines.

Now, future researchers will better understand the chemical basis of remedies that have been in use for thousands of years, the study indicated.

"This is likely to be of benefit both in the search for new drugs and, equally significantly, in understanding how Chinese medicine works," say the authors.

The study, "Phytochemical Informatics of Traditional Chinese Medicine and Therapeutic Relevance," is scheduled for the Nov./Dec. issue of ACS' Journal of Chemical Information and Modeling.

My Comment: This is pretty much what I’m talking about with every post, that the major threads of world thought seem to converge if you don’t get hung up on language and jargon. The idea here is for YOU to become a Rosetta Stone. It’s fun and the whole family can play.

Monday, November 2, 2009

An Important Question

Scientists Control Living Cells With Light; Advances Could Enhance Stem Cells' Power

ScienceDaily (Aug. 12, 2009) — University of Central Florida researchers have shown for the first time that light energy can gently guide and change the orientation of living cells within lab cultures. That ability to optically steer cells could be a major step in harnessing the healing power of stem cells and guiding them to areas of the body that need help.

The results, presented at the 2009 Conference on Lasers and Electro-Optics/International Quantum Electronics Conference, were discovered by a research team led by Aristide Dogariu, an optical scientist at the College of Optics and Photonics, and Kiminobu Sugaya, a stem cell researcher at the College of Medicine's Burnett School of Biomedical Sciences.

Long-term implications of the work include stimulating and controlling tissue regeneration for cleaner wound healing and the possibility of altering the shapes of cells and preventing malignant tumors from spreading throughout the body.

While optical techniques such as drilling microscopic holes with light or using the light as tweezers have shown promise in manipulating small pieces of matter, the UCF team explored the use of a gentler light energy. Their work showed for the first time that optically induced torques can affect components within cells that drive their motility -- their ability to move spontaneously -- and change the orientation of cells within cultures.

While earlier studies of cell manipulation have emphasized shielding the cell from the power of the light, Dogariu and Sugaya focused on using that energy to stimulate the cells' natural tendencies.

Living cells use energy to move actively and spontaneously. To influence them without jeopardizing their chemical makeup was a tremendous challenge. Dogariu and Sugaya began exploring the idea of moving an entire cell by focusing on its inner mechanisms. Inside the cells there are slender rods made up of a protein called actin.

"Actin rods are constantly vibrating, causing the cells to move sporadically," Sugaya said. The researchers demonstrated that low-intensity polarized light can guide the rods' Brownian motion to ever-so-slowly line up and move in the desired direction.

"Stronger light would simply kill them," Dogariu said. "We wanted to gently help the cells do their job the way they know how to do it."

A time-lapse video shows that after more than two hours of exposure to light with specific characteristics, a group of stem cells migrates from a seemingly random mix of shapes, movement and sizes to a uniform lineup.


My Comment: A question really. Is human consciousness a form of light? And are we again hinting at the existence of Chi? Seems like it.

Poet Wanted



My Comment: Where's a poet when you need one? And it would ADD to the beauty of the science. Of course poets usually don't want to write about this; rather they would write about how suicide is the most dramatic and meaningful way to live life. And to think the poets used to kick our butts in the annual poets vs. fiction writers football game. It's a shame.

Describe This



My Comment: There are subtitles that describe the scenes with nouns. But isn't it just as logical to describe it with metaphors? And this is the Rosetta Stone that allows you to transition between ancient writings and modern science: the use of metaphors. There is a difference between understanding the Torah literally AND understanding it as precise metaphors too. And there's definitely no interpretative postulate against doing this.

Friday, October 30, 2009

Discredited Old--uh--New Cutting Edge Science

Clean Smells Promote Moral Behavior, Study Suggests

ScienceDaily (Oct. 26, 2009) — People are unconsciously fairer and more generous when they are in clean-smelling environments, according to a soon-to-be published study led by a Brigham Young University professor.

The research found a dramatic improvement in ethical behavior with just a few spritzes of citrus-scented Windex.

Katie Liljenquist, assistant professor of organizational leadership at BYU's Marriott School of Management, is the lead author on the piece in a forthcoming issue of Psychological Science. Co-authors are Chen-Bo Zhong of the University of Toronto's Rotman School of Management and Adam Galinsky of the Kellogg School of Management at Northwestern University.

The researchers see implications for workplaces, retail stores and other organizations that have relied on traditional surveillance and security measures to enforce rules.

"Companies often employ heavy-handed interventions to regulate conduct, but they can be costly or oppressive," said Liljenquist, whose office smells quite average. "This is a very simple, unobtrusive way to promote ethical behavior."

Perhaps the findings could be applied at home, too, Liljenquist said with a smile. "Could be that getting our kids to clean up their rooms might help them clean up their acts, too."

The study titled "The Smell of Virtue" was unusually simple and conclusive. Participants engaged in several tasks, the only difference being that some worked in unscented rooms, while others worked in rooms freshly spritzed with Windex.

The first experiment evaluated fairness.

As a test of whether clean scents would enhance reciprocity, participants played a classic "trust game." Subjects received $12 of real money (allegedly sent by an anonymous partner in another room). They had to decide how much of it to either keep or return to their partners who had trusted them to divide it fairly. Subjects in clean-scented rooms were less likely to exploit the trust of their partners, returning a significantly higher share of the money.

•The average amount of cash given back by the people in the "normal" room was $2.81. But the people in the clean-scented room gave back an average of $5.33.
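Put in proportional terms, the effect is striking. A quick check of the fractions returned - the $12 stake and the dollar averages are the figures reported above; the percentage framing is mine:

```python
stake = 12.00            # real money received in the trust game
returned_normal = 2.81   # average returned in the unscented room
returned_clean = 5.33    # average returned in the Windex-ed room

frac_normal = returned_normal / stake  # under a quarter of the stake returned
frac_clean = returned_clean / stake    # nearly half the stake returned
print(f"{frac_normal:.0%} vs {frac_clean:.0%}")  # prints "23% vs 44%"
```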

The second experiment evaluated whether clean scents would encourage charitable behavior.

Subjects indicated their interest in volunteering with a campus organization for a Habitat for Humanity service project and their interest in donating funds to the cause.

•Participants surveyed in a Windex-ed room were significantly more interested in volunteering (4.21 on a 7-point scale) than those in a normal room (3.29).
•22 percent of Windex-ed room participants said they'd like to donate money, compared to only 6 percent of those in a normal room.

Follow-up questions confirmed that participants didn't notice the scent in the room and that their mood at the time of the experiment didn't affect the outcomes.

"Basically, our study shows that morality and cleanliness can go hand-in-hand," said Galinsky of the Kellogg School. "Researchers have known for years that scents play an active role in reviving positive or negative experiences. Now, our research can offer more insight into the links between people's charitable actions and their surroundings."

While this study examined the influence of the physical environment on morality, Zhong and Liljenquist previously published work that demonstrated an intimate link between morality and physical cleanliness. Their 2006 paper in Science reported that transgressions activated a desire to be physically cleansed.

Liljenquist is now researching how perceptions of cleanliness shape our impressions of people and organizations. "The data tell a compelling story about how much we rely upon cleanliness cues to make a wide range of judgments about others," she said.

My Comment: A few points here. The first is that this theory, the correlation between odors and behavior and between odors and illness, was abandoned in favor of the germ theory of disease. What do we usually do with old theories? We tend to put them on a trash heap of antiquated ideas, and I don't think this is the right thing to do. We treat older scientists, old customs, old thinking of any type as if those holding those thoughts didn't have a firm grasp on reality. We think of them as less intelligent, less concerned with empirical data, less skilled at turning their perceptions into accurate observations about cause and effect and the world in which we live. Well, that's not true. The older scientists and priests and rabbis were not necessarily less intelligent, or less rigorous in their attempts to turn their observations into something more substantial than opinion. In fact, it is quite possible that where they did miss the mark, they didn't miss it by much. It is reasonable to ask ourselves what data led them to their conclusions. There may be something very valuable in those old notions about the world. The findings in these experiments are just one example.

Monday, October 19, 2009

Tip of a Large Iceberg

Where's The Science? The Sorry State Of Psychotherapy

ScienceDaily (Oct. 3, 2009) — The prevalence of mental health disorders in this country has nearly doubled in the past 20 years. Who is treating all of these patients? Clinical psychologists and therapists are charged with the task, but many are falling short by using methods that are out of date and lack scientific rigor. This is in part because many of the training programs—especially some Doctor of Psychology (PsyD) programs and for-profit training centers—are not grounded in science.

A new report in Psychological Science in the Public Interest, a journal of the Association for Psychological Science, by a panel of distinguished clinical scientists—Timothy Baker (University of Wisconsin-Madison), Richard McFall (Indiana University), and Varda Shoham (University of Arizona)—calls for the reform of clinical psychology training programs and appeals for a new accreditation system to ensure that mental health clinicians are trained to use the most effective and current research to treat their patients.

There are multiple practices in clinical psychology that are grounded in science and proven to work, but in the absence of standardized science-based training, those treatments go unused.

For example, cognitive-behavioral therapy (CBT) has been shown to be the most effective treatment for PTSD and has the fewest side effects, yet many psychologists do not use this method. Baker and colleagues cite one study in which only 30 percent of psychologists were trained to perform CBT for PTSD and only half of those psychologists elected to use it. That means that six of every seven sufferers were not getting the best care available from their clinicians. Furthermore, CBT shows both long-term and immediate benefits as a treatment for PTSD, whereas medications such as Paxil have shown 25 to 50 percent relapse rates.
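The "six of every seven" figure is just the product of the two percentages in the study cited above; a quick back-of-the-envelope check:

```python
# Numbers from the study cited above: 30% trained, half of those use CBT.
trained = 0.30
use_if_trained = 0.50
receiving_cbt = trained * use_if_trained  # 0.15, roughly 1 in 7 patients
not_receiving = 1 - receiving_cbt         # 0.85, roughly 6 in 7 patients
```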

The report suggests that the escalating cost of mental health care treatment has reduced the use of psychological treatments and shifted care to general health care facilities. The authors also stress the importance of coupling psychosocial interventions with medicine because many behavioral therapies have been shown to reduce costs and provide longer term benefits for the client.

Baker and colleagues conclude that a new accreditation system is the key to reforming training in clinical psychology. This new system is already under development: the Psychological Clinical Science Accreditation System (PCSAS; http://www.pcsas.org).

My Comment: The situation is unbelievably bleak: most psychotherapists are quite proud of their ignorance of science, and a quick examination of their beliefs would reveal almost medieval thinking. There is a silver lining. Psychotherapy displaced religion as the cognitive technology we use to understand human behavior, and that authority is gone now, although psychotherapists will fight you on this point. But religion hasn't really picked up the ball here, encased as it is in another kind of medievalism, and not the enlightened thought of the Middle Ages that brought a mini-industrial revolution. Religion, and Judaism in particular, has gone into an anti-intellectual cocoon. There is, however, room for the butterfly.

Tuesday, October 13, 2009

Hard Wired for Life and Good--Part II

I Didn't Sin—It Was My Brain

10.05.2009

Brain researchers have found the sources of many of our darkest thoughts, from envy to wrath.

by Kathleen McGowan

Why does being bad feel so good? Pride, envy, greed, wrath, lust, gluttony, and sloth: It might sound like just one more episode of The Real Housewives of New Jersey, but this enduring formulation of the worst of human failures has inspired great art for thousands of years. In the 14th century Dante depicted ghoulish evildoers suffering for eternity in his masterpiece, The Divine Comedy. Medieval muralists put the fear of God into churchgoers with lurid scenarios of demons and devils. More recently George Balanchine choreographed their dance.

Today these transgressions are inspiring great science, too. New research is explaining where these behaviors come from and helping us understand why we continue to engage in them—and often celebrate them—even as we declare them to be evil. Techniques such as functional magnetic resonance imaging (fMRI), which highlights metabolically active areas of the brain, now allow neuroscientists to probe the biology behind bad intentions.

The most enjoyable sins engage the brain’s reward circuitry, including evolutionarily ancient regions such as the nucleus accumbens and hypothalamus; located deep in the brain, they provide us such fundamental feelings as pain, pleasure, reward, and punishment. More disagreeable forms of sin such as wrath and envy enlist the dorsal anterior cingulate cortex (dACC). This area, buried in the front of the brain, is often called the brain’s “conflict detector,” coming online when you are confronted with contradictory information, or even simply when you feel pain. The more social sins (pride, envy, lust, wrath) recruit the medial prefrontal cortex (mPFC), brain terrain just behind the forehead, which helps shape the awareness of self.

No understanding of temptation is complete without considering restraint, and neuroscience has begun to illuminate this process as well. As we struggle to resist, inhibitory cognitive control networks involving the front of the brain activate to squelch the impulse by tempering its appeal. Meanwhile, research suggests that regions such as the caudate—partly responsible for body movement and coordination—suppress the physical impulse. It seems to be the same whether you feel a spark of lechery, a surge of jealousy, or the sudden desire to pop somebody in the mouth: The two sides battle it out, the devilish reward system versus the angelic brain regions that hold us in check.

It might be too strong to claim that evolution has wired us for sin, but excessive indulgence in lust or greed could certainly put you ahead of your competitors. “Many of these sins you could think of as virtues taken to the extreme,” says Adam Safron, a research consultant at Northwestern University whose neuroimaging studies focus on sexual behavior. “From the perspective of natural selection, you want the organism to eat, to procreate, so you make them rewarding. But there’s a potential for that process to go beyond the bounds.”

There is no sin center in the brain, no single node of fiendishness that we might be able to shut down with drugs or electrodes. With the advent of modern imaging techniques that peer into the brain as it functions, though, we at least gain some perspective on our bad habits. At the same time, we can indulge in another gratifying pastime: As other people misbehave, we can sit back and watch.

LUST

In the annals of sin, weaknesses of the flesh—lust, gluttony, sloth—are considered second-tier offenses, less odious than the “spiritual” sins of envy and pride. That’s good news for us, since these yearnings are notoriously difficult to suppress.

When it comes to lust, neuroimaging confirms that the prurient urge is all-encompassing. Watching pornography calls upon brain regions associated with reward, sensory interpretation, and visual processing. It enlists the amygdala and the hypothalamus, which deal with emotional information; it also stimulates the reward-processing ventral striatum, probably due to the satisfying nature of watching erotic stimuli.

All said, the most notable thing about lust is that it sets nearly the whole brain buzzing, Safron says.

These responses are so unique and distinctive that, in the context of an experiment, it is possible to determine whether a man is aroused just by looking at an fMRI brain scan. “These are huge effects,” Safron says. “You’re looking at the difference between something that elicits intense desire and something that does not.” (Women show a less spectacular response, Safron says, and it is unclear exactly why.)

If lechery is all-consuming, how do we ever manage to control it? As with other powerful impulses, we try to shut down arousal by calling upon the right superior frontal gyrus and right anterior cingulate gyrus, according to research led by Mario Beauregard of the University of Montreal. He and others propose that these brain areas form a conscious self-regulatory system. This network provides us with the evolutionarily unprecedented ability to control our own neural processing—a feat achieved by no other creature.

GLUTTONY

Today it is difficult to regard overeating as a sin, considering the overwhelming evidence that physiology plays a more powerful role than morals in appetite and indulgence.

Physician Gene-Jack Wang of Brookhaven National Laboratory has studied the brains of overeaters since 1999, when he and colleague Nora Volkow originally observed that obesity and drug addiction alter the same brain circuits. These pathways, which rely on the neurotransmitter dopamine, are often referred to simplistically as the “reward system” but are also involved in motivation, attention, decision making, and other complex functions. In their studies, Wang and Volkow found that both drug addicts and obese people are usually less sensitive to dopamine’s rewarding effects. Being relatively numb to the pleasure and motivation signal may make them more likely to chase after a stronger thrill: more food or a bump of cocaine. Excessive stimulation further desensitizes dopaminergic neurons, and the compulsion snowballs.

In some of his experiments, Wang asks his volunteers to come hungry. He then torments them, asking them to describe their favorite food in loving detail while he heats it up in a nearby microwave so that the aroma wafts through the room. When these miserable souls go into a positron-emission tomography (PET) scanner, Wang sees the motivation regions of their brains go wild. Parts of the orbital frontal cortex, which is implicated in decision making, also light up.

In the brains of obese people, the regions that regulate sensory information from the mouth and tongue are more active, suggesting that they may experience the sensations of eating differently. While sensory processing is elevated in many of these subjects, other research shows that their reward sensitivity is lower. The dorsolateral prefrontal cortex (dlPFC) and other areas involved in inhibitory control are underactive; the heavier the person, the lower the activity there. “Overeating downregulates your inhibition control,” Wang says.

For the gluttonous, neuroscience offers moral absolution. After all, Saint Thomas Aquinas asserted that a sin must always be voluntary, or else it is not really a sin. “Our brain evolved for us to eat in excess, in order to survive,” Wang says. “This kind of excess is built into the brain.”

SLOTH

Mere laziness does not seem to qualify as a truly deadly sin. It helps to know that this moral failing was originally conceived of as acedia, an outmoded term that conveys both alienation and tedium, tinged with self-contempt. Acedia afflicted jaded monks who had grown weary of the cloistered life. Their sin was turning away from their moral obligations and toward selfish pursuits—a monastic form of ennui.

Today, paralyzing lassitude is often seen as a symptom of disease rather than of turpitude. Apathy is a classic sign of frontotemporal dementia. In this neurodegenerative disorder, the frontal lobes of the brain are slowly eaten away, causing social and mood changes as well as cognitive decline. Patients with such dementia often become increasingly withdrawn.

Sadness and listlessness are also hallmarks of major depression. With frontotemporal dementia the symptoms are caused by dead and dying cells; in depression the root cause is still unknown. Interestingly, the dorsolateral prefrontal cortex has an unusual pattern of activation in both conditions. Related to its ability to inhibit impulses, this region has a role in sustaining attention over the long haul, which is necessary for motivation. Abnormal function in the dlPFC might be connected to the lethargy associated with both conditions. Conversely, activity in this area may keep a lid on negative emotions; in some studies, depression lifted with stimulation of the dlPFC.

PRIDE

Early theologians saw pride as the fundamental sin—the “queen of them all,” according to Pope Gregory the Great, who codified the list of seven deadly sins in the sixth century. Indeed, psychologists say that arrogance comes naturally in Western society. Most of us perceive ourselves as slightly smarter, funnier, more talented, and better-looking than average. These rose-colored glasses are apparently important to mental health, the psychological immune system that protects us from despair. “Those who see themselves as they truly are—not so funny, a bad driver, overweight—have a greater chance of being diagnosed with clinical depression,” says Julian Paul Keenan, director of the cognitive neuroimaging laboratory and professor of psychology at Montclair State University in New Jersey.

For most of us, it takes less mental energy to puff ourselves up than to think critically about our own abilities. In one recent neuroimaging study by Hidehiko Takahashi of the National Institute of Radiological Sciences in Japan, volunteers who imagined themselves winning a prize or trouncing an opponent showed less activation in brain regions associated with introspection and self-conscious thought than people induced to feel negative emotions such as embarrassment. We accept positive feedback about ourselves readily, Takahashi says: “Compared with guilt or embarrassment, pride might be processed more automatically.”

Pride gets its swagger from the self-related processing of the mPFC, which Keenan calls “a very interesting area of the brain, involved in all these wonderful human characteristics, from planning to abstract thinking to self-awareness.” Using transcranial magnetic stimulation (TMS), in which a magnetic field applied to the scalp temporarily scrambles the signal in small areas of the brain, he was able to briefly shut off the mPFC in volunteers. With TMS switched on, his subjects’ normal, healthy arrogance melted away. “They saw themselves as they really were, without glossing over negative characteristics,” he says.

Righteous humility has traditionally been depicted as the virtue that opposes pride, but the work of Keenan and others calls that into question. He is using TMS to disrupt deliberate self-deprecation—the type of unctuous, ingratiating behavior that seems humble but is actually arrogance in disguise. Patterns of brain activation during self-deprecation are fundamentally the same as those during self-deceptive pride, Keenan is finding. Both are forms of one-upmanship. “They’re in the same location and seem to serve the same purpose: putting oneself ahead in society,” he says.

GREED

Despite the enormous pool of potential research subjects, greed has not yet been systematically investigated in brain research. However, neuroscience does offer insight into a related phenomenon, the indignant outrage of the cheated.

Our hatred of unfairness runs deep, even trumping rational self-interest. In the lab, researchers frequently use the “ultimatum game” to test our responses to injustice. One of two partners is given a sum of money and told that he must offer some amount of his own choosing to his partner. If the partner rejects the offer, neither gets to keep any of the money. On a rational basis, the receiving partner should accept any nonzero offer, since getting some money is always better than getting none. But people’s sense of violation at unfairness is so strong that they reject offers of 20 percent or less about half the time.
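The gap between the "rational" strategy and observed behavior can be made concrete with a small sketch. This is a hedged illustration, not the researchers' code; the 50 percent rejection rate for offers of 20 percent or less is the behavioral finding quoted above:

```python
import random

def rational_responder(offer, total):
    # A purely self-interested responder accepts any nonzero offer.
    return offer > 0

def human_responder(offer, total, rng=random):
    # Observed behavior: lowball offers (20% of the pot or less) are
    # rejected about half the time; better offers are accepted.
    if offer / total <= 0.20:
        return rng.random() < 0.5
    return True

# A $2 offer out of $10: the rational responder always takes it,
# while the human responder walks away roughly half the time.
```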

It makes sense that we are so sensitive to being cheated, notes Matthew Lieberman, a professor of psychology at the University of California at Los Angeles.

“Mammalian survival depends on social bonds, and fairness is a really important social cue,” he says. Inequitable treatment might be an important sign that we are not valued by the group, he says, so we had better pay attention.

In response to unfair offers, the brain activates the pain detection process that takes place in the multitasking dACC. Interestingly, it also engages the bilateral anterior insula, an area implicated in negative emotions such as anger, disgust, and social rejection. The picture that emerges from fMRI is that of a brain weighing an emotional response (the urge to punish the guy who cheated you) against a logical response (the appeal of the cash).

When Lieberman increased the money being offered, he found that accepting a share that was larger but still unfair—say, $8 out of $23—was linked not with reward circuitry but with increased activity in the ventrolateral prefrontal cortex and downregulation of the anterior insula, changes often seen during the regulation of negative feelings. People seemed to be suppressing their indignant reaction in order to accept a reward that was inequitable but appealing. Similarly, getting a fair offer—even if it was small in absolute terms—activated regions in the brain such as the ventral striatum and the ventromedial prefrontal cortex that are involved in automatic and intuitive reward processing. Justice apparently feels good.

ENVY

The sin of pride turned on its head, envy is the most social of the moral failures, sparked by the excruciating awareness of someone else’s supreme talent, stunning looks, or extremely expensive car. For that reason, it is also the least fun of the deadly sins; feeling jealous provides no dirty thrill.

Only one imaging study (conducted by Takahashi’s group in Japan) has probed the neural basis of envy. Volunteers in fMRI machines were asked to read three scenarios. In the first, “student A” was portrayed as similar to, but better than, the volunteer in every respect. “Student B” was depicted as equally successful but very different from the subject, and “student C” sounded pretty much like a loser. Reading about the awe-inspiring student A activated the volunteers’ conflict-detecting dACC brain region, perhaps responding to the gap between the default setting of self-aggrandizing pride and the ugly truth of someone else’s triumphs. This same region is enlisted when feeling pain, suggesting to Takahashi that envy is a kind of “social pain in the self.”

On the other hand, indulging in schadenfreude—the delight in someone else’s downfall—can be downright blissful. Aquinas termed this “morose delectation” and condemned it as a failure to resist a passion. Indeed, Takahashi found that rejoicing in a rival’s defeat brings pleasure just as surely as envy does pain. In the second phase of his study, volunteers read about student A’s downfall, causing the ventral striatum to light up. The striatum is part of the so-called reward system, which can be activated by such pleasures as money, food, or sex, Takahashi says. “Schadenfreude is a social reward.” The stronger the dACC activation in the first study, the stronger the striatum response in the second.

WRATH

It may not have been the original sin, but rage is certainly primordial: You would think that lust and gluttony would predate any emotion, but much of the brain circuitry active during anger is very basic and very fast. In humans, anger enlists the conflict-detecting dACC, which immediately alerts other regions of the brain to pay attention. The more upset you get, the more it activates, found Tom Denson, a psychologist at the University of New South Wales in Australia. In people with short fuses, this part of the brain seems to be primed to feel provocation and personal slights, Denson says.

Some of us are more easily enraged than others, but few are able to stifle rage completely. Instead we may convert overt hostility into angry brooding. To investigate the difference between short fusers and brooders, Denson antagonized his lab volunteers, insulting them while he scanned their brains. “Within seconds you see differences,” Denson says. The medial prefrontal cortex, associated with self-awareness and emotional regulation, quickly lit up in angry brooders. So did the hippocampus, involved in memory. As they fume, people repeatedly relive the event in their minds. Denson found that the degree of hippocampal activation predicted how much people tended to ruminate.

Probing the underpinnings of vengeful behavior, a German group led by neuropsychologist Ulrike Krämer allowed people who had been provoked during an experiment to punish their antagonist with a blast of extremely annoying noise. While the subjects pondered how loud to set the volume, the dorsal striatum, part of the brain’s reward circuitry, lit up at the prospect of retaliation. “We have this primitive brain that says, ‘Do it! Do it!’” Denson says. Similarly, people asked to imagine themselves engaging in aggressive behavior actively suppress activity in the prefrontal cortex, where social information is processed. By deliberately inhibiting our natural social response, we make ourselves detached enough to strike out.

Historically, moralists have not paid much heed to the findings of science, and it is safe to say that all the brain-scans in the world probably will not persuade modern theologians to recalculate the wages of sin. But they might want to pay heed to one recent finding from modern neuroimaging: It turns out that acting virtuously does not really require a hair shirt. In fact, research suggests it feels pretty good.

Jordan Grafman recently found that virtue literally is its own reward. Altruistic behavior sends reward-related brain systems into a pleasurable tizzy—even more so than the prospect of self-interested gain. “The big punch line is that all things being equal, your reward system fires off a lot more when you’re giving than when you’re taking,” says Grafman, who is chief of the cognitive neuroscience section at the National Institute of Neurological Disorders and Stroke. Call it the dirty little secret about being good: It might be even more fun than being wicked.

My Comment: Not much comment needed about the hard wiring. The Torah takes this further, showing us the fine tuning of these general findings, to the point of 613 commandments, and tells us that we in fact have the ability to make choices, albeit very difficult choices. That is, the fight is intense. None of this is easy. Should we expect others to constantly make the choice to do good? Should we expect it of ourselves? Probably not. At most, we can hope folks embrace the struggle. It's not for the faint of heart. But the Torah adds another aspect to this article. This article shows the complexity of doing good, all of the biological processes that must be overcome. Fine. The Torah tells us that it's even more difficult than this. This article just gives us the top layer of the whole phenomenon.

Monday, October 12, 2009

The Slow, Methodical, Slow, Precise, Slow, Scientific March to Discover Chi

Physicists Measure Elusive 'Persistent Current' That Flows Forever

ScienceDaily (Oct. 12, 2009)
— Physicists at Yale University have made the first definitive measurements of “persistent current,” a small but perpetual electric current that flows naturally through tiny rings of metal wire even without an external power source.

The team used nanoscale cantilevers, an entirely novel approach, to indirectly measure the current through changes in the magnetic force it produces as it flows through the ring. “They’re essentially little floppy diving boards with the rings sitting on top,” said team leader Jack Harris, associate professor of physics and applied physics at Yale. The findings appear in the October 9 issue of Science.

The counterintuitive current is the result of a quantum mechanical effect that influences how electrons travel through metals, and arises from the same kind of motion that allows the electrons inside an atom to orbit the nucleus forever.

“These are ordinary, non-superconducting metal rings, which we typically think of as resistors,” Harris said. “Yet these currents will flow forever, even in the absence of an applied voltage.”

Although persistent current was first theorized decades ago, it is so faint and sensitive to its environment that physicists were unable to accurately measure it until now. It is not possible to measure the current with a traditional ammeter because it only flows within the tiny metal rings, which are about the same size as the wires used on computer chips.

Past experiments tried to indirectly measure persistent current via the magnetic field it produces (any current passing through a metal wire produces a magnetic field). They used extremely sensitive magnetometers known as superconducting quantum interference devices, or SQUIDs, but the results were inconsistent and even contradictory.
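The scale of the problem is easy to see from the textbook formula for the magnetic moment of a current loop, mu = I * pi * r^2. The numbers below are illustrative, not taken from the paper, but they show why such signals strain even a SQUID:

```python
import math

MU_B = 9.274e-24  # Bohr magneton, in A*m^2

def ring_moment(current_amps, radius_m):
    # Magnetic moment of a circular current loop: mu = I * pi * r^2.
    return current_amps * math.pi * radius_m ** 2

# An illustrative nanoampere-scale current in a ring about 1 micron across:
mu = ring_moment(1e-9, 0.5e-6)  # roughly 7.9e-22 A*m^2
mu_in_bohr = mu / MU_B          # only about 85 Bohr magnetons
```

A moment of mere tens of Bohr magnetons is comparable to a small cluster of atomic spins, which is why betting a mechanical detector could resolve it was so ambitious.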

“SQUIDs had long been established as the tool used to measure extremely weak magnetic fields. It was extremely optimistic for us to think that a mechanical device could be more sensitive than a SQUID,” Harris said.

The team used the cantilevers to detect changes in the magnetic field produced by the current as it changed direction in the aluminum rings. This new experimental setup allowed the team to make measurements a full order of magnitude more precise than any previous attempts. They also measured the persistent current over a wider range of temperature, ring size and magnetic field than ever before.

“These measurements could tell us something about how electrons behave in metals,” Harris said, adding that the findings could lead to a better understanding of how qubits, used in quantum computing, are affected by their environment, as well as which metals could potentially be used as superconductors.

Authors of the paper include Ania Bleszynski-Jayich, William Shanks, Bruno Peaudecerf, Eran Ginossar, Leonid Glazman and Jack Harris (all of Yale University) and Felix von Oppen (Freie Universität Berlin).

Wednesday, September 30, 2009

Is Science on the Way to Discovering Chi?

DNA Is Dynamic and Has High Energy; Not Stiff Or Static As First Envisioned

ScienceDaily (July 14, 2009) — The famous model of the structure of DNA is a stiff snapshot of idealized DNA. As researchers from Baylor College of Medicine and the University of Houston note in a report that appears online in the journal Nucleic Acids Research, DNA is not stiff or static. It is dynamic, with high energy. It exists naturally in a slightly underwound state, and its status changes in waves generated by normal cell functions such as DNA replication, transcription, repair and recombination.

DNA is also accompanied by a cloud of counterions (charged particles that neutralize the genetic material's very negative charge) and, of course, the protein macromolecules that affect DNA activity.

"Many models and experiments have been interpreted with the static model," said Dr. Lynn Zechiedrich, associate professor of molecular virology and microbiology at BCM and a senior author of the report. "But this model does not allow for the fact that DNA in real life is transiently underwound and overwound in its natural state."

DNA appears to be a perfect spring that can be stretched and then snap back to its original conformation. How far can you stretch it before something happens to the structure and it cannot bounce back? What happens when it is exposed to the normal cellular stresses involved in doing its job? That was the problem that Zechiedrich and her colleagues tackled.

Their results also address a question posed by the late Nobel laureate Dr. Linus Pauling, who asked how the information encoded by the bases could be read if it is sequestered inside the DNA molecule with the phosphate backbone on the outside.

It's easy to explain when the cell divides because the double-stranded DNA also divides at the behest of a special enzyme, making its genetic code readily readable.

"Many cellular activities, however, do not involve the separation of the two strands of DNA," said Zechiedrich.

To unravel the problem, former graduate student Dr. Graham L. Randall, mentored jointly by Zechiedrich and Dr. B. Montgomery Pettitt of UH, simulated 19 independent DNA systems with fixed degrees of underwinding or overwinding, using a special computer analysis started by Pettitt.
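The "fixed degrees of underwinding or overwinding" in such simulations are conventionally expressed as superhelical density, sigma = (Lk - Lk0)/Lk0, where Lk0 is the relaxed linking number (one helical turn per roughly 10.5 base pairs) and negative sigma means underwound. A minimal sketch of the convention; the specific values below are illustrative, not the paper's:

```python
BP_PER_TURN = 10.5  # helical repeat of relaxed B-DNA, in base pairs

def superhelical_density(linking_number, num_base_pairs):
    # sigma = (Lk - Lk0) / Lk0; negative sigma means underwound DNA.
    lk0 = num_base_pairs / BP_PER_TURN
    return (linking_number - lk0) / lk0

# A hypothetical 1050-bp circle relaxes at Lk0 = 100 turns.
# Removing six turns gives sigma of about -0.06, close to the
# slightly underwound natural state mentioned above.
sigma = superhelical_density(94, 1050)
```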

They found that when DNA is underwound in the same manner that you might underwind a spring, the forces induce one of two bases – adenine or thymine – to "flip out" of the sequence, thus relieving the stress that the molecule experiences.

"It always happens in the underwound state," said Zechiedrich. "We wanted to know if torsional stress was the force that accounted for the base flipping that others have seen occur, but for which we had no idea where the energy was supplied to do this very big job."

When the base flips out, it relieves the stress on the DNA, which then relaxes the rest of the DNA not involved in the base flipping back to its "perfect spring" state.

When the molecule is overwound, it assumes a "Pauling-like DNA" state in which the DNA turns itself inside out to expose the bases -- much in the way Pauling had predicted.

Zechiedrich and her colleagues theorize that the base flipping, denaturation, and Pauling-like DNA caused by under- and overwinding allow DNA to interact with proteins during processes such as replication, transcription and recombination, and allow the code to be read. And as for the idea of the "perfect spring" behavior of the DNA helix: "This notion is entirely wrong," said Zechiedrich. "Underwinding is not equal and opposite to overwinding, as predicted, not by a long shot. That's really a cool result that Graham got."

Support for this work came from the Robert A. Welch Foundation, the National Institutes of Health and the Keck Center for Interdisciplinary Bioscience Training of the Gulf Coast Consortia. The computations were performed in part using the Teragrid and the Molecular Science Computing Facility in the William R. Wiley Environmental Molecular Sciences Laboratory, sponsored by the U.S. Department of Energy's Office of Biological and Environmental Research and located at the Pacific Northwest National Laboratory.


My Comment: Slowly, tiny step after tiny step, is Western Science getting closer to ‘discovering’ what the ancient cultures have long known, that a Life Force or Chi exists?

Tuesday, September 29, 2009

Hard Wired for Life

Brain Innately Separates Living And Non-living Objects For Processing

ScienceDaily (Aug. 14, 2009) — For unknown reasons, the human brain distinctly separates the handling of images of living things from images of non-living things, processing each image type in a different area of the brain. For years, many scientists have assumed the brain segregated visual information in this manner to optimize processing the images themselves, but new research shows that even in people who have been blind since birth the brain still separates the concepts of living and non-living objects.

The research, published in the Cell Press journal Neuron, implies that the brain categorizes objects based on the different types of subsequent consideration they demand—such as whether an object is edible, or is a landmark on the way home, or is a predator to run from. They are not categorized entirely by their appearance.

"If both sighted people and people with blindness process the same ideas in the same parts of the brain, then it follows that visual experience is not necessary in order for those aspects of brain organization to develop," says Bradford Mahon, postdoctoral fellow in the Department of Brain and Cognitive Sciences at the University of Rochester, and lead author of the study. "We think this means significant parts of the brain are innately structured around a few domains of knowledge that were critical in humans' evolutionary history."

Previous studies have shown that the sight of certain objects, such as a table or mountain, activates different regions of the brain than the sight of living objects, such as an animal or face, does. But why the brain would choose to process these two categories differently has remained a mystery, says Mahon. Since the regions were known to activate when the objects were seen, scientists wondered if something about the visual appearance of the objects determined how the brain would process them. For instance, says Mahon, most living things have curved forms, and so many scientists thought the brain prefers to process images of living things in an area that is optimized for curved forms.

To see if the appearance of objects is indeed key to how the brain conducts its processing, Mahon and his team, led by Alfonso Caramazza, director of the Cognitive Neuropsychology Laboratory at Harvard University, asked people who have been blind since birth to think about certain living and non-living objects. These people had no visual experience at all, so their brains necessarily determined where to do the processing using some criteria other than an object's appearance.

"When we looked at the MRI scans, it was pretty clear that blind people and sighted people were dividing up living and non-living processing in the same way," says Mahon. "We think these findings strongly encourage the view that the human brain's organization innately anticipates the different types of computations that must be carried out for different types of objects."

Mahon thinks it's possible that other parts of the human brain are innately structured around categories of knowledge that may have been important in human evolution. For instance, he says, facial expressions need a specific kind of processing linked to understanding emotions, whereas a landmark needs to be processed in conjunction with a sense of spatial awareness. The brain might choose to process these things in different areas of the brain because those areas have strong connections to other processing centers specializing in emotion or spatial awareness, says Mahon.

Mahon is now working on new experiments designed to further our understanding of how the brain represents knowledge of different classes of objects, both in sighted and blind individuals, as well as in stroke patients.
The data for the study were collected at the Center for Mind/Brain Sciences at the University of Trento in Italy.


My Comment: Here is another area of agreement between the latest discoveries within science and our ancient teachings. Somewhere inside of us, in the case of the article, within the brain, and for Jews, not just the brain but more, we are hard wired for things. I say ‘things’ in a vague way because in both science and within us, we may be aware of being hard wired but we aren’t terribly aware of what we are wired for. Nor are we aware of what the wiring is. There is a disconnect between this wiring and our conscious awareness. In the Torah the joke is on us—because Gd essentially spells out the hard wire schemata, but our awareness is in such shambles that we can’t see it, or we misinterpret it, even though we read it every year. The Torah makes a clear connection between choosing life AND good over death AND evil. And you see how easily civilization botches the understanding of this simple statement. The good news is that we have the understanding within us, the difficult part is healing the chasm. And this is life.

Tuesday, September 22, 2009

Anomalies and Miracles

13 Bits of Science that Do Not Make Sense

1 The placebo effect

Don't try this at home. Several times a day, for several days, you induce pain in someone. You control the pain with morphine until the final day of the experiment, when you replace the morphine with saline solution. Guess what? The saline takes the pain away.

This is the placebo effect: somehow, sometimes, a whole lot of nothing can be very powerful. Except it's not quite nothing. When Fabrizio Benedetti of the University of Turin in Italy carried out the above experiment, he added a final twist by adding naloxone, a drug that blocks the effects of morphine, to the saline. The shocking result? The pain-relieving power of saline solution disappeared.

So what is going on? Doctors have known about the placebo effect for decades, and the naloxone result seems to show that the placebo effect is somehow biochemical. But apart from that, we simply don't know.

Benedetti has since shown that a saline placebo can also reduce tremors and muscle stiffness in people with Parkinson's disease. He and his team measured the activity of neurons in the patients' brains as they administered the saline. They found that individual neurons in the subthalamic nucleus (a common target for surgical attempts to relieve Parkinson's symptoms) began to fire less often when the saline was given, and with fewer "bursts" of firing - another feature associated with Parkinson's. The neuron activity decreased at the same time as the symptoms improved: the saline was definitely doing something.

We have a lot to learn about what is happening here, Benedetti says, but one thing is clear: the mind can affect the body's biochemistry. "The relationship between expectation and therapeutic outcome is a wonderful model to understand mind-body interaction," he says. Researchers now need to identify when and where placebo works. There may be diseases in which it has no effect. There may be a common mechanism in different illnesses. As yet, we just don't know.

2 The horizon problem

Our universe appears to be unfathomably uniform. Look across space from one edge of the visible universe to the other, and you'll see that the microwave background radiation filling the cosmos is at the same temperature everywhere. That may not seem surprising until you consider that the two edges are nearly 28 billion light years apart and our universe is only 14 billion years old.

Nothing can travel faster than the speed of light, so there is no way heat radiation could have traveled between the two horizons to even out the hot and cold spots created in the big bang and leave the thermal equilibrium we see now.
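The arithmetic behind the problem is simple enough to sketch. The check below uses the article's round numbers and a naive static-space picture (cosmic expansion actually makes the real discrepancy worse, not better):

```python
# Back-of-envelope check of the horizon problem (static-space approximation).
AGE_UNIVERSE_GYR = 14.0  # approximate age of the universe, in billions of years

# Maximum distance light could have travelled since the big bang,
# in billions of light years (one light year per year):
light_travel_limit_gly = AGE_UNIVERSE_GYR

# Separation of opposite edges of the visible universe (naive picture):
edge_separation_gly = 2 * AGE_UNIVERSE_GYR  # ~28 billion light years

# No signal can ever have crossed from one edge to the other:
assert edge_separation_gly > light_travel_limit_gly
print(f"Edges are {edge_separation_gly:.0f} Gly apart, "
      f"but light covers at most {light_travel_limit_gly:.0f} Gly")
```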

This "horizon problem" is a big headache for cosmologists, so big that they have come up with some pretty wild solutions. "Inflation", for example.
You can solve the horizon problem by having the universe expand ultra-fast for a time, just after the big bang, blowing up by a factor of 10^50 in 10^-33 seconds. But is that just wishful thinking? "Inflation would be an explanation if it occurred," says University of Cambridge astronomer Martin Rees. The trouble is that no one knows what could have made that happen.
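For a sense of scale, the quoted growth factor can be restated in the "e-foldings" cosmologists usually count inflation in; this is just a logarithm applied to the article's figures:

```python
import math

EXPANSION_FACTOR = 1e50  # growth factor quoted in the article
DURATION_S = 1e-33       # duration quoted in the article, in seconds

# One e-folding is growth by a factor of e, so N = ln(expansion factor).
e_folds = math.log(EXPANSION_FACTOR)
print(f"{e_folds:.0f} e-foldings in {DURATION_S:g} seconds")
```

Roughly 115 e-foldings, comfortably more than the sixty or so usually quoted as the minimum needed to iron out the horizon problem.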

So, in effect, inflation solves one mystery only to invoke another. A variation in the speed of light could also solve the horizon problem - but this too is impotent in the face of the question "why?" In scientific terms, the uniform temperature of the background radiation remains an anomaly.


3 Ultra-energetic cosmic rays

For more than a decade, physicists in Japan have been seeing cosmic rays that should not exist. Cosmic rays are particles - mostly protons but sometimes heavy atomic nuclei - that travel through the universe at close to the speed of light. Some cosmic rays detected on Earth are produced in violent events such as supernovae, but we still don't know the origins of the highest-energy cosmic rays, which are the most energetic particles ever seen in nature. But that's not the real mystery.

As cosmic-ray particles travel through space, they lose energy in collisions with the low-energy photons that pervade the universe, such as those of the cosmic microwave background radiation. Einstein's special theory of relativity dictates that any cosmic rays reaching Earth from a source outside our galaxy will have suffered so many energy-shedding collisions that their maximum possible energy is 5 × 10^19 electronvolts. This is known as the Greisen-Zatsepin-Kuzmin (GZK) limit.
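To get a feel for how extreme that cutoff is, the limit can be converted into everyday units; the sketch below uses only the standard electronvolt-to-joule conversion:

```python
EV_TO_JOULE = 1.602e-19  # standard conversion: one electronvolt in joules
GZK_LIMIT_EV = 5e19      # Greisen-Zatsepin-Kuzmin limit from the text

energy_joules = GZK_LIMIT_EV * EV_TO_JOULE
print(f"GZK limit ~ {energy_joules:.1f} J per particle")
# About 8 joules: the kinetic energy of a thrown ball,
# carried by a single subatomic particle.
```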

Over the past decade, however, the University of Tokyo's Akeno Giant Air Shower Array - 111 particle detectors spread out over 100 square kilometres - has detected several cosmic rays above the GZK limit. In theory, they can only have come from within our galaxy, avoiding an energy-sapping journey across the cosmos. However, astronomers can find no source for these cosmic rays in our galaxy. So what is going on?

One possibility is that there is something wrong with the Akeno results. Another is that Einstein was wrong. His special theory of relativity says that space is the same in all directions, but what if particles found it easier to move in certain directions? Then the cosmic rays could retain more of their energy, allowing them to beat the GZK limit.

Physicists at the Pierre Auger experiment in Mendoza, Argentina, are now working on this problem. Using 1600 detectors spread over 3000 square kilometres, Auger should be able to determine the energies of incoming cosmic rays and shed more light on the Akeno results.

Alan Watson, an astronomer at the University of Leeds, UK, and spokesman for the Pierre Auger project, is already convinced there is something worth following up here. "I have no doubts that events above 10^20 electronvolts exist. There are sufficient examples to convince me," he says. The question now is, what are they? How many of these particles are coming in, and what direction are they coming from? Until we get that information, there's no telling how exotic the true explanation could be.


4 Belfast homeopathy results

Madeline Ennis, a pharmacologist at Queen's University, Belfast, was the scourge of homeopathy. She railed against its claims that a chemical remedy could be diluted to the point where a sample was unlikely to contain a single molecule of anything but water, and yet still have a healing effect. Until, that is, she set out to prove once and for all that homeopathy was bunkum.

In her most recent paper, Ennis describes how her team looked at the effects of ultra-dilute solutions of histamine on human white blood cells involved in inflammation. These "basophils" release histamine when the cells are under attack. Once released, the histamine stops them releasing any more. The study, replicated in four different labs, found that homeopathic solutions - so dilute that they probably didn't contain a single histamine molecule - worked just like histamine. Ennis might not be happy with the homeopaths' claims, but she admits that an effect cannot be ruled out.

So how could it happen? Homeopaths prepare their remedies by dissolving things like charcoal, deadly nightshade or spider venom in ethanol, and then diluting this "mother tincture" in water again and again. No matter what the level of dilution, homeopaths claim, the original remedy leaves some kind of imprint on the water molecules. Thus, however dilute the solution becomes, it is still imbued with the properties of the remedy.
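The dilution arithmetic behind the skepticism is easy to make concrete. The sketch below assumes a starting dose of one mole and the common "30C" potency (thirty successive 1-in-100 dilutions); neither figure is from the article, but any comparable dilution gives the same conclusion:

```python
AVOGADRO = 6.022e23  # molecules per mole

def molecules_remaining(moles_start: float, centesimal_dilutions: int) -> float:
    """Expected molecules of the original substance left after repeated
    1-in-100 ('centesimal') dilutions, assuming perfect mixing."""
    return moles_start * AVOGADRO / 100 ** centesimal_dilutions

# One mole of remedy, diluted thirty times at 1:100 (a typical '30C' potency):
expected = molecules_remaining(1.0, 30)
print(f"Expected molecules left: {expected:.1e}")  # far below one molecule
```

Past roughly the twelfth 1:100 dilution, the expected count drops below a single molecule; by 30C it is vanishingly small, which is exactly why any residual biological effect is so puzzling.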

You can understand why Ennis remains skeptical. And it remains true that no homeopathic remedy has ever been shown to work in a large randomised placebo-controlled clinical trial. But the Belfast study (Inflammation Research, vol 53, p 181) suggests that something is going on. "We are," Ennis says in her paper, "unable to explain our findings and are reporting them to encourage others to investigate this phenomenon." If the results turn out to be real, she says, the implications are profound: we may have to rewrite physics and chemistry.

5 Dark matter

Take our best understanding of gravity, apply it to the way galaxies spin, and you'll quickly see the problem: the galaxies should be falling apart. Galactic matter orbits around a central point because its mutual gravitational attraction creates centripetal forces. But there is not enough mass in the galaxies to produce the observed spin.

Vera Rubin, an astronomer working at the Carnegie Institution's department of terrestrial magnetism in Washington DC, spotted this anomaly in the late 1970s. The best response from physicists was to suggest there is more stuff out there than we can see. The trouble was, nobody could explain what this "dark matter" was.

And they still can't. Although researchers have made many suggestions about what kind of particles might make up dark matter, there is no consensus. It's an embarrassing hole in our understanding. Astronomical observations suggest that dark matter must make up about 90 per cent of the mass in the universe, yet we are astonishingly ignorant of what that 90 per cent is.

Maybe we can't work out what dark matter is because it doesn't actually exist. That's certainly the way Rubin would like it to turn out. "If I could have my pick, I would like to learn that Newton's laws must be modified in order to correctly describe gravitational interactions at large distances," she says. "That's more appealing than a universe filled with a new kind of sub-nuclear particle."



6 Viking's methane

July 20, 1976. Gilbert Levin is on the edge of his seat. Millions of kilometres away on Mars, the Viking landers have scooped up some soil and mixed it with carbon-14-labelled nutrients. The mission's scientists have all agreed that if Levin's instruments on board the landers detect emissions of carbon-14-containing methane from the soil, then there must be life on Mars.

Viking reports a positive result. Something is ingesting the nutrients, metabolising them, and then belching out gas laced with carbon-14.
So why no party?

Because another instrument, designed to identify organic molecules considered essential signs of life, found nothing. Almost all the mission scientists erred on the side of caution and declared Viking's discovery a false positive. But was it?
The arguments continue to rage, but results from NASA's latest rovers show that the surface of Mars was almost certainly wet in the past and therefore hospitable to life. And there is plenty more evidence where that came from, Levin says. "Every mission to Mars has produced evidence supporting my conclusion. None has contradicted it."

Levin stands by his claim, and he is no longer alone. Joe Miller, a cell biologist at the University of Southern California in Los Angeles, has re-analysed the data and he thinks that the emissions show evidence of a circadian cycle. That is highly suggestive of life.

Levin is petitioning ESA and NASA to fly a modified version of his mission to look for "chiral" molecules. These come in left or right-handed versions: they are mirror images of each other. While biological processes tend to produce molecules that favour one chirality over the other, non-living processes create left and right-handed versions in equal numbers. If a future mission to Mars were to find that Martian "metabolism" also prefers one chiral form of a molecule to the other, that would be the best indication yet of life on Mars.


7 Tetraneutrons

Four years ago, a particle accelerator in France detected six particles that should not exist (see Ghost in the atom). They are called tetraneutrons: four neutrons that are bound together in a way that defies the laws of physics.

Francisco Miguel Marquès and colleagues at the Ganil accelerator in Caen are now gearing up to do it again. If they succeed, these clusters may oblige us to rethink the forces that hold atomic nuclei together.

The team fired beryllium nuclei at a small carbon target and analysed the debris that shot into surrounding particle detectors. They expected to see evidence for four separate neutrons hitting their detectors. Instead the Ganil team found just one flash of light in one detector. And the energy of this flash suggested that four neutrons were arriving together at the detector. Of course, their finding could have been an accident: four neutrons might just have arrived in the same place at the same time by coincidence. But that's ridiculously improbable.

Not as improbable as tetraneutrons, some might say, because in the standard model of particle physics tetraneutrons simply can't exist. According to the Pauli exclusion principle, not even two protons or neutrons in the same system can have identical quantum properties. In fact, the strong nuclear force that would hold them together is tuned in such a way that it can't even hold two lone neutrons together, let alone four. Marquès and his team were so bemused by their result that they buried the data in a research paper that was ostensibly about the possibility of finding tetraneutrons in the future (Physical Review C, vol 65, p 44006).

And there are still more compelling reasons to doubt the existence of tetraneutrons. If you tweak the laws of physics to allow four neutrons to bind together, all kinds of chaos ensues (Journal of Physics G, vol 29, L9). It would mean that the mix of elements formed after the big bang was inconsistent with what we now observe and, even worse, the elements formed would have quickly become far too heavy for the cosmos to cope. "Maybe the universe would have collapsed before it had any chance to expand," says Natalia Timofeyuk, a theorist at the University of Surrey in Guildford, UK.

There are, however, a couple of holes in this reasoning. Established theory does allow the tetraneutron to exist - though only as a ridiculously short-lived particle. "This could be a reason for four neutrons hitting the Ganil detectors simultaneously," Timofeyuk says. And there is other evidence that supports the idea of matter composed of multiple neutrons: neutron stars. These bodies, which contain an enormous number of bound neutrons, suggest that as yet unexplained forces come into play when neutrons gather en masse.

8 The Pioneer anomaly

This is a tale of two spacecraft. Pioneer 10 was launched in 1972; Pioneer 11 a year later. By now both craft should be drifting off into deep space with no one watching. However, their trajectories have proved far too fascinating to ignore.
That's because something has been pulling - or pushing - on them, causing them to speed up. The resulting acceleration is tiny, less than a nanometre per second per second. That's equivalent to just one ten-billionth of the gravity at Earth's surface, but it is enough to have shifted Pioneer 10 some 400,000 kilometres off track. NASA lost touch with Pioneer 11 in 1995, but up to that point it was experiencing exactly the same deviation as its sister probe. So what is causing it?
Nobody knows. Some possible explanations have already been ruled out, including software errors, the solar wind or a fuel leak. If the cause is some gravitational effect, it is not one we know anything about. In fact, physicists are so completely at a loss that some have resorted to linking this mystery with other inexplicable phenomena.
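The article's figures are self-consistent, as a quick kinematics check shows. The sketch below assumes the commonly quoted anomalous acceleration of about 8.7 × 10^-10 m/s^2 (just under a nanometre per second per second) acting over roughly 30 years of flight; both numbers are assumptions filled in for illustration, not values stated in the text:

```python
# Consistency check on the Pioneer anomaly figures.
ANOMALOUS_ACCEL = 8.7e-10  # m/s^2 -- assumed value, just under 1 nm/s^2
YEARS = 30                 # rough mission duration -- assumed
SECONDS_PER_YEAR = 3.156e7

t = YEARS * SECONDS_PER_YEAR
# Constant-acceleration kinematics: extra distance = (1/2) a t^2
drift_km = 0.5 * ANOMALOUS_ACCEL * t**2 / 1000
print(f"Predicted drift after {YEARS} years: ~{drift_km:,.0f} km")
```

The result lands close to the 400,000 kilometres quoted for Pioneer 10, which is why such a minuscule acceleration is detectable at all.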

Bruce Bassett of the University of Portsmouth, UK, has suggested that the Pioneer conundrum might have something to do with variations in alpha, the fine structure constant. Others have talked about it as arising from dark matter - but since we don't know what dark matter is, that doesn't help much either. "This is all so maddeningly intriguing," says Michael Martin Nieto of the Los Alamos National Laboratory. "We only have proposals, none of which has been demonstrated."

Nieto has called for a new analysis of the early trajectory data from the craft, which he says might yield fresh clues. But to get to the bottom of the problem what scientists really need is a mission designed specifically to test unusual gravitational effects in the outer reaches of the solar system. Such a probe would cost between $300 million and $500 million and could piggyback on a future mission to the outer reaches of the solar system (www.arxiv.org/gr-qc/0411077).

"An explanation will be found eventually," Nieto says. "Of course I hope it is due to new physics - how stupendous that would be. But once a physicist starts working on the basis of hope he is heading for a fall." Disappointing as it may seem, Nieto thinks the explanation for the Pioneer anomaly will eventually be found in some mundane effect, such as an unnoticed source of heat on board the craft.


9 Dark energy

It is one of the most famous, and most embarrassing, problems in physics. In 1998, astronomers discovered that the universe is expanding at ever faster speeds. It's an effect still searching for a cause - until then, everyone thought the universe's expansion was slowing down after the big bang. "Theorists are still floundering around, looking for a sensible explanation," says cosmologist Katherine Freese of the University of Michigan, Ann Arbor. "We're all hoping that upcoming observations of supernovae, of clusters of galaxies and so on will give us more clues."

One suggestion is that some property of empty space is responsible - cosmologists call it dark energy. But all attempts to pin it down have fallen woefully short. It's also possible that Einstein's theory of general relativity may need to be tweaked when applied to the very largest scales of the universe. "The field is still wide open," Freese says.

10 The Kuiper cliff

If you travel out to the far edge of the solar system, into the frigid wastes beyond Pluto, you'll see something strange. Suddenly, after passing through the Kuiper belt, a region of space teeming with icy rocks, there's nothing.
Astronomers call this boundary the Kuiper cliff, because the density of space rocks drops off so steeply. What caused it? The only answer seems to be a 10th planet. We're not talking about Quaoar or Sedna: this is a massive object, as big as Earth or Mars, that has swept the area clean of debris.

The evidence for the existence of "Planet X" is compelling, says Alan Stern, an astronomer at the Southwest Research Institute in Boulder, Colorado. But although calculations show that such a body could account for the Kuiper cliff (Icarus, vol 160, p 32), no one has ever seen this fabled 10th planet.

There's a good reason for that. The Kuiper belt is just too far away for us to get a decent view. We need to get out there and have a look before we can say anything about the region. And that won't be possible for another decade, at least. NASA's New Horizons probe, which will head out to Pluto and the Kuiper belt, is scheduled for launch in January 2006. It won't reach Pluto until 2015, so if you are looking for an explanation of the vast, empty gulf of the Kuiper cliff, watch this space.

11 The Wow signal

It was 37 seconds long and came from outer space. On 15 August 1977 it caused astronomer Jerry Ehman, then of Ohio State University in Columbus, to scrawl "Wow!" on the printout from Big Ear, Ohio State's radio telescope in Delaware. And 28 years later no one knows what created the signal. "I am still waiting for a definitive explanation that makes sense," Ehman says.

Coming from the direction of Sagittarius, the pulse of radiation was confined to a narrow range of radio frequencies around 1420 megahertz. This frequency is in a part of the radio spectrum in which all transmissions are prohibited by international agreement. Natural sources of radiation, such as the thermal emissions from planets, usually cover a much broader sweep of frequencies. So what caused it?

The nearest star in that direction is 220 light years away. If that is where it came from, it would have had to be a pretty powerful astronomical event - or an advanced alien civilization using an astonishingly large and powerful transmitter.
The fact that hundreds of sweeps over the same patch of sky have found nothing like the Wow signal doesn't mean it's not aliens. When you consider the fact that the Big Ear telescope covers only one-millionth of the sky at any time, and an alien transmitter would also likely beam out over the same fraction of sky, the chances of spotting the signal again are remote, to say the least.

Others think there must be a mundane explanation. Dan Werthimer, chief scientist for the SETI@home project, says the Wow signal was almost certainly pollution: radio-frequency interference from Earth-based transmissions. "We've seen many signals like this, and these sorts of signals have always turned out to be interference," he says. The debate continues.

12 Not-so-constant constants

In 1997 astronomer John Webb and his team at the University of New South Wales in Sydney analysed the light reaching Earth from distant quasars. On its 12-billion-year journey, the light had passed through interstellar clouds of metals such as iron, nickel and chromium, and the researchers found these atoms had absorbed some of the photons of quasar light - but not the ones they were expecting.
If the observations are correct, the only vaguely reasonable explanation is that a constant of physics called the fine structure constant, or alpha, had a different value at the time the light passed through the clouds.

But that's heresy. Alpha is an extremely important constant that determines how light interacts with matter - and it shouldn't be able to change. Its value depends on, among other things, the charge on the electron, the speed of light and Planck's constant. Could one of these really have changed?
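The dependence on those constants can be made explicit: alpha is e^2 / (4*pi*eps0*hbar*c), and plugging in present-day values recovers the familiar 1/137. A sketch using standard CODATA figures:

```python
import math

# Fundamental constants (SI / CODATA values)
e = 1.602176634e-19      # electron charge, coulombs
c = 2.99792458e8         # speed of light, m/s
h = 6.62607015e-34       # Planck's constant, J*s
hbar = h / (2 * math.pi) # reduced Planck's constant
eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

# Fine structure constant: dimensionless, so a change in alpha cannot be
# blamed unambiguously on any one of the inputs.
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"alpha = {alpha:.6f}  (~ 1/{1 / alpha:.1f})")
```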

No one in physics wanted to believe the measurements. Webb and his team have been trying for years to find an error in their results. But so far they have failed.
Webb's are not the only results that suggest something is missing from our understanding of alpha. A recent analysis of the only known natural nuclear reactor, which was active nearly 2 billion years ago at what is now Oklo in Gabon, also suggests something about light's interaction with matter has changed.

The ratio of certain radioactive isotopes produced within such a reactor depends on alpha, and so looking at the fission products left behind in the ground at Oklo provides a way to work out the value of the constant at the time of their formation. Using this method, Steve Lamoreaux and his colleagues at the Los Alamos National Laboratory in New Mexico suggest that alpha may have decreased by more than 4.5 parts in 10^8 since Oklo started up (Physical Review D, vol 69, p 121701).

There are gainsayers who still dispute any change in alpha. Patrick Petitjean, an astronomer at the Institute of Astrophysics in Paris, led a team that analysed quasar light picked up by the Very Large Telescope (VLT) in Chile and found no evidence that alpha has changed. But Webb, who is now looking at the VLT measurements, says that they require a more complex analysis than Petitjean's team has carried out. Webb's group is working on that now, and may be in a position to declare the anomaly resolved - or not - later this year.

"It's difficult to say how long it's going to take," says team member Michael Murphy of the University of Cambridge. "The more we look at these new data, the more difficulties we see." But whatever the answer, the work will still be valuable. An analysis of the way light passes through distant molecular clouds will reveal more about how the elements were produced early in the universe's history.

13 Cold fusion

After 16 years, it's back. In fact, cold fusion never really went away. Over a 10-year period from 1989, US navy labs ran more than 200 experiments to investigate whether nuclear reactions generating more energy than they consume - supposedly only possible inside stars - can occur at room temperature. Numerous researchers have since pronounced themselves believers.

With controllable cold fusion, many of the world's energy problems would melt away: no wonder the US Department of Energy is interested. In December, after a lengthy review of the evidence, it said it was open to receiving proposals for new cold fusion experiments.

That's quite a turnaround. The DoE's first report on the subject, published 15 years ago, concluded that the original cold fusion results, produced by Martin Fleischmann and Stanley Pons of the University of Utah and unveiled at a press conference in 1989, were impossible to reproduce, and thus probably false.

The basic claim of cold fusion is that dunking palladium electrodes into heavy water - in which oxygen is combined with the hydrogen isotope deuterium - can release a large amount of energy. Placing a voltage across the electrodes supposedly allows deuterium nuclei to move into palladium's crystal lattice, enabling them to overcome their natural repulsion and fuse together, releasing a blast of energy. The snag is that fusion at room temperature is deemed impossible by every accepted scientific theory.


That doesn't matter, according to David Nagel, an engineer at George Washington University in Washington DC. Superconductors took 40 years to explain, he points out, so there's no reason to dismiss cold fusion. "The experimental case is bulletproof," he says. "You can't make it go away."

My comment: Perhaps this is the biggest difference between the Institution of Science and the Institution of Religion. In both cases, things happen that violate known expectations. Science calls these things anomalies; religion calls them miracles. In science, it’s a call to do more research. In religion it’s a cause to work ourselves into a state of awe - by not doing research, not asking questions, just standing in awe of the miracle. That, at least, is the behavior of those involved in the institutions. Lay people, the casual-to-avid followers of scientific research and those deeply involved in spiritual inquiry, are not tied to the rules of the institutions; they can ask anything they would like and draw implications in any direction reason and logic take them. Personally, standing in awe of a miracle doesn’t do much for me. Inquiring into the implications of the miracle, finding that the miracle is actually an illustration of a normal scientific process that occurs, for us, at the microscopic or molecular level, or on a cosmological level - this does more for me; this makes me approach the Torah with caution and awe. For instance, Moses getting water from a rock is considered a miracle because, usually, rocks don’t have an internal fountain - until you realize that the earth itself is a rock with an internal fountain. The implication is that the universe may be structured along holographic principles, or principles not unlike fractals. When the artificial boundaries and compartments that we use to categorize knowledge begin to break down, when the knowledge begins to fall together like a beautiful puzzle, powerful emotions follow - this is what I would term ‘awe.’