All posts by Grigori Guitchounts

Runaway Selection in Birds of Paradise

I watched a program on PBS the other night about birds of paradise – exotic birds from New Guinea with elaborate displays. To attract females, males have evolved intricate feathers, courtship dances, and rituals. A Parotia male, for example, will clear out a dancing ground, and when a female is in sight, he will puff up the feathers around his chest into a sort of collar similar to those of Italian nobility of the Renaissance (or perhaps closer to a ballerina’s skirt), with bright iridescent feathers forming a shield below his neck; the long quills on his head, which usually point lazily toward his rear, stick straight up in a semicircle around his head. The most impressive part is the actual dance: with every feather aroused, he goes into a trance, shaking his head left to right, bouncing up and down on his feet, and encircling his audience, one lucky female who judges the performance and decides whether the male is a worthy mate.

Such displays by males of hundreds of bird species, each unique and captivating, are the result of millions of years of evolution, with each generation ensuring the propagation of the best displays. The best displays, in turn, are supposed to convey fitness – how successful the male’s offspring will be and how good a father he will be. In tough climates, where food resources are scarce and predatory pressure is fierce, most animals evolve to survive by being the best at finding food and hiding from predators. In New Guinea, where most of the species of birds of paradise are found, the birds have for millennia enjoyed rich nutritional resources in the dense rainforest and limited pressure from predators. This easy lifestyle has allowed extravagant features to evolve, features that have nothing to do with actual fitness and that in some species would be a handicap in a mano-a-mano encounter with a predator.

Like the birds of paradise, we humans have enjoyed a relatively pressure-free evolutionary existence over the past few hundred years. That timeframe is too short for macroscopic evolutionary change, but the idea does make me wonder what kinds of traits runaway selection might eventually endow humans with. Meanwhile, here is a video of the Parotia’s and Superb Bird of Paradise’s courtship dances:

 

A couple of weeks ago in the New York Times, David Ewing Duncan wrote an article, “How Science Can Build a Better You,” describing a brain-machine interface called Braingate that supposedly uses a tiny bed of “electrons” to read out brain activity. Scientists recently described this device’s ability to decode neural signals to control a prosthetic arm; this and other devices promise to restore mobility in paralyzed or tetraplegic patients.

However, the Braingate device actually used an array of electrodes, not electrons. An electron is a subatomic particle that carries negative charge; a flow of electrons is what constitutes an electric current. Electrodes, on the other hand, are tiny conductors that measure changes in electrical potential.

While the spelling difference is trivial, the semantic error is significant. Writing about science is a challenge for those with no training in science, and so is copy-editing it; the complexity of the subject should push journalists to reach a level of expertise in their beat before bringing their reports to the world. On the other side, American readers should have enough basic science education to know the difference between electrodes and electrons, and should not risk being branded as nerds for pointing out such mistakes. Investment in early childhood education is critical for that basic science literacy, and the upcoming presidential election will help determine whether Americans choose “electrodes” over “electrons”.

Lost Thoughts in the Wake-Sleep Transition

I’ve been meaning to write about a curious phenomenon I experience every time I go to sleep. Lying in bed last night, I was thinking about a movie I had just finished watching – The Aviator, a great movie! – and was overcome by a sudden frustration: some idea that had been running through my mind had simply vanished, replaced by something silly and mundane. Trying desperately to remember what I had just been thinking about, I could find no trace of my thoughts. It was as if they had never been recorded. This happened several times, until I finally gave up and fell asleep. Even more perplexing is that I am aware of those lost thoughts: I know something is missing. I just can’t remember what it was.

If these aren’t freak phenomena, one can imagine something in the wake-sleep transition that messes with short-term memory. It’s as if whatever network or assembly represents the would-be memory doesn’t undergo the short-term plasticity necessary to “solidify” those connections. That is of course overly simplistic and probably misleading language, but it’s one way to think about it. Perhaps this could be (or already has been?) analyzed in rats, along the lines of the hippocampal “replay” or reactivation of experienced events during sleep, as in this paper by Matt Wilson of MIT. One could look for lost thoughts in the wake-sleep transition by comparing the temporal structure of activity during that transition with the structure of the same activity during experience on a maze, for example, as in the sketch below. Perhaps this loss of thought depends on some subcortical “kick” that’s absent during sleep. Just a thought.
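To make that maze comparison concrete, here is a minimal sketch (my own illustration with made-up data, not an analysis from the Wilson paper) of the kind of rank-order comparison used in replay studies: rank the cells by when they first fire during the maze run, rank them again during a candidate event around the wake-sleep transition, and ask how well the two orderings agree.

```python
import numpy as np
from scipy.stats import spearmanr

def firing_order(spike_times):
    """Rank each cell by the time of its first spike within an event.

    spike_times: dict mapping cell id -> array of spike times (seconds).
    Returns a dict mapping cell id -> rank (0 = fired first).
    """
    first = {cell: times.min() for cell, times in spike_times.items() if len(times) > 0}
    order = sorted(first, key=first.get)
    return {cell: rank for rank, cell in enumerate(order)}

def sequence_similarity(template_spikes, event_spikes):
    """Spearman rank correlation between the firing orders of two events,
    computed over the cells that are active in both."""
    t_rank = firing_order(template_spikes)
    e_rank = firing_order(event_spikes)
    common = sorted(set(t_rank) & set(e_rank))
    if len(common) < 3:
        return np.nan  # too few shared cells to say anything
    rho, _ = spearmanr([t_rank[c] for c in common],
                       [e_rank[c] for c in common])
    return rho

# Hypothetical data: cells 0-4 fire in order during the maze run,
# but in a scrambled order during a wake-sleep transition event.
maze = {c: np.array([0.1 * (c + 1)]) for c in range(5)}
transition = {0: np.array([0.30]), 1: np.array([0.05]), 2: np.array([0.20]),
              3: np.array([0.40]), 4: np.array([0.10])}
print(sequence_similarity(maze, transition))  # near 0: sequence not preserved
```

A correlation near 1 would suggest the transition event preserves the experienced sequence; a value near 0 would suggest the sequence (the would-be thought, in my analogy) is scrambled or lost.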

The Daily Show aired a special report by Aasif Mandvi on “an expensive lesson about bringing fish back to life,” or the dangers of leaving children with the ability to make purchases on the Apple App Store. The point of the story is that children can’t inhibit behavior as well as adults can, thanks to their underdeveloped frontal cortices, and are therefore vulnerable targets for those whose sole purpose is to make easy money, not unlike drug dealers selling to addicts who just can’t help themselves:

(Video: “Tap Fish Dealer,” The Daily Show with Jon Stewart, www.thedailyshow.com)

Christopher Hitchens writes in the January edition of Vanity Fair about what he believes to be a nonsensical maxim: “What doesn’t kill me makes me stronger.” Hitchens is suffering from esophageal cancer, which is the primary reason for his sense that he is not becoming “stronger” but is rather in terminal decline.

The phrase is attributed to Nietzsche, whose mental decline late in life, Hitchens notes, probably did not make him any stronger. Nor did the philosopher Sidney Hook consider himself stronger after a terrible experience in a hospital. Hitchens counts himself among the many who do not conquer illness and come out stronger. But there is a flaw in this reasoning: the first condition for becoming stronger is not being killed. Hitchens is thankfully still alive and kicking (i.e. writing), but he hasn’t defeated his cancer (yet, hopefully); only once the cancer is over with can Hitchens say whether he is stronger or weaker. Now is premature. The more important qualification is that “stronger” should mean mentally stronger, not physically. Diseases that target the mind specifically, like Nietzsche’s syphilis, should be discounted; all others should, one hopes, be an exercise in willpower and mental fortitude.

Whenever you think life is hard, remember Hitchens and countless others who brave horrible diseases. Stay stark, Hitch!

Hitchens’s essay may be found here.

Science, Religion and Values: Magisteria Redefined

Science and religion have been archenemies for some time now, with one on a quest for knowledge and truth, and the other seeking to fill a perceived void of meaning in people’s lives. Logical inspection confirms that the two systems are incompatible with one another, since science requires evidence for every claim, whereas religion insists on faith where there is no evidence whatsoever. Yet many people do keep both science and religion in their lives. How do they deal with the conflict? Stephen Jay Gould argued in a 1997 essay on nonoverlapping magisteria, or NOMA, that there actually is no conflict between science and religion:

“No such conflict should exist because each subject has a legitimate magisterium, or domain of teaching authority—and these magisteria do not overlap (the principle that I would like to designate as NOMA, or “nonoverlapping magisteria”).

The net of science covers the empirical universe: what is it made of (fact) and why does it work this way (theory). The net of religion extends over questions of moral meaning and value. These two magisteria do not overlap, nor do they encompass all inquiry (consider, for starters, the magisterium of art and the meaning of beauty). To cite the arch clichés, we get the age of rocks, and religion retains the rock of ages; we study how the heavens go, and they determine how to go to heaven.”

Brainy Computers

“We’re not trying to replicate the brain. That’s impossible. We don’t know how the brain works, really,” says the chief of IBM’s Cognitive Computing project, which aims to improve computing by creating brain-like computers capable of learning in real-time and consuming less power than conventional machines. No one knows how the brain works, but have the folks at IBM tried to figure it out?

It seems strange to say that it’s impossible to replicate the brain, especially coming from a man whose blog’s caption reads, “to engineer the mind by reverse engineering the brain.” Perhaps I’m picking at his words – replicating and reverse engineering are different things: to replicate is to copy exactly, while reverse engineering isn’t as strict, since it’s concerned with macroscopic function rather than microscopic structure. But of all the things that seem conceptually impossible today, “engineering the mind” is surely the winner, especially if one can’t “replicate the brain.” The chances of engineering a mind are greater the closer the system is to the brain; that’s why my MacBook, to my continual disappointment, does not have a mind.

These little trifles haven’t stopped DARPA from funding IBM and scientists elsewhere. IBM now boasts a prototype chip with 256 super-simplified integrate-and-fire “neurons” and a thousand times as many “synapses.” This architecture is capable of learning to recognize hand-drawn single-digit numbers. Its performance may not be optimal, but it is still impressive considering that the brain likely allocates far more neurons (and far more complicated ones) to the same task. On another front, the group reported using a 147,456-CPU supercomputer with 144 TB of main memory to simulate a billion neurons with ten thousand times as many synapses. Now if only they could combine these two efforts and expand their chip from 256 neurons to a billion.
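For a sense of just how simplified those “neurons” are, here is a toy leaky integrate-and-fire model in Python (a generic textbook sketch, not IBM’s actual circuit): the unit integrates its input into a membrane potential that leaks back toward rest, and it emits a spike whenever the potential crosses a threshold.

```python
import numpy as np

def lif_neuron(input_current, dt=1.0, tau=20.0, v_rest=0.0, v_reset=0.0, v_thresh=1.0):
    """Toy leaky integrate-and-fire neuron (an illustration, not IBM's circuit).

    input_current: array of input drive at each time step (arbitrary units).
    Returns the membrane-potential trace and a list of spike times.
    """
    v = v_rest
    v_trace, spike_times = [], []
    for step, drive in enumerate(input_current):
        # The membrane potential leaks back toward rest and integrates the input.
        v += (-(v - v_rest) + drive) * (dt / tau)
        if v >= v_thresh:               # threshold crossing -> emit a spike
            spike_times.append(step * dt)
            v = v_reset                 # reset the potential after the spike
        v_trace.append(v)
    return np.array(v_trace), spike_times

# Drive the unit with noisy, roughly constant input and count the spikes.
rng = np.random.default_rng(0)
drive = 1.2 + 0.5 * rng.standard_normal(1000)
_, spikes = lif_neuron(drive)
print(f"{len(spikes)} spikes in 1000 time steps")
```

That is essentially the whole model; the chip’s trick is wiring 256 such units together through a dense grid of “synapses” in low-power silicon.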

Dancing for Science

Science is difficult to understand and even more difficult to explain. John Bohannon thinks that words are inept at explaining scientific concepts and should stay out of the way. PowerPoint is useless too. Instead, Bohannon argues, scientific concepts should be explained with dance. He foresees a boost to the economy if dancers were hired as aides to presenters, not only because those dancers would have jobs, but because science would be communicated more effectively, leading to more innovation.

Bohannon presents these ideas in an engaging TEDx talk, with the help of the Black Label Movement dance team. No doubt, seeing people dance out cellular locomotion is fun and more straightforward than hearing a verbal description of the same thing. I wonder, though, whether such concepts would be more accurately portrayed and easier to understand through animations. Perhaps there is something about seeing people perform live that is more engaging than watching animations or the same performance on a screen. If that’s true, then having dancers at one’s presentations would be very helpful (it would also make the presentation stand out, if no one else has dancers).

 

 

Descriptive vs. Predictive Models

When we look back at the important advances in neuroscience in the 20th and 21st centuries, what will we remember? What will we still find useful and worth pursuing further? The field is still in its nascent stages, even a century after Ramón y Cajal showed evidence for the neuron doctrine, establishing the neuron as a fundamental unit of the nervous system, and Brodmann published the cytoarchitecture studies that convinced the world that the brain is divided into distinct areas and likely uses them to divvy up processing. Yet we still have virtually no clue how the brain works: there is no central theory and there are no cures for brain diseases; only a whole lot of curious, enthusiastic and optimistic minds, and some funding to help them get stuff done.

And it is rightly so that some neuroscientists have serious physics envy…

Whale Song Analysis Crowd-sourcing

Scientific American is collaborating with marine scientists on a project to crowd-source the analysis of whale songs and calls. Having gathered thousands of sound files from many species of whales, scientists now need to classify each call and song to get an understanding of each species’ repertoire. Once the calls and songs are sorted and classified, scientists can pursue interesting questions such as whether a whale’s song repertoire is related to its intelligence.

To classify the vocalizations, scientists are asking the public for help. On whale.fm, anyone (no expertise required) can sift through spectrograms and embedded sound files and match them to a template. It’s easy, fun and cool. Something that would take one person months or years to do can now be accomplished much faster by the public, in a fun format.
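Whale.fm has people do the matching by eye and ear, but the computation lurking underneath is simple enough to sketch. Here is a toy example in Python (synthetic signals and my own scoring choice, not whale.fm’s actual method): compute a spectrogram of the call and of the template, then score their similarity with a plain correlation.

```python
import numpy as np
from scipy.signal import spectrogram

def call_similarity(call, template, fs=22050):
    """Crude similarity score between two audio clips (a toy sketch).

    Computes a spectrogram of each clip and correlates the two after
    cropping both to a common time-frequency window.
    """
    _, _, s1 = spectrogram(call, fs=fs, nperseg=512)
    _, _, s2 = spectrogram(template, fs=fs, nperseg=512)
    n_freq = min(s1.shape[0], s2.shape[0])
    n_time = min(s1.shape[1], s2.shape[1])
    a = np.log(s1[:n_freq, :n_time] + 1e-12).ravel()
    b = np.log(s2[:n_freq, :n_time] + 1e-12).ravel()
    return np.corrcoef(a, b)[0, 1]

# Hypothetical clips: a noisy copy of a rising "whoop" vs. an unrelated tone.
fs = 22050
t = np.arange(0, 2.0, 1.0 / fs)
whoop = np.sin(2 * np.pi * (200.0 + 100.0 * t) * t)   # frequency sweeps upward
tone = np.sin(2 * np.pi * 800.0 * t)                  # constant-pitch tone
rng = np.random.default_rng(0)
print(call_similarity(whoop + 0.1 * rng.standard_normal(len(t)), whoop, fs))  # high
print(call_similarity(tone, whoop, fs))                                       # lower
```

Real calls would need alignment, noise handling and smarter features, which is exactly why human pattern-matchers are still so useful here.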

Previous efforts in scientific crowd-sourcing include FoldIt, a game in which people fold proteins based on simple rules (a task computers still cannot do well), and the search for new galaxies by amateur astronomers in images taken by the Hubble telescope. Perhaps this type of effort could also help the Connectome efforts to map the brain down to each synapse using electron microscopy, where every neurite in a cross-sectional image must be matched to itself in the adjacent images. Tracing axons across thousands of EM images could actually make a fun and productive game.