The Noose Tightens: Scientific Standards Being Raised

For those of you hoping to fly under the radar of reviewers and get your questionable studies published, I suggest that you do so with a quickness. A new editorial in Nature Neuroscience outlines the journal's updated criteria for methods reporting, which removes the length limit on the methods section of papers, mandates reporting the data used to create figures, and requires statements on randomization and blinding. In addition, the editorial board takes a swipe at the current level of statistical proficiency in biology, asserting that

Too many biologists [and neuroscientists] still do not receive adequate training in statistics and other quantitative aspects of their subject. Mentoring of young scientists on matters of rigor and transparency is inconsistent at best. In academia, the ever-increasing pressures to publish and obtain the next level of funding provide little incentive to pursue and publish studies that contradict or confirm previously published results. Those who would put effort into documenting the validity or irreproducibility of a published piece of work have little prospect of seeing their efforts valued by journals and funders; meanwhile, funding and efforts are wasted on false assumptions.

What the editors are trying to say, I think, is that a significant number of academics, and particularly graduate students, are the most lazy, benighted, pernicious race of little odious vermin that nature ever suffered to crawl upon the surface of the earth; to which I might add: This is quite true, but although we may be shiftless, entitled, disgusting vermin, it is more accurate to say that we are shiftless, entitled, disgusting vermin who simply do not know where to start. While many of us learn the basics of statistics sometime during college, much of it is not retained, and remedial graduate courses do little to prepare one for understanding the details and nuances of experimental design that can influence the choice of statistics that one uses. One may argue that the onus is on the individual to teach himself what he needs to know in order to understand the papers that he reads, and to become informed enough to design and write up a study at the level of the journal for which he aims; however, this implies an unrealistic expectation of self-reliance and tenacity for today's average graduate student. Clearly, blame must be assigned: The statisticians have failed us.

Another disturbing trend in the literature is a recent rash of papers encouraging studies to include more subjects, to aid both statistical reliability and experimental reproducibility. Two articles in the latest issue of NeuroImage - one by Michael Ingre, the other by Lindquist et al - as well as a recent Nature Neuroscience article by Button et al, take Karl Friston's 2012 "Ten Ironic Rules" article out to the woodshed, arguing that small sample sizes are more susceptible to false positives, and that larger samples should instead be recruited and effect sizes reported. More to the point, the underpowered studies that do get published tend to be biased toward finding effects that are inordinately large, as null effects simply go unreported.
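The mechanics behind that bias are easy to demonstrate. Here is a toy simulation - my own sketch, not taken from any of the papers above, with an invented true effect size - showing that when only significant results see print, small studies systematically overstate the true effect:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_d = 0.3          # a modest true effect size (Cohen's d); invented for illustration
n_studies = 5000      # number of hypothetical labs running the same experiment

def significant_effects(n):
    """Run many one-sample t-tests with n subjects each;
    return the observed effect sizes of the 'publishable' (p < .05) studies."""
    samples = rng.normal(true_d, 1.0, size=(n_studies, n))
    means = samples.mean(axis=1)
    sds = samples.std(axis=1, ddof=1)
    t = means / (sds / np.sqrt(n))
    p = 2 * stats.t.sf(np.abs(t), df=n - 1)
    observed_d = means / sds
    return observed_d[p < 0.05]

for n in (10, 100):
    d_sig = significant_effects(n)
    print(f"n={n:3d}: {len(d_sig)/n_studies:.0%} of studies significant, "
          f"mean published d = {d_sig.mean():.2f} (true d = {true_d})")
```

With ten subjects, only effects far larger than the truth clear the significance bar, so the published literature averages an inflated estimate; with a hundred subjects, the published estimate sits close to the true value.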

All of this is quite unnerving to the small-sample researcher, and I advise him to crank out as many of his underpowered studies as he can before larger sample sizes become the new normal, and one of the checklist criteria for any high-impact journal. For any new experiments, of course, recruit large sample sizes, and when reviewing, punish those who use smaller sample sizes, using the reasons outlined above; for then you will have still published your earlier results, but manage to remain on the right side of history. To some, this may smack of Tartufferie; I merely advise you to act in your best interests.

CNS 2013 Review

Last weekend marked my second attendance of the Cognitive Neuroscience Society conference, and I had a terrific time, each night full of drinking, wenching, gaming, brawling, dancing, freethinking, casuistry, and innumerable other vices. I dined on the best seafood that San Francisco had to offer, devouring breadbowls of clam chowder and pots of caramelized catfish and platters of sushi. I witnessed fights between sea lions, observed the Golden Gate bridge expand and contract in proportion to its temperature, and toured the Ghirardelli chocolate factory complex. Having access to a television for the first time in months, I watched the last half of S.W.A.T. and the first half of Face/Off, which, taken together, made for a satisfying, full-length action movie.

However, I also managed to find the time to go to some talks and posters detailing the latest findings in my field. A few trends I noticed:

1. Development and aging are hot right now, particularly since a large segment of the population is approaching old age and beginning to experience the effects of dementia, senescence, and increased irritability at perceived injustices, such as when your children fail to call you when they say that they will. I saw several posters looking at cognitive control effects over time, and how different interventions affected measures of executive function; and since the baby boomers are funding a large part of this research, the importance of this field will continue to grow in proportion to their collective terror in the face of aging and its associated infirmities, creeping maladies seen from a distance yet unstoppable, as a man bound to a stake in the middle of a desert might feel as he is approached by irate wildlife.

2. Cognitive strategies such as mindfulness meditation and reappraisal are also hot right now; and although they might seem a bit faddish, the evidence of their efficacy is compelling. Expect to see more of these and their ilk increasingly applied across a wider variety of pathologies, such as depression, chronic pain, tinnitus, and addiction.


Lastly, while I was supping at the Ghirardelli chocolate factory with a postdoc, he mentioned that he was miffed by the lack of theory-driven experiments in several of the posters he saw. That is to say, several posters would lead in with a statement such as "Much research has been done on topic X. However, relatively little is known about Y...", with an experiment devoted to the effects of Y. In my colleague's opinion, this leads to a broadening of the field without refining or testing any of the existing theories. Indeed, if any think his fears to be unfounded, here I reprint an abstract seen at the conference which would appear to support his apprehensions about unfocused research endeavors:


Title: The neural correlates of pooping
Author: Will Brown, M.D., Ph.D.
Abstract: Pooping is an adaptive evolutionary behavior observed to occur across a wide range of species, including dogs, birds, armadillos, and humans, with evolutionary psychologists believing it to serve as a biomarker for fitness and reproductive success. Much research has been done on pooping, but to our knowledge this has not yet been systematically examined using FMRI. In our first study we scanned 28 participants while pooping. Robust activation was observed in bilateral pre-SMA, dorsal ACC, and bilateral insula, which we have dubbed the "pooping network". The second study scanned the same participants while they watched videos of other humans or robots either reading or pooping, creating a 2x2 factorial design. All poops were controlled for size, pungency, luminance, texture, and nuttiness, using the Jeff Goldblum Excrement Control Scale. The contrast of HumanPoops - RobotPoops was associated with activity in the superior temporal sulcus, consistent with this region's role in processing socially relevant actions. By contrast, a main effect of observing poops collapsed across humans and robots led to increased activation in the inferior frontal gyrus, premotor cortex, and parietal cortex, a finding similar to other studies investigating mirror neurons, suggesting that mirror neurons may be essential for helping organisms learn how to poop. These results better inform our understanding of pooping, and may lead to mindfulness meditation and cognitive reappraisal treatments for pooping-related disorders, such as constipation, irritable bowel syndrome, and explosive diarrhea.


Clearly, some restraint is necessary when deciding what experiments to carry out, as there are an infinite number of questions to study, but which must, from time to time, be bound together under cohesive theories.

Overall, I had a great time, and am looking forward to CNS 2014!

CNS 2013: San Francisco



Readers and fellow brain bloggers,

I will be presenting a poster at CNS this Monday from 8:00am-11:00am, session D, poster 57. Featuring electrical shocks, a spatial Stroop task, passion, love, and good and evil, these FMRI results are so hot they would defy the pen of a Sappho to eroticize them. Feel free to stop by and say hi!


As a teaser/foreplay, here is a picture of one of my results:


Stats Videos

Over the past semester, I have been using online tutorials as a supplement for my Introductory Statistics course - partly to share my knowledge with the rest of the world, and partly to deflect any questions from students by asking "Well, did you watch the video yet?" While there is the possibility that such easy access to statistical concepts may reinforce truancy, absenteeism, and sloth, as well as fostering society's increasingly pathological dependence on technology, I try to be a bit more sanguine, thinking that it will serve as a useful tool for learning and memorization.

One reason behind making these is to make sure that I actually understand the concepts; another reason is to provide a reusable source of statistics material for students who do not fully grasp everything in one class sitting (and who does?); and yet another reason is to show off my wardrobe. (Hint: I rotate about two, maybe three shirts. You can see where my priorities are.)


Tonight's Entertainment

For those of you in the Bloomington area, I'll be accompanying a senior cello recital at the Jacobs School of Music Recital Hall. The program includes, among other pieces, the Strauss Cello Sonata. This piece is rarely recorded or performed, due to some tricky passagework for both instruments and a notoriously difficult piano part (see, for example, the stormy middle section in the following video starting at about 3:10). But, I also happen to be kind of stupid, so I play stuff like this regardless. No fear, baby!


Where: Recital Hall, Jacobs School of Music
When: Saturday, March 23rd, 10:00pm


==============Program==============

J. S. Bach: Suite No. 5 for Unaccompanied Cello in C Minor, BWV 1011

Gaspar Cassadó: Suite for Solo Cello


Richard Strauss: Sonata for Cello and Piano in F Major, Op. 6

     I. Allegro con brio
     II. Andante ma non troppo
     III. Allegro vivo




Video Games Can Increase Cognitive Ability, Make You Dangerous

As an adult, when I look back on my childhood and consider the ungodly number of hours I put into video games - Command & Conquer, Counter-Strike, Diablo II, Starcraft, Halo, Legend of Zelda, Tetris, Space Quest, Myst, Civilization, just to name a few - I shake my head in disbelief at how much time I frittered away. If I had spent half that amount of time in the gym, for example, I would have been one buff mamma jamma. Instead, all I have to show for it are several deeply ingrained but practically useless motor reflexes - such as the ability to buy an AWP and two flashbang grenades along with kevlar (but no helmet) in record time - and perfect recall of lines of dialogue from Metal Gear Solid and the finer plot points in Final Fantasy VII. In addition, I also developed a thick skin in response to the flood of insults, invectives, put-downs, and unending verbal abuse from other online players with names like AznMaGiC and Legali$e_iT. At the time, all of this seemed incredibly important; now, not so much.

However, a recent study suggests that this may not have been an entire waste, and that video games relevant to specific cognitive processes, such as attention, spatial working memory, and decision-making, can actually improve these functions and show crossover to different cognitive domains.

A research group from Singapore recruited a sample of non-gamers - people I once would have looked down upon with contempt - and had them play a variety of different games, such as Bejeweled, Hidden Expedition, and The Sims (remember that?). The participants were tested on a battery of tasks tapping into abilities such as working memory and filtering out distracting stimuli. After completing these measures, subjects were randomly assigned to play one of five games (for a total of five groups), and to play that game for an hour a day over the course of four weeks. The subjects were then retested on the same tasks as before.

Participants improved markedly on those tasks most related to the game that they played - for example, those who spent time playing a memory matrix game showed significant improvement in the working-memory task. However, there was some cross-over between tasks as well, such as action-game playing associated with both improved filtering of task-irrelevant stimuli and increased ability to track multiple objects. Overall, the findings corroborated other studies focused on habitual gamers who primarily played first-person shooters, but showed that these improvements could be extended to non-gamers and to non-violent games. Taken together, this suggests that daily video game practice can both improve your cognitive abilities, and also make you a more efficient hunter of the most dangerous prey of all - man.

So, should we all start shelling out more money for video games and less money for books, poetry, and exercise equipment, such as whiffle balls? Not quite. First, there was no mention of how long the effects lasted - whether it was just for the post-test, or whether the effects could last for weeks or months. Second, although there was a considerable interval between the pre-test and post-test (one month), since this was a repeated-measures design, there may have been some carryover effect; i.e., an effect of practice from the pre-test. Although there were significant differences between groups, it is unclear how these were affected by the testing before the training period. Lastly, there was no mention of how playing an hour of video games a day affected other aspects of life, such as proportionally less time devoted to exercising, social interaction, and your girlfriend getting pissed off that you forgot to pick her up for your two-year anniversary date because you got so caught up in one of those three-hour Metal Gear Solid cutscenes that time seemed to stand still. Chicks, they'll never understand. (Guys, amiright?)
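For the curious, the carryover worry can be made concrete with a toy difference-in-differences sketch - all numbers here are invented, not from the study. A naive pre/post comparison in the training group bundles the practice effect in with the training effect, whereas subtracting the control group's gain removes it:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 30            # subjects per group (hypothetical)
practice = 5.0    # points everyone gains just from taking the test a second time
training = 8.0    # additional gain from the game itself (invented for illustration)

pre_train  = rng.normal(100, 10, n)
pre_ctrl   = rng.normal(100, 10, n)
post_train = pre_train + practice + training + rng.normal(0, 3, n)
post_ctrl  = pre_ctrl  + practice            + rng.normal(0, 3, n)

# A naive pre/post change in the training group conflates practice with training...
naive_gain = (post_train - pre_train).mean()
# ...while the difference-in-differences against the control group removes the
# shared practice effect, leaving an estimate of the training effect alone.
did = (post_train - pre_train).mean() - (post_ctrl - pre_ctrl).mean()
print(f"naive gain: {naive_gain:.1f}, gain relative to control: {did:.1f}")
```

This is exactly why the control group matters: both groups sat the pre-test, so any practice effect washes out of the between-group comparison, even though it inflates each group's raw improvement.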

The paper can be found here. And, in case you were wondering, deals for the latest Starcraft II expansion can be found here.



Birthday Post

A strange thing happened to me in my dream. I was transported to the Seventh Heaven. There sat all the gods assembled. As a special dispensation I was granted the favor to have one wish. "Do you wish for youth," said Mercury, "or for beauty, or power, or a long life; or do you wish for the most beautiful woman, or any other of the many fine things we have in our treasure trove? Choose, but only one thing!" For a moment I was at a loss. Then I addressed the gods thusly: "Most honorable contemporaries, I choose one thing — that I may always have laughter on my side." Not one god made answer, but all began to laugh. From this I concluded that my wish had been granted and thought that the gods knew how to express themselves with good taste; for it would surely have been inappropriate to answer gravely: Your wish has been granted.

Solid Plan: Study on Ethics Is Badly, Even Laughably, Plagiarized



RetractionWatch, a website dedicated to covering journal retractions, plagiarism, and all other types of skullduggery and chicanery, recently covered a plagiarism incident involving a study about morality. Ironic? Undoubtedly. But the reason we can all enjoy a good chuckle over it is that it was abysmally executed, resembling a botched copy job eerily reminiscent of the Chinese-English translation of Revenge of the Sith. Observe, for instance, the opening sentence of both papers:

The original Nosek et al abstract:
Moral dilemmas pitting concerns about actions against concerns about consequences have been used by philosophers and psychologists to gauge “universal” moral intuitions.
Compared against the plagiarized version:
Ethical enigma kernelling concerns about actions against concerns about consequences have been dealt by philosophers and psychologists to measure “universal” moral intuitions.
Note the quotation marks around "universal", suggesting that there are no real universal moral standards, and that we should instead get used to living in a fluffy, wishy-washy moral universe where nothing is really good or bad, just different - which, in my opinion, is merely a way to shut out the voice of conscience, and exonerate people like your roommate who ate the last of the hotpockets in the freezer without even asking. No wonder society is on the decline.

Anyhow, things went downhill from there, with an almost word-for-word lifting in the results section, and the convoluted title: Political Dogma Stroll’s Non-Political Moral Decision-Making Processes – A Quantitative Analysis of Ideological Decision-Making of Liberals and Conservatives in the Western Europe.

I don't know what a "Political Dogma Stroll" is, but dibs on the band name.


Thanks to Jay Van Bavel, who once hit two home runs in slow-pitch softball - going the other way

Are Scientists Self-Critical Enough?



There is a widespread misconception among the American public that scientists and academics in general are smug, self-satisfied, abrasive, entitled, obnoxious, over-opinionated weenies who, for all of their competence in their field of study, can be surprisingly out of touch with ordinary realities, such as the basics of personal hygiene, how not to be socially awkward, or how to calculate a tip after dining out; and, furthermore, that their hyper-specialized environment fosters a profound ignorance of anything beyond their ken, which inevitably leads to an atrophy of the non-specialized mind, a loss that may be leading us to disaster. (Contrast this with Goethe, one of several intellectuals during the Enlightenment era who had good reason to believe his scientific accomplishments would be better remembered than his literary ones.) Despite these criticisms, I assure you that nothing could be further from the truth; that although academics have their fair share of shortcomings and peccadilloes, on the whole they are the kindest, bravest, warmest, most wonderful human beings I have ever known.

However, it is clear that we still need a healthy dose of critical self-evaluation every now and then. A recent opinion piece by Colin Macilwain in Nature argues that self-criticism is necessary for the healthy operation of any industry, in order to bring out the bad as well as the good. For example, although federally funded research within the United States has skyrocketed in recent years, there has been little systematic investigation into the effects of this spending; in short, into whether Americans are actually getting a return on their dollar. This has parallels with the amount of education spending within the past few decades, which, although mind-bogglingly high, has not led to proportional increases in standardized test scores or improvements in rankings compared to countries that spend less. The argument that it would be worse otherwise is tenuous at best.

So, is a good, hard look at ourselves really what we need? We might take an example from Wordsworth, whose wellsprings of self-criticism appeared to dry up halfway through his career, and who spent the remaining few decades writing verse he never should have started, the vast collection of his later work serving as a sort of monument to decayed genius. A recurrent theme with either individuals or societies faced with an embarrassment of riches and the unceasing flattery of their colleagues is a failure to occasionally step back and objectively evaluate oneself, which eventually leads to complacency, and, finally, decadence.

Some may attribute this lack of self-criticism to the recent homogenization of opinion within academia, which at one point was supposed to be a marketplace of ideas - no matter how eccentric - but where recently opinions on everything from religion to politics have become remarkably predictable, with minuscule variations in opinion, always in degree but never in kind, serving as a kind of ersatz dissent where one can still feel the thrill and righteousness of disagreement with none of the discomfort that comes with having one's worldview fundamentally challenged. Here I must disagree: since we have committed to making steady progress toward truth throughout our species' history, it is necessary to discard incorrect opinions and theories as they come up, and so we must feel no compunction (ethically speaking, of course) about extirpating, smothering, stifling, and suffocating contrary views which would seek to undermine opinions that have been empirically validated and are widely held by the majority of intelligent individuals. Mill's views on this were, I am afraid, sorely misguided.

One Weird Trick to Boost Brain Connectivity

Answer: Take an LSAT course.

At least, that seems to contribute to changes in resting-state functional connectivity between distinct brain regions, according to a new study in the Journal of Neuroscience by Mackey et al (2013). The researchers recruited two groups of pre-law students: a training group, which was taking an LSAT prep course, and a control group, which intended to take the LSAT but was not enrolled in the prep course. After matching the subjects on demographics and intelligence measures, functional connectivity was measured during a resting-state scan (which, if you remember from a previous post, is a measure of the correlation between the timecourses of different regions, rather than of physical connectivity per se).

Taking the LSAT prep course was associated with increased correlations between the rostro-lateral prefrontal cortex (RLPFC; a few centimeters inside your skull if you take your finger and press it against your forehead just to the lateral side of your eye) and regions of parietal cortex (located more to the rear of your skull, slightly forward and lateral of the bump in the back of your head). The RLPFC seems to help integrate abstract relations, such as detecting flaws in your spouse's arguments, while the parietal cortex processes individual relations between items. Overall, when they combine forces, as shown by a concomitant increase in functional connectivity and test scores, your LSAT skills become unstoppable.
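For readers wondering what "functional connectivity" actually is, computationally: in this resting-state sense it is nothing more than the Pearson correlation between two regions' timecourses. Here is a minimal sketch with simulated signals - the region labels are just tags for the toy example, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)
n_trs = 200    # number of timepoints (TRs) in the simulated resting-state run

# Simulated BOLD timecourses: two "coupled" regions share a common fluctuation
# plus region-specific noise; a third region fluctuates independently.
shared   = rng.normal(0, 1, n_trs)
rlpfc    = shared + rng.normal(0, 1, n_trs)
parietal = shared + rng.normal(0, 1, n_trs)
visual   = rng.normal(0, 1, n_trs)          # unrelated control region

def connectivity(a, b):
    """Functional connectivity = Pearson correlation of the two timecourses."""
    return np.corrcoef(a, b)[0, 1]

print(f"RLPFC-parietal connectivity: {connectivity(rlpfc, parietal):+.2f}")
print(f"RLPFC-visual connectivity:   {connectivity(rlpfc, visual):+.2f}")
```

The coupled pair shows a robustly positive correlation while the unrelated pair hovers near zero; an increase in the former after training is the kind of effect the study reports.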



The parietal cortices and striatal regions, particularly the caudate and putamen nuclei, showed stronger coupling as a result of taking the prep course, presumably because of the strong dopaminergic inputs to the striatum, which emit bursts of pleasure whenever you make a breakthrough in reasoning or learning. This should come as no surprise to classical scholars, as Aristotle once observed that the two greatest peaks of human pleasure are 1) thinking, and 2) hanky-panky. (Or something like that.)

Taken to the extreme, these results suggest efficient ways to manufacture super-lawyers, or at least to strengthen connectivity between disparate regions and alter resting-state dynamics. This touches on the concept of neuroplasticity, which holds that our brains remain adaptive and malleable throughout life, as opposed to traditional views that cognitive stability and capacity plateau sometime in early adulthood, and from there make a slow decline to the grave. Instead, regularly engaging your thinking muscles and trying new cognitive tasks, such as mathematics, music, and fingerpainting, as well as grappling with some of the finest and most profound philosophical minds humanity has produced - Kant, Kierkegaard, Hegel, Nietzsche, Andy's Brain Blog, et alia - will continue to change and transmogrify your brain in ways unimaginable and calamitous beyond reckoning.


Thanks to Simon Parker. (LSAT professors hate him!)