It’s great when brain science goes mainstream – understanding how our brains work goes a long way toward helping us live better lives.
As a culture, we’re paying more and more attention to the brain, wanting to understand it, talking about it. This morning I overheard a goofy morning radio show ask its “Question of the Day,” and the DJ told listeners to get their frontal lobes fired up. As a neuropsychologist (and self-acknowledged neuro-geek), I cheer at that sort of thing.
Overall, that’s good news. In a similar way, it’s good news that we casually use words and ideas Freud put into common awareness, like subconscious, anal retentive (or now, just being anal), being in denial, Oedipal complex, having a death wish, Freudian slips, and phallic symbols. (Speaking of which, since I work in Washington, DC: Ever taken a good Freudian look at the Washington Monument and the Jefferson Memorial?)
There’s a problem, though, when brain science goes bad. Or at least, when lots of people get it wrong.
For example, I quietly shouted to myself during the movie trailer for Lucy, a film based on the (wrong!) idea that humans use just 10% of their brains* (Lucy, the title character, has managed to put more of hers into use). Not true.
If you used only ten percent of your brain, you’d be dead. Or at best, comatose. Basically, all of your brain is working all of the time. So educational goals of getting people to use more of their brain’s real estate are misdirected. So is trying to make your brain better by practicing specific tasks, as some online “brain training” programs claim (see this letter from scientists at Stanford). It’s integration of the brain we’re after (the subject of a later post). An integrated brain is a better brain.
Another neuromyth: Teaching is more effective if we use the child’s best learning style. No. While we do have different learning preferences, like being a visual or an auditory learner, research shows there’s no benefit from being taught in “your” style. In fact, tailoring teaching to a child’s learning style may even be counterproductive; some of us learn better when we’re taught in a style we don’t prefer.
There’s also the myth that we should intensively stimulate babies’ brains during their first three years, to make the most of the “majority” of their brain development. After the age of three, this myth goes, the brain is pretty much fixed. This is how “Baby Einstein” and “Your Baby Can Read” duped so many well-intentioned parents.
More does not beget more when it comes to the brain (an idea mistakenly borrowed from economics). Yes, there are critical periods in the developing brain, but you’re not in a race to get as much stimulation or education as possible into your little one. Our brains continue to grow and develop, refining, pruning, and beefing up areas throughout our lives.
(And while I’m at it, what a child learns by free play — like jumping in puddles, as Neil deGrasse Tyson so beautifully put it to a six-year-old girl — is more helpful in brain development than any black-and-white mobile you hang over his or her crib, or flashcards for “reading” with three-year-olds.)
For me, as a psychologist working with adults, three lessons follow from these neuromyths:
- Use your brain in ways that help it be more integrated, not focusing on using “more” of it.
- Experiment with learning styles and experiences that are outside your “strong suit.”
- Make the most of your lifetime of neuroplasticity.
Until about fifteen years ago, neurology taught that your brain developed until about age 21, and it was all downhill from there. No. I’m glad to debunk that myth every day in the work I do with my patients. Lifelong neuroplasticity is now fully accepted in neuroscience, and I put that knowledge to use in every psychotherapy session I do — and I teach my colleagues about practicing “therapy with the brain in mind.”
* The “10% of our brain” neuromyth may have been traced to its origin by Sam McDougle, a PhD candidate at Princeton. According to a story aired on PRI:
While McDougle can’t pinpoint the exact origin of this myth, he says that it seems to have originated in Dale Carnegie’s 1936 self-help book, “How To Win Friends And Influence People.” The forward [sic], which was written by Lowell Thomas, misquotes a Harvard University professor who once said that humans have unused mental potential.
“The average person develops only 10 percent of his latent mental ability,” Thomas wrote.
My thanks to the soon-to-be-Dr. McDougle.