21 April 2012

The Top Five Brain Misunderstandings That Will Drive a Neuroscientist to the Brink

I was reminded by a comment on my last post that there are a lot of common misunderstandings out there about the brain.  So here they are, the top five TOTALLY RIDICULOUS things said about the brain that make me want to fly into a homicidal rage when I see them being propagated in popular media:


#5. To grow old is to grow senile.

This one is kind of personal because my research is on human aging.  My grand design when I was fifteen was to cure Alzheimer’s disease, so I’ve been cogitating on it for a while.  The truth is that there is a normal, healthy aging process, and there are a host of separate pathological aging processes that unfortunately tend to get lumped in with healthy aging as What Happens To You When You Get Old. 

Diseases like Alzheimer’s and Parkinson’s aren’t normal.  Dementia is not – I repeat NOT – something that happens to everyone.

This isn’t to say that healthy aging doesn’t involve some declines in certain cognitive functions.  First and foremost, the speed of information processing decreases (meaning older adults are simply slower to do things than they used to be, and we young things must cultivate some patience).  Older adults have also been reported to show declines in focused attention, meaning they can be more easily distracted.  Finally, older adults tend to show moderate declines in some aspects of memory – for example in remembering certain names, or where the keys have run off to, or coming up with the right word in a sentence.  Healthy aging is NOT, however, associated with the profound impairments of memory seen in diseases like Alzheimer’s, which at its worst robs people of the ability even to remember who they are.  See the difference?

Older adults also show improvements with age in other cognitive abilities.  Vocabulary and other crystallized knowledge (i.e. knowledge of facts) increase throughout the lifespan.  Empathy and emotional and social reasoning also come far more easily to older adults than to young adults.

We are a society that fears aging and death, so much so that even words like “old” and “elderly” have acquired a negative connotation.  And misunderstandings like #5 here serve only to help propagate this fearful, antagonistic sentiment toward older people.  News flash – unless you plan to jump off a bridge at age forty, you’re probably going to reach old age someday.  It’s a stage of your life, just like childhood or adolescence or middle age.  And, according to a lot of the people I’ve interviewed at my job, it can be totally awesome if you let it.


#4. You are born with all the brain cells you will ever have.

First off, I just want to tackle this notion of being born with all of anything.  In and of itself, that idea is kind of silly, because I think we all recognize that the version of you that existed a day before the miraculous moment you were born is just about the same as the version of you that existed the day after – minus that whole breathing and eating through your mouth instead of your bellybutton thing.  It’s not like your fetal body is busily building and building right up until you pop out of your mother’s vagina and then all of a sudden you’re in decay mode, you know?

So with that aside, I’m here to tell you that not only are you not done making neurons when you’re born – you’re not done when you hit adolescence, or adulthood, or even that dreaded old age.  You are probably done when you’re dead (but then again, that’s leaving aside the whole philosophical argument about exactly when that other mysterious life-capping moment actually happens and whether your cells still do their thing for a few hours after you’ve reportedly kicked the bucket).

The word we’re looking to investigate here is neurogenesis, or the creation of new neurons.  For a long time we couldn’t find any brain regions that continued to make new neurons on into adulthood, but this was mostly a problem of detection – we didn’t have the right tools to find neurogenesis, so of course we assumed it never happened.  Now that we can properly search for it, neurogenesis is cropping up everywhere – in our friend the hippocampus, the cerebellum, any number of cortical regions – and it appears to be a very important part of continued brain maintenance and function.  (Actually, there’s a really cool story here about hippocampal neurogenesis that was still sort of hand-wave-y last time I checked, but even my two fellow brain researchers fell asleep during the talk we went to about it, so I’ve elected not to share it with you fine people.  Count your blessings.)

Here are some fun facts about your brain growth after birth.  Your head is disproportionately big when you’re born, but it’s not as big as it’s going to be when you grow up (otherwise your mum would be complaining a hell of a lot more than she already is).  That increase in brain size from birth to adulthood comes with a slight increase in the number of neurons you have, but more importantly a large increase in the number of connections they make.  In childhood your neurons shoot out all sorts of projections all over the place, and then during adolescence your brain goes through this massive pruning binge to take out the connections that aren’t doing you any good.  Throughout your teenage years your neurons continue to gain efficiency as they get wrapped in layers of cell membrane called myelin, which lets them propagate their signals quickly and efficiently.  Therefore, your brain isn’t even fully developed until your early twenties.  You heard me right – your early twenties.  And boys, your brains take a couple years longer to get there than girls’ do.  Explains a lot, doesn’t it?

You can tell this is an old chart because the “cell birth” line (among others) doesn't keep on going out through adulthood.  Also, see how we’ve known for a long time that myelination and synaptic elimination and pruning continue into adulthood.

The upshot is that at this very moment, your brain is still making brand-new neurons.  It’s a very modest amount compared with the total number of neurons in your whole brain, true, but new neurons are still being born.  It’s happening right now.  And it’s going to keep on doing that for a very long time, hopefully.


#3. Smoking copious amounts of pot has no lasting detrimental effect on your brain.

Seriously?  Are you high?  First off, if you’ve ever met a chronic pot smoker, you already know this isn’t true.  You know it.  I shouldn’t even have to back this up with studies, but I’m going to anyway so we never have to have this argument again.

Here’s a very select smattering of the results from papers published since 2010 alone: (A) Chronic pot users perform more poorly on measures of executive function (that means planning, reasoning and decision-making) than non-users, and this effect is worse for those who started before age 16; (B) rats given cannabinoids in adolescence showed reversible impairments on many cognitive tasks but irreversible deficits in short-term memory measures; (C) chronic pot use has been associated with reduced hippocampal volume (although, to be fair, having skimmed that paper I have some concerns about the methods – for a better review of the effects of pot on brain metabolism and structure I encourage you to look at this paper); and (D) luckily, treatment for pot addiction with gabapentin significantly reduced pot use and increased performance on cognitive tests.  Hell, if you want a recent review of cognitive deficits associated with pot use, just read this paper.

People make similar arguments about alcohol, and I’m here to say that large enough quantities of alcohol will totally mess with your head over time.  (This one I have some personal experience in!  I swear I used to be quicker than I am now, and I’m blaming vodka-tonics.)  I won’t take you through a lit review here; I’ll just leave it at Korsakoff’s syndrome.  If you think that’s unfair, fine, I’ll tell you all about run-of-the-mill alcoholics’ impairments in perceptual-motor skills, visual-spatial function, learning and memory abilities, abstract reasoning and problem solving.  Don’t make me go there.

Please don’t try and justify your habits by saying they don’t affect you.  It’s a no-brainer (pardon the expression) that excessive amounts of intentionally mind-altering substances will, over time, affect the brain and its function.  I’m sorry, but that’s just the way it is.  Just embrace the fact that you’re willingly damaging yourself and move on.


#2. Pretty much anything movies ever say about brain disorders and treatments.

As a writer, I struggle with convincingly portraying anything about which I know absolutely nothing.  I understand the plight.  But let me give you three of my favorite examples that will let you know just how disastrously misguided most movies and shows (and news outlets) are about the brain:

(A) One episode of Boston Legal starts with William Shatner’s character animatedly yammering away in an MRI machine spouting off baseball statistics for a pair of scientists who are quizzing him while watching “brain activation” blobs flit across a still image of his brain.  The scene cuts to a doctor’s office where the neurologist (?) informs him he has Alzheimer’s disease.

PROBLEMS (well, some of them): You cannot move more than a few millimeters in an MRI scanner or the image will be totally messed up.  No one watches brain activation patterns in real time, because (a) it’s not feasible and (b) some guy eyeballing a bunch of blobs isn’t remotely as valid as running actual statistics on the data.  THIS IS NOT THE WAY YOU DIAGNOSE ALZHEIMER’S, and it’s a disgrace to imply that it is.  Using MRI to detect Alzheimer’s is only now becoming a realistic possibility; it hasn’t reached clinicians yet, and the focus there is on brain structure, not brain function.  Semantic information (like baseball statistics) is one of the few things that’s preserved in the early-to-moderate stages of Alzheimer’s.  And you would never, ever tell someone, “the tests show you have Alzheimer’s disease,” because Alzheimer’s can’t actually be definitively diagnosed until autopsy (while you’re still alive you really have probable Alzheimer’s), and a good doctor would break that news more gently, with a caregiver present to help the patient take in what they’re being told.

(B) In my favorite example from House, House has been in a bad situation he can’t remember because he was blitzed out on substances at the time.  He gets his buddy to perform deep brain stimulation to jog his memory.  The first jolt allows him to see a fuzzy, silent, black-and-white still image or two, so he says to crank up the juice – and then suddenly he can see the full memory playing like it’s a proper modern movie.

PROBLEMS: I don’t even know where to start.  The House show always pulls out a single term like deep brain stimulation (DBS) and makes a total mockery of it, because apparently not a single writer on that show has ever even taken a single class in medical school.  So to summarize a bare minimum of points – first off, your brain simply does not behave in this way when you electrify it.  When electrically stimulated, you will probably perceive some anomalous sensations, but you will not replay the one memory you’re looking for, and it will not be a 1920s silent film at low voltage and a modern movie at high voltage.  Also, memories do not ever play back perfectly like that for you anyway, and you all know that.  Finally, DBS would never be used for such a purpose – mostly it’s an extreme therapeutic option for people with Parkinson’s disease, severe depression, and certain other disorders, and no one has a good handle on why it works.

(C) Any movie in which “getting amnesia” means “losing all knowledge of yourself and sense of who you are.”

PROBLEMS: This phenomenon of losing yourself is real, and is called a dissociative fugue.  It is psychological in nature and usually occurs in response to a deeply traumatic situation (not a blow to the head).  It’s also transient, lasting a few days or weeks at most.  Amnesia is a totally different thing, best characterized by the movie Memento.  It can be caused by head injury, surgical resection of temporal lobe areas, oxygen starvation, and a number of other insults.  When a neuroscientist (or even Wikipedia) talks about amnesia, by default they mean anterograde (forward-looking) amnesia, in which patients cannot form new memories (remember HM…?).  These patients often, but not always, also show retrograde (backward-looking) amnesia, meaning that they cannot remember past events.  Amnesia is not transient.  These patients will live their whole lives in never-ending cycles of few-minute increments.  Also, they will never forget facts about themselves like their name and birth date and where they grew up, because this is a different type of memory unaffected by amnesia.  This whole movie problem is really one of semantics – how freaking hard would it have been for that first major movie exec to do a little research and say, “Hey, this isn’t amnesia, it’s a fugue – let’s say this guy has a fugue instead”?  They could have avoided decades of ridiculous misunderstanding! 

And let me just make this clear: You will never – under any circumstances – get hit on the head, lose all memory of yourself and your past, and then regain it miraculously a few weeks later.  Never.

(Update: Dammit, okay, I thought of a circumstance.  If the blow to the head is part of a deeply traumatic situation, you could enter a fugue state - but that's not due to the head bump! My point stands.)


#1. You only use 10% of your brain – imagine how much smarter you would be if you used 100%!

You know what happens when you use 100% of your brain?  EPILEPSY.  Freaking epilepsy.  That’s the definition of epilepsy.  If you want to have a grand mal seizure, go right ahead and “use 100% of your brain” all at once.

This “10%” notion gained credence because when we first started imaging human brains (using methods like PET and functional MRI), researchers found that the little colored blobs lighting up during performance of any given task covered only a small portion of the total brain.  Why?  Because different parts of our brains do different things.  During verbal tasks, language areas light up; during reasoning tasks, reasoning areas light up… you get my point.

Suggesting that we should find a way to use 100% of our brains is like suggesting we should use 100% of our cars when we drive them.  It would obviously be totally efficient if your car drove forward and backward at the same time in all gears while also honking the horn and wiping the windshield and flashing the lights and blaring every possible radio station and opening and shutting the doors.  That’s clearly the best way to get to your local grocery store.  

Our brains are, in some ways, no more than a conglomeration of specialized little parts which each do their own mental tasks.  At any one time, you “use” about 10% of your brain to—

No, you know what?  I can’t even say that oversimplification in good conscience.  This whole “10%” thing is 100% bullshit.  What does that even mean, “using 10% of your brain”?  I dare anyone to quantify with our current methods (A) how much of our (B) brains we (C) use at any one moment.  All three of those points need a better definition before you even start down that road, and anyone who takes the first step on it is asking exactly the wrong question based on a host of incorrect assumptions – for one, that the brain is a singular computational entity operating at X capacity like a damn CPU.  The idea is utter crap on every possible level.
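
(A little aside for the quantitatively inclined: below is a toy Python sketch – made-up numbers, nothing remotely like a real imaging analysis – of why “percent of brain in use” isn’t even a well-defined measurement.  If you simulate some voxel-wise activation statistics and count how many voxels clear a threshold, the answer depends entirely on which arbitrary cutoff you pick.)

    import numpy as np

    rng = np.random.default_rng(0)

    n_voxels = 100_000                         # pretend whole-brain voxel count (made up)
    t_values = rng.normal(0.0, 1.0, n_voxels)  # fake "task vs. rest" statistics: mostly noise
    t_values[:5_000] += 3.0                    # pretend 5% of voxels carry a genuine task effect

    # The "percent of brain in use" you'd report depends entirely on the cutoff you choose.
    for threshold in (2.0, 3.0, 4.0):
        active_fraction = (t_values > threshold).mean()
        print(f"t > {threshold}: {active_fraction:.1%} of voxels 'active'")

Run that and you get three very different “percentages” out of the exact same fake brain, which is the whole point.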

Actually, that CPU bit brings me to an important historical point – the brain has always been compared, by default, to whatever technology is hottest at the moment.  In this day and age, we equate it part and parcel with a computer.


It’s such a handy metaphor, right?  Like they were made for each other.  You know what people said your brain was like in the old days?  A switchboard.

Extra points to the kids who know what a switchboard is.

We don’t have a good sense for what the brain is really doing, so we associate it with metaphors and then generate silly notions based on what we know about those metaphorical devices.  The point is this: Our brains are well optimized just the way they are, using different bits for different jobs.  Just know that you do NOT want all your neurons to fire all at once.  You will immediately die.

4 comments:

  1. Ahahahahah! "Suggesting that we should find a way to use 100% of our brains is like suggesting we should use 100% of our cars when we drive them." I LOVE this comparison.

    As much as I enjoy watching House (most seasons), the incorrectly applied/used medical terminology also drives me bonkers. NCIS is even worse.

    Replies
    1. And CSI. Though again, NCIS is worse – I can't watch either of them without getting aggravated by something they're doing TOTALLY WRONG with a mass spectrometer or some spectrum or another. House I watch in blissful ignorance, however.

  2. Okay, so I knew that brains didn't fully mature until the mid 20s because of all the connections, but I honestly had no idea that neurogenesis could occur in adulthood! Is that a new-ish discovery? I only took a couple basic neuroscience courses a year or two ago, and my professors pretty much went with the idea that once you're past childhood, you make no new neurons, it's just the connections that increase.

    Replies
    1. It's pretty new - last five or ten years it's been getting big. It's not well understood yet what its primary function is.
