30 April 2012

Another Five Abominable Brain Myths

The day after I posted my first list of five brain misunderstandings that make me cringe, I remembered another cringe-worthy myth and have therefore been forced to compile an additional list.  For the record, #1 on this list really should have been #2 on the Top Five…


#5. Drugs will put holes in your brain.

Sure, hardcore drugs can kill you, but they don’t do it by putting holes in your brain.  I’m pretty sure D.A.R.E. is primarily responsible for spreading this myth, at least according to a few of my fellow nineties kids. 

First, let’s tackle the bit about what a “hole” is.  Generally speaking, the fastest (and only) way to get an actual hole into your brain is with a bullet or a tamping iron or something (see, for instance, Phineas Gage).  Otherwise, what one might call a “hole” is most often really a region of damaged brain which, while damaged and nonfunctional, is still packed with all sorts of fluids and tissue and whatnot. 

Your brain is a very densely-packed organ full of cells, bathed in a fluid called cerebrospinal fluid, encased in a skull.  If you endure damage to the brain that doesn’t open up the skull in the process, then the damaged brain regions are still going to be filled with all that kind of stuff.  If you happen to see an MRI of such a lesion (say, from a stroke or a tumor or something) it might look like a dark space in an otherwise bright brain, but all that means is that the water moving around in that region is not neatly organized like it is in the rest of the brain.

So keeping that caveat in mind, not too many drugs even damage your brain in a way that would create proper lesions like the kind described above.  In fact, I couldn’t come up with one.  Let me give you a list of popular drugs that certainly don’t put holes in your brain even when abused: heroin, cocaine, meth, pot, alcohol, ecstasy, LSD, prescription pills, mescaline, bath salts, roofies… I think I’m starting to stretch it here.  Some of these will mess you up in all sorts of ways, but unless they give you a stroke they’re not causing a brain lesion that would look like a hole to anyone.

Still, kids, just say no to drugs.


#4. Your brain is some other-entity that is separate from you.

I’m going to admit I managed to confuse myself no end trying to figure out what I want to say here, so bear with me.

I’m a hundred percent guilty of this idea of an other-entity brain.  I do it all the time in these very posts, suggesting that your brain is something different from you that does its own thing and occasionally gets up to mischief.  It’s called anthropomorphizing, assigning human characteristics to things that aren’t human. 

Anthropomorphism is an easy white lie that allows us to speak simply about complicated processes.  Your DNA wants to replicate itself, your brain decides things, and so forth.  Assigning wants and needs and decisions to probabilistic biological processes is a completely inaccurate representation of what’s really happening.  And I don’t have a problem with it, so long as it’s recognized as a rhetorical device that simplifies a conversation.  But it’s a problem if you see it as the whole picture.

Anthropomorphizing the brain is especially easy, because while a brain is simultaneously just an organ, a big glob of mush inside a person’s head, it is also the very essence of what makes that person that person.  I want to avoid getting into arguments about a soul and whatnot – that’s not what I mean.  I mean that it is easy to think of your brain as “you”, and also easy to see it as only a “thing”, and that makes talking about it difficult.  And complicating this matter further is that pesky word “mind”, which is somehow different still from a “brain” and falls somewhere else on this “you” versus “thing” spectrum.

The concept of “self” would take me all year and a few hundred pages to tap into, so all I’m going to say is that trying to determine who “you” are is a real bitch no matter how you cut it, and your “self” is an ever-changing, many-headed beast that is exceedingly difficult – if not impossible – to define.  And the brain is a vital facet of it.  A brain is nothing more than a bundle of neurons and synapses and electrical impulses, and it is also the material substrate of the emergent Self.  Just like electrons with their particle and wave properties, both the mundane biological Brain and the lofty cognitive Mind have to be thought of as two aspects of a singular whole, not discrete entities.  It’s an ugly and difficult undertaking.

As humans we need to have agents separate from ourselves to explain certain of our actions, like addictions (“I try so hard to stop but my brain just won’t let me”).  Your brain has to be that cognizant little scapegoat, and it has to be something separate from you.  And it really feels that way, too.  You feel like there’s some ugly little demon sitting inside your head telling you to do bad things.  It’s exactly that fabled devil on your shoulder.  You can have that cognitive dissonance, and I wouldn’t dream of taking it from you.

Here’s the thing.  Your brain isn’t separate from your mind or yourself – it’s all one big package.  At the same time, your brain is not a conscious entity.  Believing that your brain wants or needs or decides, that’s incorrect.  Your brain is a well-organized chemical soup that operates according to certain biological principles, and from that soup your glorious conscious Self emerges.  So you can trust that when I say your brain wants something, I’m only doing it to simplify a point.


#3. The brain is a muscle.

I don’t know whether people saying this mean “the brain is literally a muscle” or “the brain is like a muscle in that the harder you work it, the bigger/better it gets,” but I’m going to shoot down the entirety of the former and a major assumption of the latter.

First off – and I think this one is obvious – the brain is not literally an actual muscle.  Not in any way.  They’re made of wildly different tissue types and everything.

The second statement, that the brain is like a muscle, comes with a critical caveat.  In some ways, the brain is like a muscle, in that you have to use it to keep it strong.  But no matter how hard people try to sell you their guaranteed fitness regimen to get you ready for the Brain Olympics or whatever, the unfortunate fact is that the vast majority of it is total bullshit designed to take your money.

I want to be careful what I say because it’s very important not to throw the baby out with the bath water, and I don’t want you to give up on those crosswords just yet.

Here are some things we know about the benefits of mental exercise.  People with higher education tend to fare better cognitively as they age.  So do people with mentally-challenging occupations.  This could be because those things help people get stronger brains, or it could be because people with strong brains tend to get higher education and mentally-challenging occupations.  Also, training people how to do various mental tricks can have long-lasting beneficial results (that’s the whole definition of learning, right?).

Here are some things we know about the limitations of mental exercise.  Generally speaking, a lot of the things we do to hone our mental abilities – crosswords, puzzles, list-learning, et cetera – just make us very good at tasks like crosswords, puzzles, list-learning, et cetera.  In other words, many mental exercises don’t generalize very well across entire cognitive domains like memory or processing speed.  The idea that performing a set of discrete tasks will give you a better memory across the board is preposterous, so don’t believe any $50 computer program that promises to train you into having one.  It simply doesn’t work that way.  But that’s not to say exercising your brain is hopeless.

If you want to make your brain into a lean, mean, information-crunching machine, then you need to engage it in a variety of novel tasks.  Go ahead and do that crossword – but also learn a new musical instrument and join a chess club and take up painting and go ride a bike.  The most important thing you should do to exercise your brain is to regularly teach yourself NEW things.  Don’t just get good at the same old things.  Get outside your comfort zone.  I mean that.  Get outside your comfort zone.  NEW and DIFFICULT things help your brain improve.

And keep in mind – none of this is going to make it any easier to remember that new acquaintance’s name at a cocktail party.  If you want to do that, go look up mnemonic tricks for how to remember people’s names and be done with it.


#2. Internet IQ tests give an accurate representation of one’s true IQ.

I’m going to show you a bell curve:

The middle line represents the average (µ) – in this case, the average IQ score.  Each line on either side of that represents one standard deviation (σ) from the average, and is always a set number of points.  The colors represent what percentage of the population falls within each of those bands.

The standard Intelligence Quotient test (IQ test) is designed to fit on such a bell curve.  Researchers have tested thousands of people and then normalized the scores so that the average score (µ) is 100, and each standard deviation (σ) is 15 points away from that.  This means that 68.2% of the population has an IQ from 85–115, 95.4% of the population has an IQ from 70–130, and 99.7% of the population has an IQ from 55–145. 

To rework that just a little, 99.85% of the population has an IQ less than 145.  Keep that fact in mind.
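Those bands fall straight out of the normal curve, so you can verify the percentages yourself.  Here’s a minimal Python sketch (the mean of 100 and standard deviation of 15 match the standard scoring described above; the function name is just my own):

```python
from math import erf, sqrt

def iq_percentile(iq, mean=100.0, sd=15.0):
    """Fraction of the population scoring below `iq` on a normal curve."""
    z = (iq - mean) / sd                      # distance from the mean, in standard deviations
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))   # cumulative distribution function of the normal

# One standard deviation above the mean: the top of the 85-115 band.
print(round(iq_percentile(115), 3))                        # 0.841
# Share of people between 85 and 115 (the "68.2%" band).
print(round(iq_percentile(115) - iq_percentile(85), 3))    # 0.683
# How rare is an IQ over 155?  Roughly one person in eight thousand or so.
print(round(1 / (1 - iq_percentile(155))))
```

The same arithmetic gives you the population counts: multiply that tail fraction by the size of the country and you get tens of thousands of people, not millions.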

I’m going to use myself as an example here.  I have an above-average IQ.  I don’t know exactly what it is.  The last time my IQ was tested I was five years old, and IQ scores of five-year-olds are notoriously unreliable because they vary a lot depending on the environment and the kid’s energy level and so on.  Before I could get my IQ tested as an adult, I learned how to administer an IQ test, and now I’m ruined for IQ tests forever because they depend on me not knowing all the answers and tricks.  Nevertheless, based on other standardized tests I’m confident that I have a moderately above-average IQ.

Every single time I’ve taken an IQ test online (even long before I learned how to give the test), I’ve gotten a score anywhere from 140 to 170. 

That’s impossible.  For me, I mean.  That score would mean I’m smarter than 99.6% – 99.9998% of the American population.  Guys… I’m pretty smart, but I’m not that freaking smart. 

Put another way, an IQ over 155 (the middle of my internet scores) happens for roughly 1 out of every 8,000 people.  There are fewer than forty thousand people with an IQ over 155 in the entire United States.  Do I really think I’m one of the top 40,000 minds in the entire country?  Definitely not.  (And by the way, if I really had an IQ of 170 I’d be in the top 500, which is just laughable – hell, the test frankly starts to break down as a good measure once the numbers get that high.)

I’ve tested some people with IQs this high, and they are incredibly brilliant.  Incredibly brilliant.  I’d kill to be that smart.  I’ve also tested people with IQs around 80, and they’re also pretty darn smart.  I guess what I’m saying is that if you’ve got an IQ above 100, you should be very proud of yourself.  You’re smarter than half your country.  If you have an IQ above 115, congratulations!  You’re in the top 16% and that is very impressive indeed.

But don’t ever trust a free internet IQ test.  These tests aren’t structured like real IQ tests, they don’t probe even a fraction of the cognitive abilities a real IQ test does, they haven’t been correctly tested against thousands of people to get proper averages and standard deviations, and many are designed to inflate your ego because they want you to come back and click some more.  (I don’t want to say they’re always higher than your real IQ, by the way – they’re just not trustworthy and accurate.)

It’s also worth noting that even a standard IQ test probes a variety of cognitive abilities which in many ways have nothing to do with how well you function in society or as a human being.  IQ tests don’t tell you how engaging or charismatic you are, they can’t say whether you could run a company, they don’t test your ingenuity or your perseverance or your ability to empathize with your fellow man.  Those traits are all at least as important as your “intelligence”.  Some of the most amazing people I've ever met had an IQ less than 70.  So who really cares how many points you can rack up with the click of a mouse?


#1. Some people are “left-brained” and some people are “right-brained”.

How can I put this simply?  THERE IS NO SUCH THING AS LEFT-BRAINED AND RIGHT-BRAINED. 

Oh sure, I know what people mean when they say that.  They mean that some people (“right-brained” people) have brains that make them very creative and intuitive and free-thinking and whatnot – while other people (“left-brained” people) have brains that are more analytical and logical and objective.  Generally the people saying this are the ones who happily label themselves “right-brained” and despise “left-brained” people for being stiffs.

Before I explain the problem I’m going to repeat myself, because you can’t lose sight of this: THERE IS NO SUCH THING AS LEFT-BRAINED AND RIGHT-BRAINED.

This ill-conceived notion came about in the wake of observations that certain brain functions tend to be lateralized to either the left or the right brain hemisphere.  For instance, your right brain hemisphere receives sensory input from and delivers motor commands to the left side of your body, while your left hemisphere controls the right side of your body.

Also, in most right-handed people, language is supported predominantly by left brain regions.  For lefties like me, on the other hand (pun not intended), language is more often spread across both brain hemispheres or dominant on the right – which makes sense, since we write out our language using our left hands and the left hand is controlled by the right hemisphere.

That said, the vast majority of brain functions don’t fall cleanly into one brain hemisphere or the other.  In fact, even for the ones in which one side of the brain does most of the work under normal operating conditions, if that side gets damaged the other side can usually pick up the slack.  (Practically the only cases in which this doesn’t occur very smoothly are in the above-mentioned sensation, motor control, and language production functions).  It is likely that each of the two hemispheres is processing somewhat unique aspects of the same information, because there are two of them and what’s the point of doing the exact same thing twice when you could be getting more out of what you’ve got?  (Ah, anthropomorphizing is so easy!)

The “X-brained” problem really started when poorly-informed people began making up a whole bunch of junk about all sorts of brain functions supposedly being fully lateralized when really they aren’t.   Then another layer was added when people started pigeonholing these constellations of “lateralized” functions into personality types, even though those “types” are inconsistently described and don’t match known lateralization patterns and have never been substantiated by any actual science (quite the opposite, in fact).  Finally and without any supporting evidence, people started arguing that they were “right-brained” or “left-brained” because they never understood math or because they were artistic geniuses or because they were ridiculously super-geeky, and all because we needed yet another label to attach to ourselves.

Whether we realize it or not, we seem to like calling ourselves names.  “Right-brained” and “left-brained” are particularly ugly to me, because self-labeling in this way is a blatant unwarranted dismissal of half of one’s potential merits.  It’s like saying “I’m just not good at math.”  I hate that phrase.  I’m not saying it’s not sometimes true.  I’m saying that by stating it, you are giving yourself a bye on having to apply any mathematical effort.  If you call yourself “left-brained”, you’re giving yourself an undeserved escape route off the creative path.  Math is not easy.  Creativity is not easy.  Saying you’ve got a certain type of brain or that you’re just not good at something is fabricating an untrue “unconquerable” biological obstacle that you don’t have a right to give yourself.

So what if you’re not good at math?  All that means is you’ve got to work harder, not that you get to quit.  If you’re not good at sports, or you’ve never been artistic, or you just can’t dance, work harder and actually test the bounds of what you can and can’t do.  There’s a difference between recognizing your limitations and giving up before you’ve even started, and “I’m not very good at X” and “I’ve always been X-brained” are both just ways of artificially limiting yourself.

In short, both “right-brained” and “left-brained” are nothing but meaningless self-deprecating insults, and now you should know better.

21 April 2012

The Top Five Brain Misunderstandings That Will Drive a Neuroscientist to the Brink

I was reminded by a comment on my last post that there are a lot of common misunderstandings out there about the brain.  So here they are, the top five TOTALLY RIDICULOUS things said about the brain that make me want to fly into a homicidal rage when I see them being propagated in popular media:


#5. To grow old is to grow senile.

This one is kind of personal because my research is on human aging.  My grand design when I was fifteen was to cure Alzheimer’s disease, so I’ve been cogitating on it for a while.  The truth is that there is a normal, healthy aging process, and there are a host of separate pathological aging processes that unfortunately tend to get lumped in with healthy aging as What Happens To You When You Get Old. 

Diseases like Alzheimer’s and Parkinson’s aren’t normal.  Dementia is not – I repeat NOT – something that happens to everyone.

This isn’t to say that healthy aging doesn’t involve some declines in certain cognitive functions.  First and foremost, the speed of information processing decreases (meaning older adults are simply slower to do things than they used to be, and we young things must cultivate some patience).  Second, older adults have been reported to show declines in focused attention, meaning they can be more easily distracted.  Finally, older adults tend to show moderate declines in some aspects of memory – for example in remembering certain names, or where the keys have run off to, or coming up with the right word in a sentence.  Healthy aging is NOT, however, associated with the profound impairments of memory seen in diseases like Alzheimer’s, which at its worst robs people of the ability even to remember who they are.  See the difference?

Other cognitive abilities actually improve with age.  Vocabulary and other crystallized knowledge (i.e. knowledge of facts) increase throughout the lifespan.  Empathy and the ability to reason emotionally and socially also come far more easily to older adults than to young adults.

We are a society that fears aging and death, so much so that even words like “old” and “elderly” have acquired a negative connotation.  And misunderstandings like #5 here serve only to help propagate this fearful, antagonistic sentiment toward older people.  News flash – unless you plan to jump off a bridge at age forty, you’re probably going to reach old age someday.  It’s a stage of your life, just like childhood or adolescence or middle age.  And, according to a lot of the people I’ve interviewed at my job, it can be totally awesome if you let it.


#4. You are born with all the brain cells you will ever have.

First off, I just want to tackle this notion of being born with all of anything.  In and of itself, that idea is kind of silly, because I think we all recognize that the version of you that existed a day before the miraculous moment you were born is just about the same as the version of you that existed the day after – minus that whole breathing and eating through your mouth instead of your bellybutton thing.  It’s not like your fetal body is busily building and building right up until you pop out of your mother’s vagina and then all of a sudden you’re in decay mode, you know?

So with that aside, I’m here to tell you that not only are you not done making neurons when you’re born – you’re not done when you hit adolescence, or adulthood, or even that dreaded old age.  You are probably done when you’re dead (but then again, that’s leaving aside the whole philosophical argument about exactly when that other mysterious life-capping moment actually happens and whether your cells still do their thing for a few hours after you’ve reportedly kicked the bucket).

The word we’re looking to investigate here is neurogenesis, or the creation of new neurons.  For a long time we couldn’t find any brain regions that continued to make new neurons on into adulthood, but this was mostly a problem of detection – we didn’t have the right tools to find neurogenesis, so of course we assumed it never happened.  Now that we can properly search for it, neurogenesis has turned up – most solidly in our friend the hippocampus and in the olfactory system, with tantalizing hints elsewhere – and it appears to be a very important part of continued brain maintenance and function.  (Actually, there’s a really cool story here about hippocampal neurogenesis that was still sort of hand-wave-y last time I checked, but even my two fellow brain researchers fell asleep during the talk we went to about it, so I’ve elected not to share it with you fine people.  Count your blessings.)

Here are some fun facts about your brain growth after birth.  Your head is disproportionately big when you’re born, but it’s not as big as it’s going to be when you grow up (otherwise your mum would be complaining a hell of a lot more than she already is).  That increase in brain size from birth to adulthood comes with a slight increase in the number of neurons you have, but more importantly a large increase in the number of connections they make.  In childhood your neurons shoot out all sorts of projections all over the place, and then during adolescence your brain goes through a massive pruning binge to take out the connections that aren’t doing you any good.  Throughout your teenage years your neurons continue to gain efficiency as they get wrapped in layers of cell membrane called myelin, which lets them propagate their signals quickly and reliably.  Therefore, your brain isn’t even fully developed until your early twenties.  You heard me right – your early twenties.  And boys, your brains take a couple years longer to get there than girls’ do.  Explains a lot, doesn’t it?

You can tell this is an old chart because the “cell birth” line (among others) doesn't keep on going out through adulthood.  Also, see how we’ve known for a long time that myelination and synaptic elimination and pruning continue into adulthood.

The upshot is that at this very moment, your brain is still making brand-new neurons.  It’s a very modest amount compared with the total number of neurons in your whole brain, true, but new neurons are still being born.  It’s happening right now.  And it’s going to keep on doing that for a very long time, hopefully.


#3. Smoking copious amounts of pot has no lasting detrimental effect on your brain.

Seriously?  Are you high?  First off, if you’ve ever met a chronic pot smoker, you already know this isn’t true.  You know it.  I shouldn’t even have to back this up with studies, but I’m going to anyway so we never have to have this argument again.

Here’s a very select smattering of the results from papers published since 2010 alone: (A) Chronic pot users perform more poorly on measures of executive function (that means planning, reasoning and decision-making) than non-users, and this effect is worse for those who started before age 16; (B) rats given cannabinoids in adolescence showed reversible impairments on many cognitive tasks but irreversible deficits in short-term memory measures; (C) chronic pot use has been associated with reduced hippocampal volume (although, to be fair, having skimmed that paper I have some concerns about the methods – for a better review of the effects of pot on brain metabolism and structure I encourage you to look at this paper); and (D) luckily, treatment for pot addiction with gabapentin significantly reduced pot use and improved performance on cognitive tests.  Hell, if you want a recent review of cognitive deficits associated with pot use, just read this paper.

People make similar arguments about alcohol, and I’m here to say that large enough quantities of alcohol will totally mess with your head over time.  (This one I have some personal experience with!  I swear I used to be quicker than I am now, and I’m blaming vodka-tonics.)  I won’t take you through a lit review here; I’ll just leave it at Korsakoff’s syndrome.  If you think that’s unfair, fine, I’ll tell you all about run-of-the-mill alcoholics’ impairments in perceptual-motor skills, visual-spatial function, learning and memory abilities, abstract reasoning and problem solving.  Don’t make me go there.

Please don’t try and justify your habits by saying they don’t affect you.  It’s a no-brainer (pardon the expression) that excessive amounts of intentionally mind-altering substances will, over time, affect the brain and its function.  I’m sorry, but that’s just the way it is.  Just embrace the fact that you’re willingly damaging yourself and move on.


#2. Pretty much anything movies ever say about brain disorders and treatments.

As a writer, I struggle with convincingly portraying anything about which I know absolutely nothing.  I understand the plight.  But let me give you three of my favorite examples that will let you know just how disastrously misguided most movies and shows (and news outlets) are about the brain:

(A) One episode of Boston Legal starts with William Shatner’s character animatedly yammering away in an MRI machine spouting off baseball statistics for a pair of scientists who are quizzing him while watching “brain activation” blobs flit across a still image of his brain.  The scene cuts to a doctor’s office where the neurologist (?) informs him he has Alzheimer’s disease.

PROBLEMS (well, some of them): You cannot move more than a few millimeters in an MRI scanner or the image will be totally messed up; no one watches brain activation patterns in real time because (a) it’s not feasible and (b) some guy eyeballing a bunch of blobs is nowhere near as valid as running actual statistics on the data; THIS IS NOT THE WAY YOU DIAGNOSE ALZHEIMER’S, and it’s a disgrace to imply that it is; using MRI to detect Alzheimer’s is only now becoming a realistic possibility, it hasn’t reached clinicians yet, and the focus there is on brain structure, not function; semantic information (like baseball statistics) is one of the few things preserved in the early-to-moderate stages of Alzheimer’s; and you would never flatly tell someone, “the tests show you have Alzheimer’s disease,” because Alzheimer’s can’t actually be confirmed until after death (while you’re still alive you really have probable Alzheimer’s), and a good doctor would break the news more gently, with a caregiver present to help the patient absorb it.

(B) In my favorite example from House, House has been in a bad situation he can’t remember because he was blitzed out on substances at the time.  He gets his buddy to perform deep brain stimulation to jog his memory.  The first jolt allows him to see a fuzzy, silent, black-and-white still image or two, so he says to crank up the juice – and then suddenly he can see the full memory playing like it’s a proper modern movie.

PROBLEMS: I don’t even know where to start.  House always pulls out a single term like deep brain stimulation (DBS) and makes a total mockery of it, because apparently not a single writer on that show has ever taken a class in medical school.  So, to summarize a bare minimum of points – first off, your brain simply does not behave this way when you electrify it.  When electrically stimulated, you will probably perceive some anomalous sensations, but you will not replay the one memory you’re looking for, and it will not be a 1920s silent film at low voltage and a modern movie at high voltage.  Also, memories never play back perfectly like that anyway, and you all know it.  Finally, DBS would never be used for such a purpose – mostly it’s an extreme therapeutic option for people with Parkinson’s disease, severe depression, and certain other disorders, and no one has a good handle on why it works.

(C) Any movie in which “getting amnesia” means “losing all knowledge of yourself and sense of who you are.”

PROBLEMS: This phenomenon of losing yourself is real, and is called a dissociative fugue.  It is psychological in nature and usually occurs in response to a deeply traumatic situation (not a blow to the head).  It’s also transient, lasting a few days or weeks at most.  Amnesia is a totally different thing, best characterized by the movie Memento.  It can be caused by head injury, surgical resection of temporal lobe areas, oxygen starvation, and a number of other insults.  When a neuroscientist (or even Wikipedia) talks about amnesia, by default they mean anterograde (forward-looking) amnesia, in which patients cannot form new memories (remember HM…?).  These patients often, but not always, also show retrograde (backward-looking) amnesia, meaning that they cannot remember past events.  Amnesia is not transient.  These patients will live their whole lives in never-ending cycles of few-minute increments.  Also, they will never forget facts about themselves like their name and birth date and where they grew up, because this is a different type of memory unaffected by amnesia.  This whole movie problem is really one of semantics – how freaking hard would it have been for that first major movie exec to do a little research and say, “Hey, this isn’t amnesia, it’s a fugue – let’s say this guy has a fugue instead”?  They could have avoided decades of ridiculous misunderstanding! 

And let me just make this clear: You will never – under any circumstances – get hit on the head, lose all memory of yourself and your past, and then regain it miraculously a few weeks later.  Never.

(Update: Dammit, okay, I thought of a circumstance.  If the blow to the head is part of a deeply traumatic situation, you could enter a fugue state - but that's not due to the head bump! My point stands.)


#1. You only use 10% of your brain – imagine how much smarter you would be if you used 100%!

You know what happens when you use 100% of your brain?  EPILEPSY.  Freaking epilepsy.  That’s essentially what a seizure is.  If you want to have a grand mal seizure, go right ahead and “use 100% of your brain” all at once.

This “10%” notion gained credence because when we first started imaging human brains (using methods like functional MRI), researchers found that the little colored blobs lighting up during performance of any given task covered only a small portion of the total brain.  Why?  Because different parts of our brains do different things.  During verbal tasks, language areas light up; during reasoning tasks, reasoning areas light up… you get my point.

Suggesting that we should find a way to use 100% of our brains is like suggesting we should use 100% of our cars when we drive them.  It would obviously be totally efficient if your car drove forward and backward at the same time in all gears while also honking the horn and wiping the windshield and flashing the lights and blaring every possible radio station and opening and shutting the doors.  That’s clearly the best way to get to your local grocery store.  

Our brains are, in some ways, no more than a conglomeration of specialized little parts which each do their own mental tasks.  At any one time, you “use” about 10% of your brain to—

No, you know what?  I can’t even let that oversimplification stand in good conscience.  This whole “10%” thing is 100% bullshit.  What does that even mean, “using 10% of your brain”?  I dare anyone to quantify with our current methods (A) how much of our (B) brains we (C) use at any one moment.  All three of those points need a better definition before you even start down that road, and anyone who takes the first step on it is asking exactly the wrong question based on a host of incorrect assumptions – for one, that the brain is a singular computational entity operating at X capacity like a damn CPU.  The idea is utter crap on every possible level.

Actually, that CPU bit brings me to an important historical point – brains have always been described, by default, in terms of whatever technology is hot at the moment.  In this day and age, we equate them part and parcel with computers.


It’s such a handy metaphor, right?  Like they were made for each other.  You know what people said your brain was like in the old days?  A switchboard.

Extra points to the kids who know what a switchboard is.

We don’t have a good sense for what the brain is really doing, so we associate it with metaphors and then generate silly notions based on what we know about those metaphorical devices.  The point is this: Our brains are well optimized just the way they are, using different bits for different jobs.  Just know that you do NOT want all your neurons to fire all at once.  You will immediately die.

15 April 2012

Déjà vu

Déjà vu is one of the coolest phenomena ever.  I know I already said that about change blindness.  I also said I’d change my mind.

I love déjà vu in the same way that I love sneezes and yawning and blind spots and dreams and migraines.  These things all make me very happy.  They’re small reminders that my brain is still there, it’s organic, it does things I can’t predict.  They point out that we are laughably unaware of the mucky mushy underpinnings of our lofty cognitive musings.  Déjà vu makes us remember we’re only human.

Well, maybe that’s not true for everyone.  Déjà vu means different things to different people.  What it certainly is not is a literal re-experiencing of a moment that happened in exactly the same way at some unspecified previous time.  That’s an incorrect interpretation of the phrase, because even though the direct translation of “déjà vu” is “already seen,” the definition of the term includes the notion that one is re-seeing something one knows one couldn’t possibly have seen before. 

Now, some say that déjà vu is some special form of extra-sensory perception, or it’s a signal the Matrix has been altered, or it tells us things about our past lives, or it’s some sort of breakdown between all the versions of our lives we’re simultaneously living.  Here’s the thing.  Déjà vu is already a beautiful miracle without making it anything paranormal or supersensory.  It’s a truly incredible process and a delightful experience.  This may sound weird coming from an urban fantasy writer, but I just don’t like the taste of forcing supernatural elements where they don’t belong.

Scientific theories posit numerous explanations for déjà vu, most having to do with the medial temporal lobe (remember the hippocampus…?).  Keep in mind that researchers very rarely have the opportunity to study déjà vu, given how transient and unpredictable it is – in fact, apparently only about 60-70% of people report having ever experienced the phenomenon at all (yet another variable thing I thought was common to everyone!).  So even some of the so-called “scientific research” discussed here tends to wax philosophical.

According to these researchers, déjà vu may occur because:
(1) Some aspect of the current experience excites the brain pathways that produce a sense of familiarity with the event, but not those that support proper recollection of a previous event, creating a disconnect that makes us feel like we know it without being able to pull out exactly when or where we experienced it before.  
(2) Our brains probably store memory in such a way that a small stimulus (a smell, a color) can trigger the incomplete recall of a real but different memory… and in some cases this might give us a sense that the current experience has already been experienced.  (The first part of this is certain – it’s the second that’s up in the air.)
(3) Our two brain hemispheres might sometimes get slightly out of sync when processing an input, such that one side gets that direct input fractions of a second earlier than usual and therefore misinterprets the added information from the other half of the brain as a repeat of an already-experienced memory. 
(4) We “experience” many types of things in media like books and movies, which allow us to feel strong familiarity for things we’ve never actually experienced in real life – and when we see it in real life for the first time we might accidentally think we’ve already seen it. 
Alternatively,
(5) Some researchers believe that precognitive dreams (i.e. dreams which predict future events) may create a sense of déjà vu later on when they are properly experienced.  I’ll tackle this one shortly.  
(6) And lastly – and this is the least controversial of the theories because it’s the most testable – déjà vu can occur as a result of an epileptic event, like a seizure, in the medial temporal lobe. 

(There are plenty of other theories I’ve decided to let you discover on your own, seeing how long that paragraph has become already.)

I like aspects of a lot of these, but I want to put my money down on the first and last – the disembodied familiarity thing and the seizure thing.

There’s a lot of evidence that one’s concrete knowledge of a previously-experienced event (call it recollection) and one’s comparatively vague sense of familiarity with an event are different things that are processed differently by different brain regions – recollection by the hippocampus, and familiarity by… well, parahippocampal and/or perirhinal cortex, depending on who you talk to (they’re both structures basically adjacent to the hippocampus).  In the rare déjà vu experience, it’s possible that something about the current environment differentially stimulates the familiarity and recollection structures, creating a detached sense of familiarity.

Notice that in the previous sentence, I said it was something about the external environment causing the brain activation.  But it’s also possible that your brain just does this stuff to itself, without any outside help.  For example, people with temporal lobe epilepsy sometimes report feeling déjà vu right before a seizure strikes.  But you don’t have to have epilepsy to have epileptiform brain activity, and in fact every single person on the planet has endured some level of seizure-like activity in his or her brain.  Basically, every once in a while some tiny group of neurons goes a little haywire and activates for no good reason, but it’s natural and nothing to worry about.  Mostly these events don’t impact our conscious lives at all.  But maybe, sometimes these events occur in just the right place at the right time, activating our familiarity structures out of the blue, and suddenly the whole world around us feels like we’ve done it before.

Regardless of whether it’s externally or internally generated, it makes sense that déjà vu is an innocent brain mistake which makes us feel something that’s not really real.  It helps explain why we sometimes feel recursive déjà vu – the sense that we’ve even had this particular sense of déjà vu before, and that we’ve had a déjà vu of that déjà vu of a déjà vu, and so forth.  That’s just our brain accidentally and repeatedly triggering a feeling that this event has occurred before when it hasn’t.  So I’m pretty darn confident that when you experience déjà vu, that exact experience has never happened to you before – no matter how much you want to believe that.  That want, that need – that’s just your brain talking.

Which brings me to precognitive dreams.  I will certainly insult people with my opinion about this, but I’m willing to take that hit and say that the ability to actually foresee future events in a dream is literally impossible.  Let me rephrase that so I can be totally clear – precognitive dreams cannot be the true experience of a real-life event before it happens.

There are just too many problems with the idea that dreams can be pre-plays of real events (not least the violation of causality).  I’ll name just a few.  (1) The vast majority of things that happen to us happen repeatedly, so it’s practically impossible to avoid dreaming up scenarios which will be similar to later life events; also, “similar” is not at all the same as “identical”.  (2) If you compare every dream that you’ve ever dreamed with every event that has ever happened to you, you will absolutely come up with matches, and it has nothing to do with foreseeing anything.  (3) Our brains can make us feel conviction about things we actually can’t remember very well, so when those similar real events happen we can be duped into accidentally overwriting our dreams to match the events (someday I’ll write a post about this point).

Okay, enough of that.  I don’t want to give the impression I don’t believe dreams can be predictive.  Brains are prediction machines.  Especially human ones.  It’s arguably what we do best.  So it’s totally reasonable that your brain makes very, very good predictions about the future while you’re dreaming, using information you might not consciously piece together while going about your daily life.  I am a happy believer when someone tells me that they always dream of a white elephant before someone dies – so long as they also tell me the white elephant is their brain’s way of assimilating a host of (subconscious) clues indicating that person was about to die.  Such a dream would be entirely plausible, and maybe even probable.

What I’m saying is that déjà vu serves as a reminder that our brains are doing a lot of things behind the scenes.  In fact we don’t have conscious access to the majority of the things our brains do.  (Go ahead, try and stop your heart just by thinking it.)  When magical things like déjà vu and prescient dreams happen to us, we can congratulate our brains for being so gosh-darn brilliant without us even knowing it.  They really are capable of miraculous feats.


P.S. I got really sick of seeing the phrase “déjà vu all over again” in article titles as I looked all this up.  I used to love saying that and now it’s tainted for me forever.  So sad…

01 April 2012

The object of your dreams

Every single one of my writing ideas has surfaced in a dream.  The making of Canine, for instance, started with a dream about my departed dog – well, about a sentient dog-creature which I identified in the dream as both my sweet girl but also a male wolf-dog.  You know how dreams go.

I had occasion recently to question the nature of dreams – specifically, dream protagonists.  And I was shocked to find that not everyone dreams the same way about dream protagonists.

I want you to think about all the dreams you’ve had – recently or over your life, I don’t care.  Actually, it would be interesting to consider whether your dreams have evolved over time, too.  But I want you to think about the main characters in those dreams, and to assess the following three qualities (please keep in mind that it is feasible to have all possible combinations of these qualities as you consider them):

1. Identity.  Who is the protagonist in your dreams?  Is that person yourself, or someone else?  In other words, what does your protagonist look like?  (If you answer, “Well, it’s like a version of me that does things I would never or can’t do,” then for this purpose, your answer is “yourself”.) 

2. Agency.  Are you the protagonist, or are you more like a camera following someone else?  Do you identify that protagonist as yourself, no matter what they look like?

3. Perspective.  Are you seeing things from the protagonist’s perspective (first-person) or are you watching the protagonist from the outside (third-person)? 

I ask you these questions because I was really surprised to hear the answers from my friends.  Many people, when I’ve queried them, have said that (1) their real self is the only protagonist they dream about, (2) they always identify as the protagonist, and (3) because dreams will be dreams, they see it in first person or in some combination of first and third person.

I thought this was CRAZY. 

Like seriously crazy.  Since I can remember, I’ve been dreaming either as myself or as other people or animals or characters (I told you about my dream where I was Raphael from the Ninja Turtles, and in the first dream I remember I was a deer, and I’ve been my stories’ protagonists and strangers I don’t recognize).  Generally when I dream as a woman, I’m myself – but even that changed a couple weeks ago when I finally had a dream as another woman.  Most of the time I identify as the protagonist, but I’ve had dreams where I’m just following someone else around, like I’m the camera.  And lastly, I’m pretty consistent with everyone else in that I have dreamed in the first, first-and-third, or third person (it’s ever-changing within single dreams, usually).  I still have a hard time believing my friends have never had a dream like that.

The point it really hammered home for me is that my experience is not everyone else’s experience.  Of course I knew this, but not in so concrete a way – and it took me 26 years to figure out that not everyone has dreams in which they are a different sex or species.  I wonder now how many other assumptions I’m making about so-called “human experience” that are just my personal idiosyncrasies.  It’s really kind of concerning.


Please fill up my comment field with your answers to my above three questions, because I would really, really like some more data about this.  You may answer in proportions – e.g. you dream as yourself 80% of the time but someone else 20% of the time, etc.  Thank you!