27 January 2015

3 things I wish I’d known about sexuality and identity

The transition from childhood to adulthood can be a confusing time.  (Do I win some kind of award for understatement of the year for that?)  Sexuality and gender identity can be especially hard to understand and deal with if a kid suspects their answers aren’t totally aligned with the norm.  This stuff can involve a lot of questioning and soul-searching, and it’s hard to know what to ask, or how, or of whom, when you’re only just emerging into a world that’s willing to give you a proper framework for it rather than tell you you’re too young to be thinking about that stuff yet.  You can end up saying a lot of stupid things to a lot of people you care about along the way.  And however genuine and necessary those statements and questions were, years or even decades later you might look back on those conversations, realize the other person still remembers what you said, and wish you could call them and explain yourself now that you understand what you really meant.  (But it’s not like I’m speaking from experience or anything, here.)

We’re getting better about helping kids through this stuff now (I hope).  Acceptance of homosexuality is on the rise; the word “transgender” appeared for the first time ever in the President’s State of the Union address; Gamergate was a perfect demonstration of how strong our movement for women’s equality has become (and how far we still have to go).  So maybe this post is a little old-fashioned now – but as my kids grow up, I fully intend to make sure these adult messages get passed along to them in a way they can understand.  Here are a few of the things I wish I’d known around, say, the age of twelve, in the interest of avoiding some of those acutely embarrassing memories I still harbor:


1. Gender is a concept, and it’s complicated

Here is a fact which bears repeating because of how confusing it is for most people when they first encounter it: Sex, Gender, and Sexuality are three totally distinct things.  Your sex is (more or less) the set of genitalia you were born with.  Your gender is (more or less) what’s in your head – your feeling about what type of person you are.  And your sexuality is, to put it simply, what you’re into. 

The idea of gender being in your head is especially hard for people.  It’s infinitely depressing as I start down this path of raising a child to hear how early and often people ascribe masculine or feminine personality traits to their children based on their sex.  It’s so deeply rooted in our culture to have only two genders and to have those be defined entirely by genitalia that many people wouldn’t dream of an alternative.  And it’s destructive even for cisgender kids – kids whose gender identity does happen to coincide with what’s in their pants.  It makes young boys feel they can’t express their emotions through tears.  It makes young girls feel they have to love pink.  It creates all our favorite damaging stereotypes about adult men and women.  So imagine what it does to kids who don’t feel as canonically masculine or feminine as their genitals dictate (and, if we’re being honest, that’s a lot of us).

For me, the biggest revelation I came upon way too late was the notion of male privilege.  It was never presented to me explicitly how fundamentally differently society treats men and women.  I saw it all the time, of course, but I lived it, too – and when you live it, it’s hard to see just how pervasive and persuasive it is.  So at the time, it was hard to see that for me (and I stress that strongly, here), a yearning for male privilege was a large part of my sometimes-expressed wish I’d been born a guy.  Certainly not all of it – but a huge, undeniable part.*  I didn’t understand at the time that gender is nothing but a construct, both a social and a personal construct – and that because it is social in addition to being personal, it’s very easy for outside forces to influence a person’s thoughts about what they’re really feeling.

What I would say to Young Me now: “You may think your decisions about gender are entirely your own, but they’re not.  They can’t be, because you’re a social creature and you belong to humanity.  You don’t exist in a vacuum.  The best you can do is think long and hard about it, quiet your soul, and ask yourself how you feel, how that feeling impacts your daily life, and whether you need to make an external change to reflect your inner self.”

*This is such a critical point that I don’t want it to get lost: This was my experience, and it wasn’t all of my experience.  Everyone has a different journey and my road, thank God, has been a relatively easy one.  I applaud the courage of those whose battle with traditional gender roles is far more personal than mine.


2. Neither your gender identity nor your sexual orientation has to be set in stone

I’ve said before that our society (our species?) puts a lot of stock in labeling.  We want you to tell us you’re [insert-label-here], and we want you to stay in that bucket.  Jumping buckets just might be some kind of sin.

In high school I heard a self-identified lesbian spewing all sorts of hatred for a girl who had previously declared herself gay and then dated a guy.  I sat quietly by and let this person rant.  And I thought, “Wow, good thing I’m straight and don’t have to worry about this!”  Ha!  There are so many things wrong with that memory.  This person’s hate for someone who changed their mind – or didn’t change their mind at all but didn’t feel the need to tell an outside person all the minute details of her inner desires!  The assumption on my part that I was straight despite all the still-building evidence to the contrary.  My fear of other people’s opinions about my sexuality.  My belief that a person could only ever be one thing and any “deviations” along the way were nothing more than an attempt to figure out what that one thing was.  My inaction in the face of her hate.

It may not be the most relevant use of this quote, but whenever I think about these kinds of things I’m reminded of Maya Angelou: “I did then what I knew how to do. Now that I know better, I do better.” 

The more I’ve learned about gender and sexuality, the more comfortable I am in saying simply, I like people.  I can’t tell you what kinds, really, I just know it when I see it.  Or get to know it.  Or hear it from across a crowded room.  And tomorrow I might not feel the same way.  And it’s none of your damn business anyway.  And your interests and self-identity are none of my or anyone else’s damn business, either.

What I would say to Young Me now:  “If it helps you to think of your gender and sexuality in terms of labels, then by all means do it – but don’t feel you have to hold onto those labels forever.  Wear them while they suit you.  And if anyone tries to push you into donning an outgrown or ill-fitting coat, push back.”


3.  It is NEVER okay to belittle someone else’s experience

I’m ashamed to admit I’ve repeatedly had to learn this one the hard way.  As a simple and obvious example, even into high school I used to call all sorts of things ‘gay’.   I said it even though I had an amazing friend who early and often yelled at me for it.  And I’m still learning how to see the world from other people’s eyes.  Even today I caught myself wanting to defend my hometown as I read about someone else’s awful experience in it.  #NotAllTucsonans, style of thing.  It was pathetic.

Acceptance comes up often in discussions of sexuality and identity.  We all view the world through our singular experience, and by definition that makes it difficult to get into the mindset of another person.  (Neuroscience tells us we get better at this as we get older, thank goodness!)  We’re called upon again and again simply to trust that someone else is sincere when they tell us they like people who identify as the same gender, when they say they’ve never felt comfortable in the bodies they were born with, when they carefully explain why the word ‘gay’, when used pejoratively, is offensive to them.  And the moment we persist in arguing they must be wrong or they shouldn’t be such a baby about things, we invalidate them and their hard-earned sensibilities.  We’re saying our stupid comment matters more to us than the reality of their everyday experience.  We’re letting our singular, myopic view of the world dominate the dialogue.  Wouldn’t it be nicer and easier all around to just give people the benefit of the doubt and assume they’re intelligent people with the ability to decide for themselves who they feel like and whom they like?  Wouldn’t it be better to embrace their experience as another shining example of the vast spectrum of human individuality?

What I would say to Young Me now: “You will make mistakes and offend other people.  That’s life.  So when someone tells you they’re offended by your words, take the time to figure out their side.  And if you still insist on saying what you’re saying, know that you’ve just given your comment priority over someone else’s feelings.”


I hope as my kids grow up and figure these things out for themselves that I can direct them toward information like this to help them build attitudes of acceptance and self-assurance.  I know as their mom I’m more or less a background voice to the tapestry of friends and classmates and media outlets that will no doubt contribute to their worldviews far more than I will.  But maybe, just maybe, I can at least give them a little bit of a leg up.


20 January 2015

4 Things That Aren’t True About Anxiety

You may be able to tell from my earlier posts that I suffer from the occasional panic attack or two.  Life often feels like a balancing act: caffeine intake, exercise, sleep, food choices, movie selections, goals accomplished… you get the recipe wrong, and you’re asking for an attack.

But my experience isn’t everyone’s.  I read so many lists of things that “every” person with anxiety supposedly experiences, that “every” friend should know if they want to help… and sure, some of it rings true, but so much of it has nothing to do with me and it drives me crazy to be defined by others’ too-broad lists.  It’s a symptom of a greater evil – a desire to classify and cure all forms of mental illness, to compartmentalize and, intentionally or accidentally, to marginalize.  So here’s a list of what, to me, is NOT true about anxiety.


Myth #1: The symptoms of anxiety are common to all anxious people

Anxiety runs the gamut from a mild difficulty around which one can still function to a crippling daily bombardment of terror.  It can come suddenly out of nowhere or predictably in certain circumstances, and it can be about any number of subjects or even nothing at all. 

For years I convinced myself I didn’t really have an anxiety problem because I couldn’t easily pigeonhole it into one of the classic anxiety disorders defined by the Diagnostic and Statistical Manual of Mental Disorders.  My attacks weren’t about nothing, they were always about death.  I could trigger them myself if I thought about death long enough.  They didn’t happen very often or over a specific period of time.  And not being able to define my problem in the context of an external diagnostic system made me feel like a phony.  I didn’t have anxiety and I didn’t deserve to complain about it.  I was just being silly.  And that’s a bullshit thing for a teenager to have to feel when they’re sitting there in the shower bawling their eyes out because one day they will, inevitably, die. 

This is a truism that applies to so many different aspects of life and self-identification: External sources don’t know your experience.  They can’t.  If you’ve got anxiety, you’ve got anxiety and no manual is going to justify or deny that for you.  What those manuals can do is help inform you of the wide variety of methods you can use to help yourself out of whatever you’re experiencing.  Which brings us to another myth…


Myth #2: There is one best way to help all people who experience anxiety

The other night I told my husband we couldn’t watch the Nostalgia Critic review of the movie Casper.  I didn’t have to say why.  He knows me well enough by now to leave it alone and watch it on his own time.  So tell me – exactly how many other people with anxiety would this no-watching-Casper treatment apply to?

I hope it’s obvious that a mental hardship (not necessarily a disease, a disorder, or even a problem) with symptoms as broad as anxiety would also have a wide range of legitimate treatment options.  And the most important part should go without saying, that the person with the anxiety is the one with the final word on what they will and will not allow for treatment and help. 

This is something we often miss with mental hardship.  It’s your brain that’s “broken”, so it seems reasonable to assume a broken thing can’t be trusted to fix itself.  And in certain extreme cases, that may be true.  But it’s a rare case of anxiety that doesn’t come and go.  A person not currently in the throes of anxious grief is perfectly capable of explaining how they want to be handled when they are having a problem.  The phrase “What can I do to help?” is a powerful one.  Use it.  However genuine and good the place you’re coming from, you can really, really mess with an anxious person in the middle of an attack by doing and saying the wrong things.

But if I can presume, a word of caution to fellow anxious folks: you know yourself best, but please be open to others’ insights.  The best help I got was from a counselor I was almost too proud to go see, even for the two sessions I really needed.  I thought I could take care of it myself and no one could tell me anything I hadn’t already considered.  I knew in my bones the book was closed on the afterlife, and the inevitability of my cessation was crushing.  And all the counselor had to say was, “I know you think you know, I know it feels concrete to you, but what if you can tell yourself, logically, it’s impossible to know?  What if you can force that room for doubt?”  At the time I smiled and nodded and thought to myself that was a nice load of crap.  Now I tell myself this every single time the panic looms.  It still feels like a lie and it probably always will, but I humbly admit I don’t own the answer, and it helps.  Help can come from anywhere.


Myth #3: When people are anxious, it shows

That depends.  For the big stuff, probably you’ll see some classic symptoms.  Maybe it involves panic attacks, maybe it’s physically apparent – but maybe it’s not.  To beat to death a tired cliché… that whole iceberg thing.  Ten percent above water and all that.  You know it.

I don’t like getting people all riled up about my anxiety.  It’s tiresome.  So unless I really need a hug, I’m likely not to mention how long it takes me to calm down enough to get to sleep some nights.  If someone never looks anxious, always seems bubbly and happy, doesn’t seem to have a care in the world… it’s still not reasonable to assume they never have a moment of oh-my-God-I’m-going-to-die-right-now pants-soiling fear for no reason.  You just never know.  And it doesn’t make their troubles any less real or worthy of regard if they do open up to you about it.


Myth #4: Everyone who has anxiety wants to be fixed

Confession: I do.  I really do.  But not with drugs, and not at the expense of anxiety’s benefits – that high-strung, “dancing on the edge of something awesome” kind of feeling.  I don’t believe it’s possible to simply excise the part of me that breaks down in terror at the thought of death while leaving the rest of me intact.  It’s a lament I hear often with various forms of “mental illness”: No treatment, please, if it means any kind of loss of “self”. 

That’s the real bitch about it.  Much as we may want to, we can’t ignore that the problem is in our brains and our brains are ourselves and our selves have been built with anxiety (or whatever) as a fundamental component.  So sure, I’d love for mental illness to be taken as seriously by both the medical community and society at large as a tumor or diabetes is – but I’m tired of the concomitant assumption that viewing it in that light means it needs to be treated with the same sterile hand.  It’s been my experience (and I’m positive I’m not alone) that like it or not, my brain is myself and altering that means altering me, which seriously limits the options on “fixing”.  I also know dozens of people who feel that life is much better, and they feel more like themselves, when they can manage their symptoms with a pill.  Both are fully understandable, legitimate approaches to dealing with mental illness. 

Understanding goes a long way toward helping someone with their anxiety.  If you see a problem in someone else, maybe ask them about it and see if they’d like you to do anything or even just be there for them.  If you see it in yourself, find someone to talk to who will listen.  That last bit’s critical.  And most importantly, don’t let this post or any other external source make you think your problems aren’t worth tackling.


26 May 2013

Something to Leave Out of Your Carry-On Luggage

I had the most bizarre experience in the airport security line today.

The security guard scanning the carry-on luggage asked to run my laptop bag through the machine a second time, which I thought was a little odd – there was almost nothing in the bag after I took my laptop out.  The second time through, she stopped the bag and hailed another security guard to search it.  She pointed at the screen to show him what to look for, and he nodded and brought the bag over to me.

By this time I was thoroughly confused.  The guard asked if I had anything sharp in the bag that he could cut himself on and I told him no, not to my knowledge – and what a weird question, right?  I’ve never been asked that when my bag’s been searched before.  He searched through all the main pockets and came up totally empty, as expected, and then went through them again for good measure.  I was getting a little annoyed at his persistent searching.

But then he turned the bag over, opened the back pocket, and pulled out a 10-inch butcher knife!

Let me tell you, a giant unfamiliar-looking knife is not something you want to see come out of your laptop bag at the airport.  I think my heart stopped.  The guard looked at me like I was crazy, showed the knife to the first guard and then to a third, and asked whether I needed to be brought in for further questioning, all while I blathered that I didn’t know where it came from or how it got in there, and absolutely yes, please go ahead and confiscate it.  The third guard assured me I didn’t need to be interrogated, but of course they’d be taking the knife away from me, and to my infinite relief they let me go on my way.

It took me about a half hour of racking my brain after that to piece together a story about the knife.  It looked kind of like a knife from work in Arizona, and it’s possible I put that knife in my laptop bag to keep from stabbing someone on the way to cutting a birthday cake in the lab.  And then I totally forgot about it for many, many months up to and including the moment it got pulled out of my bag by the very worst possible discoverer.  I still don’t particularly recall having done this, but it sounds plausible and almost familiar.

I just want to point out that this means I’ve taken that knife with me on at least one other flight before this one, if not a few more.  I find it a little concerning that it hasn’t been found before now.  But if I could I’d give those guards today a raise.

06 October 2012

Things Short-haired People Don’t Understand

This should really be titled, “Husband – JUST READ THIS AND QUIT NAGGING ME.”  But I figured I’d write for a more general audience, so here you go – things you people who’ve never had long hair need to know about my lifestyle, because sometimes it just doesn’t seem to get through.

1.  I need more shampoo than you.

Your hair is like ONE INCH long.  At most.  In some places it’s shorter than your eyebrow hair.  You don’t shampoo your eyebrows, do you?  Why should you need to shampoo that part of your scalp at all?  But even so, for the sake of this discussion let’s assume your hair is one inch long, and mine is ten inches long.  I will need to use – you guessed it – TEN TIMES AS MUCH shampoo as you to get the same amount of coverage on my hair.  So don’t be telling me I’m using too much shampoo.  I’m using twice as much shampoo as you.  Three times at most.  So really, YOU are the one using too much shampoo. 

2.  I should not shampoo every day.

The thing about short hair is that it has not been on your head long.  ALL of your hair on your entire head has been there for less time than the bit of hair that’s two inches away from my head.  I did extensive, thorough research on this subject (thank you, Answers.com) and have determined that hair grows at a rate of about half an inch per month.  That means ALL YOUR HAIR has been around a maximum of, say, three or four months.  Mine?  These ends have been with me for two YEARS.  So while you can happily destroy your hair by shampooing away replenishing scalp oils every single day because you’re not even going to see that hair half a year from now, I NEED that oil to maintain this mane I hope will still be treating me right two years down the road.  I don’t want some kind of split-end mutiny on my hands.  Do you even know what split ends are?  Can you get a split end in three months?

This applies to hair dye too, by the way.  When I go to dye my hair, I’ve got to worry about how it’s going to affect my look two years down the road.  How old am I going to be?  What will I be doing with my life?  Will this affect any future job I could try to get?  Because some colors are easier to dye over than others.  I’m just lucky I can go back to brown whenever I have to – I salute all you brave blondes out there.

3. Your bad haircut is as nothing compared to my bad haircut.

Again, if you get a bad haircut, the absolute longest you have to worry about it is three months.  And I’m pretty sure in two weeks it’s going to settle in just fine and you can get it fixed, no problem.  You can move on with your life.  Two weeks isn’t even long enough to really notice the roots under my dye job.  When my hairstylist messes up, she chops off three extra INCHES, not millimeters – and we’ve already discussed that it can take months to recover from that kind of error. 

Now, you may be saying, “But you still have a lot of hair to keep cutting and get the shape right.”  NO.  If I wanted to take another three inches off my hair, I would have done it the FIRST time.  Now I have to wait another SIX MONTHS to get it even to the point where it SHOULD HAVE BEEN WHEN I WENT IN.  That’s HALF A FREAKING YEAR.  If I want to cut more off and reshape it, I basically have to resign myself to an entirely different hairstyle and look.  Maybe I don’t even have the right clothes or earrings to pull off that mop.  I could have to invest in a whole new wardrobe.  So don’t tell me your awful haircut is worse than mine.

4.  My hair takes a lot longer than yours to get pretty every day.

If I want to do my hair and make it look actually pretty, it takes me an hour.  I have to do it in layers, one row at a time, getting each section right before I move on to the next part.  This is a complicated work of art I’m sculpting, here.  I’ve watched you short-haired people “doing your hair.”  It takes like ten minutes.  It doesn’t even involve any kind of iron.  So if I say I need to get ready to go out somewhere, you can assume that I need to get my hair done, and that’s going to add an hour to whatever time you were estimating for yourself.  And that’s assuming you’re also doing your makeup like I am.  No?  No makeup?  Add another half hour.  Being beautiful takes WORK, bitch.

5.  A ponytail is a legitimate hairstyle.

I don’t want to take an hour out of every single day to get my hair looking gorgeous.  You’ll be lucky if you get that once a week.  Once a week for me is about equivalent to all the time you’ve racked up over the week doing your hair daily, anyway.  If I don’t do my hair up nice, though, it’s utterly hideous because I also have curly hair (and that is just a whole other rant for later).  It’s not only ugly, it gets in my way.  I can put up with it getting in my way if it’s pretty, but if it’s going to be hideous too then that is just unacceptable.  So if I shove my hair up in a ponytail all day long, DO NOT make fun of me and my childish-looking hairstyle.  It’s convenient and comfortable.  End of discussion.

I hope you’ve learned something.

24 July 2012

Is it a healthy sense of caution if you’re constantly envisioning your own death?

I’m on a plane over New Mexico right now.  (Well, not right now right now, when I’m posting this or you’re reading it; I mean, maybe I am.  It’s just a highly unlikely coincidence.)

I’ve been saying to a lot of people lately that I’m not afraid of flying, and I see now that I was so, so very wrong about that.  I thought I was telling the truth.  But sitting on this plane right now, I don’t actually think I’ve gone a whole minute without being fully aware of a pervasive sense that I’m stuck in a poorly ventilated tin can death trap.

Let me give you (future me who’s reading this and trying to convince herself she’s really not afraid of flying) a few examples that I’ve come to realize do not connote a healthy level of fear:

1. As I sent that last-minute text to my husband before I had to turn the phone off on the tarmac, I wondered whether he would think to post my message to my friends on FB when I died so they could know the last sweet sentiment I said to anyone I loved.

2. I’ve repeatedly cycled through all my dozens of plane crash stories, trying to figure out which one best applies to my current flying environment and whether I’d die if any one of a wide variety of malfunction or human-error scenarios occurred.

3. When we lifted off I was looking out the window watching the city get smaller and smaller, and with every minuscule lag in acceleration (typical of even a successful takeoff), I was Zen-preparing myself to watch that ground start to tilt and get bigger again.

4. I was pretty convinced that the drawn-out grinding sound I heard on the ascent was an engine failing.

5. I practically ran back from the bathroom because there was a small jolt of turbulence and I needed to get back to the safety of my seatbelt before a panel ripped off the plane and I got sucked out the hole like that one lady did in that one Cracked article I read that one time.

6. When we landed on my first flight we turned into the airport at an angle, and all I could imagine was the plane barrel-rolling out of control and plummeting into the earth.

7. Whenever we went into a cloud I was ready for the moment another unseen plane collided headlong with ours, and I couldn’t decide just how likely I was to even know what hit me in the fractions of a second it’d take for me to get crushed or exploded to death. (I mean in a head-on collision, our plane and the other plane would each probably be going ~500 mph for an effective speed of ~1000 mph, or 450 m/s, and if our plane was in the neighborhood of 100m long, then at row 25 I’d be dead in about a tenth of a second and it’s arguable whether all of that sensory information could manifest a conscious acknowledgement in that time, although I have a sinking feeling I might get to enjoy a few milliseconds of perfect imminent-death awareness. P.S. that is why you learn algebra, my friends.)
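For anyone who wants to double-check my doom-math, here’s the same back-of-the-envelope calculation as a few lines of Python.  The seat position is my own assumption (call it roughly 40 meters behind the nose); the post only commits to “row 25” on a plane “in the neighborhood of 100m long”:

```python
# Back-of-the-envelope check of the head-on collision arithmetic.
# Assumptions (mine, not aviation fact): each plane at ~500 mph,
# row 25 sitting roughly 40 m behind the nose.

MPH_TO_MS = 0.44704  # exact miles-per-hour to meters-per-second conversion

closing_speed = (500 + 500) * MPH_TO_MS   # ~447 m/s, i.e. the "~450 m/s" above
row_25_distance = 40.0                    # meters behind the nose (a guess)

time_to_row_25 = row_25_distance / closing_speed
print(f"{time_to_row_25:.3f} s")          # lands around a tenth of a second
```

Which is to say: yes, the “about a tenth of a second” figure holds up, give or take my guess about where row 25 sits.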

I know that air travel is safe.   I know this.  I know that even if problems occur I’m likely to make it out just peachy.  But none of that matters when you’re dealing with a phobia.  Talking yourself out of a death phobia is pretty useless.

And I still fly.  Regularly, even.  At the beach I still swim out into water that’s probably deep enough to hold great white sharks and I inadvertently do my best injured seal impression trying to stay afloat.  I’m totally willing to drive on the Lake Pontchartrain Causeway even though I’m pretty sure the bridge is going to collapse and I’m going to survive both the impact and the threat of drowning only to be shredded alive by a pack of ravenous alligators.  I sometimes even lean against railings on high balconies, although that just seems foolhardy when I can get the same view just standing near the edge rather than risking death-by-shoddy-railing-craftsmanship.

It’s just I feel nauseous every single time I get on a plane.


02 July 2012

The difference between Necessary and Sufficient, or, Why your emoticons should not have noses

In this crazy new cyber-world we’re living in, the entire rich array of human emotional facial expressions is being reduced to nothing more than a select few humble punctuation marks grouped together to look like caveman scratchings turned on their side.  In social media conversations, these so-called “emoticons” (also called “smilies”, for those of you not hip enough to be up on your “cyber-lingo”) have assumed the vital role normally played by our naturally expressive faces, becoming the sole representation of our emotions toward the people with whom we interact.  This is distressing in and of itself, but it’s not the point of my discussion today.

Correctly typed, the most common standard emoticons consist of virtual “eyes” and a virtual “mouth”, made using punctuation marks.  The simplest of these is the basic colon-plus-end-parenthesis – :) – though many other variations exist:  ;)  :D  :(  :’(

But a deeply bothersome trend has managed to grow and fester deep in the bowels of the emoticon world: the dash-nose.  This hideous abomination has wormed its way into all the great emoticons, a defilement I’ve never abided graciously:  :-)  ;-)  :-D  :-(  :’-(

And today, I finally figured out why that nose bothers me so much.

Humans have a very limited range of physical features they like to monitor during social interactions.  When we see another human face, we attend most to the eyes and mouth because these are the expressive features that move and tell us how we’re supposed to respond to their owner.  But a nose?  No one cares what a nose does.  A nose stays pretty much the same no matter what we’re doing, and outside of augmenting a very select few emotional expressions (e.g. the scrunch of disgust, the flaring nostrils of fuming rage), our noses are practically pointless.

Which brings me to Necessary and Sufficient.  These terms are regularly used in the sciences to describe two unique aspects of how important a certain factor is in creating a given outcome.  A factor that is necessary must be present to produce an outcome, while a factor that’s sufficient is all that’s required to produce that outcome.  So if it’s necessary, you absolutely have to have it, and if it’s sufficient then it’s all you actually need.  (And it is possible for a thing to be both necessary and sufficient – or neither.)

These are both readily testable properties.  To determine if something is necessary for a certain outcome, you just remove it and see if you obliterate the outcome.  To determine if something is sufficient to produce an outcome, you remove everything else and leave only it, and see if the outcome remains the same. 

For example, removing a necessary facial feature will prevent you from recognizing an emotional expression (like a smile), while leaving only a sufficient facial feature present will still allow you to recognize that expression.

Let me demonstrate on myself.  Say hello to me:

Howdy!

I hope you were gracious enough to at least offer a greeting.  I mean look at that big ol’ toothy grin.  That is a smile.  How could you ignore that kind of smile?  And how can you tell it’s a smile?  Well, the corners of the mouth are turned way up, the eyes are happily scrunched, and the nose… yeah, it’s not doing much. 

Now, let’s look at what happens when I take the liberty of altering each of these three facial features (mouth, eyes, and nose) independently. 

Let’s start with Necessary.  Is any of these three features necessary for you to be able to tell that I’m grinning at you?


The truth is, no.  As long as you have any combination of the other two features (eyes and nose, mouth and nose, eyes and mouth), you can tell I’m meant to be smiling at you.  That said, the third smile with both eyes and a mouth present is definitely the most informative of the three faces, in that it looks the most like it’s smiling.  This suggests that the nose is the least necessary component of the smile.

So how about Sufficient?  Would any of these features alone be enough for you to tell I’m still smiling?


Well, how about that?  My mouth and eyes are each sufficient, but my nose does absolutely nothing toward helping you figure out if I’m smiling.  In fact, if that nose picture still looks like I might be smiling at you, it’s only because I didn’t go and doctor the dimple out of that freakishly sculpted right cheek so you’re still getting the impression of a mouth-smile.

So what does this tell you about your use of :-) and :-( and ;-) ? 

It says that the only thing the nose-dash is doing is making you take longer to generate your virtual expression, and making others take longer to observe and evaluate it.  The extra dash adds nothing at all of value.  In fact, if you were to do my same necessary/sufficient experiment with an emoticon, you’d find that BOTH the eyes and mouth are necessary to convey information, but the nose is neither necessary nor sufficient for anything – see how it’s the exact same dash for every emoticon you type? 
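For the keystroke-counters, here's a minimal sketch of de-nosing in action. The stripping rule is my own hypothetical (eyes, then an optional dash, then a mouth), not any official emoticon grammar – but note that every emoticon stays perfectly distinguishable without its nose.

```python
import re

# Hypothetical nose-stripping rule for this sketch: delete a dash that
# sits between emoticon "eyes" (: or ;) and a "mouth" character.
def drop_nose(emoticon):
    return re.sub(r"(?<=[:;])-(?=[()DP])", "", emoticon)

for face in (":-)", ":-(", ";-)", ":-D"):
    print(face, "->", drop_nose(face))
```

The de-nosed versions carry all the same information in one fewer character apiece.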

The emoticon nose is, in short, a waste of a character.  This could have a profound impact on the quality of your tweets, people.  Think about that the next time you write another, “LOL :-D !!!!1!1!”

24 June 2012

I went up the mountain to kill a skunk

Tucson, Arizona is a small city nestled in a gorgeous desert ringed by mountains, remnants of an old volcano, and the city lights are stunning on stormy nights like this one.  So tonight, on my way home from visiting with friends, I decided to embrace my childhood and geological heritage and head up Catalina Highway into the mountains to marvel at the nighttime view from the Babad Do’ag lookout point up at mile marker three.

Everything was going so well at first.  I had all the windows down and the music on but (for once in my life) turned low, and there was a storm brewing to the south and I could just catch flashes of lightning off in the distance beyond the city as I navigated the winding mountain road up toward the lookout point.

I was going to stop at Babad Do’ag, like I said, except that for some reason red and blue lights were flashing as I approached and I saw a couple of cop cars stopped at the lookout, and I decided that for the sake of my own tranquility and enjoyment I would just move on and find a better spot higher up on the mountain to stop and revel in the beauty of the night.

It was dark out, obviously, and there were enough cars on the road that I didn’t have my high beams on.  So when something small and dark entered my field of vision, I barely had enough time to slam on my brakes.  Seriously, this was the hardest I’d ever put my foot down on a pedal in my life.  The smell of burning rubber wafted up into my car and my purse flew onto the floor at my side and the distant car behind me got far closer in the rearview than I would have liked, but I narrowly – narrowly – avoided hitting the skunk that then meandered out from under my bumper and happily went on its merry way into the night. 

I was a little shaken, but after I was sure the critter was well off to the side of the road I continued up the mountain.  I made it a couple more turns before I decided enough was enough and I didn’t want to risk any more heart-attack situations in the pursuit of a nice view I’d seen plenty enough already.  So at the next pull-out I turned around and headed back down the mountain.

At this point I was going five under.  I took extra precautions as I neared the area where I’d just seen the skunk, hoping to see it earlier than I did last time even though I was pretty sure it would avoid the road completely after it almost died.

But here’s the thing about skunks – from the side they’re pitch black.  And while I was driving away that bastard had turned right around just like I did, and it put itself square in the middle of my lane again on my way back down the mountain.

If I thought I slammed on my brakes hard the first time, I was mistaken.  That second time I hammered that pedal to the floor so hard I was pretty sure the car was going to snap. 

But this time I was heading downhill.  The skunk went under my bumper and I felt a little jitter even before the car stopped, and by then I just had to keep moving because it was already past the wheels.

I wasn’t totally sure if I really hit it.  The skunk was in the exact center of the lane so maybe the car just went over it, maybe the jitter I felt wasn’t real, I didn’t know.  I turned the car around again and headed back up the mountain to check because if the skunk was injured I was damn well going to take it to a vet.

But alas, when I got there the poor skunk was lying slumped in the road, and as I slowed to examine it there was no movement, nothing.  Two other cars had come down the mountain while I turned to head back for my skunk, so it might have been one of them that did it, but I’m pretty damn sure I was the one that really killed it.

I didn’t end up stopping at any lookout points.  The cops were still at my favorite spot as I passed them for a fourth time on the way back down the mountain, and I thought to stop and tell them about my poor little skunk, but they looked quite busy with whatever delinquent they’d cornered up there in the parking lot so I let it go.

So really, in the end, I went up the mountain tonight to kill a skunk.  That was pretty much the sole existential purpose of my well-intentioned detour this evening.  I think I’m going to probably go cry a little and sleep it off and try to convince myself it wasn’t my fault.

06 May 2012

URGENT! How to help a stroke victim

Watch me use this blog for some good!

This is easily the most important thing you will learn today – unless you learn how to give CPR or how to solve world hunger or something.  I’m going to tell you how to detect a stroke, and how one type of stroke can be treated completely if it’s caught in time.  You can easily save someone’s life with this simple information.

Strokes are caused by a loss of oxygenated blood to parts of the brain, and they can kill you or cause serious lifelong debilitation.  They can be caused by head injuries and the like, or spring up out of nowhere.  Even young people in their twenties can have sudden strokes, so don’t just think it’s an old-people thing.

So first, how can you be sure someone’s had a stroke?  Well, a favorite mnemonic is the first three letters of STROKE:

Smile.  Have the person try to smile at you – check to make sure their face is symmetric and that the smile is natural. 
Talk.  Have the person say a full sentence or two – make sure they are coherent. 
Raise both arms.  Have the person lift both arms above their head.

If a person has problems with ANY ONE of these, get them to a hospital like right freaking now.  If they pass the test, keep checking over the next few hours to make sure things haven’t changed.  Strokes are messy and things will often change.  The moment they do, get that person to a hospital like right freaking now.  (Your time window is a few hours wide at best.)

DO NOT EVER FEEL STUPID OR OVERCONCERNED FOR CARING ENOUGH ABOUT SOMEONE TO TAKE THEM TO THE HOSPITAL.  BETTER SAFE THAN SORRY.

Before I explain the most critical reason why you take them to that hospital (like right freaking now), you have to know there are two major types of stroke: hemorrhagic and ischemic.  Hemorrhagic strokes involve a hemorrhage – a burst blood vessel or the like that causes a bleed in the brain (nasty falls often cause hemorrhagic strokes).  Ischemic strokes are caused by a blockage preventing the movement of blood – for example a blood clot that blocks an artery.  (And don’t worry – you shouldn’t have to know these words when you walk into the hospital, they should know them for you.)

If you get your stroke victim to the hospital and tell the staff you think they have had a STROKE (note the emphasis on both of those critical items), the very first thing the hospital ought to be doing is getting an MRI or a CT scan of that person’s brain.  If the hospital doesn’t order a brain image for your stroke victim, INSIST that they do it IMMEDIATELY.

Here’s why: You can tell the difference between a hemorrhagic and an ischemic stroke using these imaging methods.  Ischemic strokes (caused by clots) can be treated with blood thinners, while giving blood thinners for hemorrhagic strokes (caused by bleeds) will kill people.  Hemorrhagic strokes can be dealt with surgically.

So for example, if the stroke is ischemic and you’ve caught it within 2-3 hours, the hospital can administer blood-thinning drugs...

...And that person can walk away from their stroke with no problems whatsoever.  Zero.

That, my friends, is a miracle.  TELL THIS TO EVERYONE YOU KNOW AND SAVE LIVES.

30 April 2012

Another Five Abominable Brain Myths

The day after I posted my first list of five brain misunderstandings that make me cringe, I remembered another cringe-worthy myth and have therefore been forced to compile an additional list.  For the record, #1 on this list really should have been #2 on the Top Five…


#5. Drugs will put holes in your brain.

Sure, hardcore drugs can kill you, but they don’t do it by putting holes in your brain.  I’m pretty sure D.A.R.E. is primarily responsible for spreading this myth, at least according to a few of my fellow nineties kids. 

First, let’s tackle the bit about what a “hole” is.  Generally speaking, the fastest (and only) way to get an actual hole into your brain is with a bullet or a tamping iron or something (see, for instance, Phineas Gage).  Otherwise, what one might call a “hole” is most often really a region of damaged brain which, while damaged and nonfunctional, is still packed with all sorts of fluids and tissue and whatnot. 

Your brain is a very densely-packed organ full of cells, bathed in a fluid called cerebrospinal fluid, encased in a skull.  If you endure damage to the brain that doesn’t open up the skull in the process, then the damaged brain regions are still going to be filled with all that kind of stuff.  If you happen to see an MRI of such a lesion (say, from a stroke or a tumor or something) it might look like a dark space in an otherwise bright brain, but all that means is that the water moving around in that region is not neatly organized like it is in the rest of the brain.

So keeping that caveat in mind, not too many drugs even damage your brain in a way that would create proper lesions like the kind described above.  In fact, I couldn’t come up with one.  Let me give you a list of popular drugs that certainly don’t put holes in your brain even when abused: heroin, cocaine, meth, pot, alcohol, ecstasy, LSD, prescription pills, mescaline, bath salts, roofies… I think I’m starting to stretch it here.  Some of these will mess you up all sorts, but unless they give you a stroke they’re not causing a brain lesion that would look like a hole to anyone.

Still, kids, just say no to drugs.


#4. Your brain is some other-entity that is separate from you.

I’m going to admit I managed to confuse myself no end trying to figure out what I want to say here, so bear with me.

I’m a hundred percent guilty of this idea of an other-entity brain.  I do it all the time in these very posts, suggesting that your brain is something different from you that does its own thing and occasionally gets up to mischief.  It’s called anthropomorphizing, assigning human characteristics to things that aren’t human. 

Anthropomorphism is an easy white lie that allows us to speak simply about complicated processes.  Your DNA wants to replicate itself, your brain decides things, and so forth.  Assigning wants and needs and decisions to probabilistic biological processes is a completely inaccurate representation of what’s really happening.  And I don’t have a problem with it, so long as it’s recognized as a rhetorical device that simplifies a conversation.  But it’s a problem if you see it as the whole picture.

Anthropomorphizing the brain is especially easy, because while a brain is simultaneously just an organ, a big glob of mush inside a person’s head, it is also the very essence of what makes that person that person.  I want to avoid getting into arguments about a soul and whatnot – that’s not what I mean.  I mean that it is easy to think of your brain as “you”, and also easy to see it as only a “thing”, and that makes talking about it difficult.  And complicating this matter further is that pesky word “mind”, which is somehow different still from a “brain” and falls somewhere else on this “you” versus “thing” spectrum.

The concept of “self” would take me all year and a few hundred pages to tap into, so all I’m going to say is that trying to determine who “you” are is a real bitch no matter how you cut it, and your “self” is an ever-changing, many-headed beast that is exceedingly difficult – if not impossible – to define.  And the brain is a vital facet of it.  A brain is nothing more than a bundle of neurons and synapses and electrical impulses, and it is also the material substrate of the emergent Self.  Just like electrons with their particle and wave properties, both the mundane biological Brain and the lofty cognitive Mind have to be thought of as two aspects of a singular whole, not discrete entities.  It’s an ugly and difficult undertaking.

As humans we need to have agents separate from ourselves to explain certain of our actions, like addictions (“I try so hard to stop but my brain just won’t let me”).  Your brain has to be that cognizant little scapegoat and it has to be something separate from you.  And it really feels that way, too.  You feel like there’s some ugly little demon sitting inside your head telling you to do bad things.  It’s that fabled devil on your shoulder exactly.  You can have that cognitive dissonance and I wouldn’t dream of taking it from you.

Here’s the thing.  Your brain isn’t separate from your mind or yourself – it’s all one big package.  At the same time, your brain is not a conscious entity.  Believing that your brain wants or needs or decides, that’s incorrect.  Your brain is a well-organized chemical soup that operates according to certain biological principles, and from that soup your glorious conscious Self emerges.  So you can trust that when I say your brain wants something, I’m only doing it to simplify a point.


#3. The brain is a muscle.

I don’t know whether people saying this mean “the brain is literally a muscle” or “the brain is like a muscle in that the harder you work it, the bigger/better it gets,” but I’m going to shoot down the entirety of the former and a major assumption of the latter.

First off – and I think this one is obvious – the brain is not literally an actual muscle.  Not in any way.  They’re made of wildly different tissue types and everything.

The second statement, that the brain is like a muscle, contains a critical caveat.  In some ways, the brain is like a muscle, in that you have to use it to keep it strong.  But no matter how hard people try to sell you their guaranteed fitness regimen to get you ready for the Brain Olympics or whatever, the unfortunate fact is that the vast majority of it is total bullshit designed to take your money.

I want to be careful what I say because it’s very important not to throw the baby out with the bath water, and I don’t want you to give up on those crosswords just yet.

Here are some things we know about the benefits of mental exercise.  People with higher education tend to fare better cognitively as they age.  So do people with mentally-challenging occupations.  This could be because those things help people get stronger brains, or it could be because people with strong brains tend to get higher education and mentally-challenging occupations.  Also, training people how to do various mental tricks can have long-lasting beneficial results (that’s the whole definition of learning, right?).

Here are some things we know about the limitations of mental exercise.  Generally speaking, a lot of the things we do to hone our mental abilities – crosswords, puzzles, list-learning, et cetera – just make us very good at tasks like crosswords, puzzles, list-learning, et cetera.  In other words, many mental exercises don’t generalize very well across entire cognitive domains like memory or processing speed. The idea of performing a set of discrete tasks to give yourself a better memory in particular is preposterous, so don’t believe any $50 computer program that promises to train you into having a better memory.  It simply doesn’t work that way.  But that’s not to say exercising your brain is hopeless.

If you want to make your brain into a lean, mean, information-crunching machine, then you need to engage it in a variety of novel tasks.  Go ahead and do that crossword – but also learn a new musical instrument and join a chess club and take up painting and go ride a bike.  The most important thing you should do to exercise your brain is to regularly teach yourself NEW things.  Don’t just get good at the same old things.  Get outside your comfort zone.  I mean that.  Get outside your comfort zone.  NEW and DIFFICULT things help your brain improve.

And keep in mind – none of this is going to make it any easier to remember that new acquaintance’s name at a cocktail party.  If you want to do that, go look up mnemonic tricks for how to remember people’s names and be done with it.


#2. Internet IQ tests give an accurate representation of one’s true IQ.

I’m going to show you a bell curve:

The middle line represents the average (µ) – in this case, the average IQ score.  Each line on either side of that represents one standard deviation (σ) from that average, and is always a set number of points.  The colors represent what percent of the population falls between each of those numbers. 

The standard Intelligence Quotient test (IQ test) is designed to fit on such a bell curve.  Researchers have tested thousands of people and then normalized the scores so that the average score (µ) is 100, and each standard deviation (σ) is 15 points away from that.  This means that 68.2% of the population has an IQ from 85–115, 95.4% of the population has an IQ from 70–130, and 99.7% of the population has an IQ from 55–145. 

To rework that just a little, 99.85% of the population has an IQ less than 145.  Keep that fact in mind.
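If you want to check these percentages yourself, the normal CDF is all you need – here's a quick stdlib-only Python sketch (the function name is mine, and the Normal(100, 15) model is exactly the normalization described above).

```python
from math import erf, sqrt

def iq_percentile(score, mu=100.0, sigma=15.0):
    """Fraction of the population scoring below `score`, assuming IQ ~ Normal(100, 15)."""
    z = (score - mu) / sigma
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF via erf

print(round(iq_percentile(115) - iq_percentile(85), 3))  # share within 1 sigma, ~0.683
print(round(iq_percentile(145), 4))                      # share below 145, ~0.9987
print(round(1 / (1 - iq_percentile(155))))               # roughly 1-in-N people above 155
```

That last line is where the "1 out of every 8,000" figure comes from.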

I’m going to use myself as an example here.  I have an above-average IQ.  I don’t know exactly what it is.  The last time my IQ was tested I was five years old, and IQ tests of five-year-olds are notoriously unreliable because scores vary a lot depending on the environment and the kid’s energy level.  Before I could get my IQ tested as an adult, I learned how to administer an IQ test, and now I’m ruined for IQ tests forever because they depend on me not knowing all the answers and tricks.  Nevertheless, based on other standardized tests I’m confident that I have a moderately above-average IQ.

Every single time I’ve taken an IQ test online (even long before I learned how to give the test), I’ve gotten a score anywhere from 140-170. 

That’s impossible.  For me, I mean.  That score would mean I’m smarter than 99.6% – 99.9998% of the American population.  Guys… I’m pretty smart, but I’m not that freaking smart. 

Put another way, an IQ over 155 (the middle of my internet scores) occurs in about 1 out of every 8,000 people.  There are fewer than forty thousand people with an IQ over 155 in the entire United States.  Do I really think I’m as good as the top 40,000 in the entire country?  Definitely not.  (And by the way, if I really had an IQ of 170 I’d be in the top 500, which is just laughable – hell, the test frankly starts to break down as a good measure once the numbers get that high.)

I’ve tested some people with IQs this high, and they are incredibly brilliant.  Incredibly brilliant.  I’d kill to be that smart.  I’ve also tested people with IQs around 80, and they’re also pretty darn smart.  I guess what I’m saying is that if you’ve got an IQ above 100, you should be very proud of yourself.  You’re smarter than half your country.  If you have an IQ above 115, congratulations!  You’re in the top 16% and that is very impressive indeed.

But don’t ever trust a free internet IQ test.  These tests aren’t structured like real IQ tests, they don’t probe even a fraction of the cognitive abilities a real IQ test does, they haven’t been correctly tested against thousands of people to get proper averages and standard deviations, and many are designed to inflate your ego because they want you to come back and click some more.  (I don’t want to say they’re always higher than your real IQ, by the way – they’re just not trustworthy and accurate.)

It’s also worth noting that even a standard IQ test probes a variety of cognitive abilities which in many ways have nothing to do with how well you function in society or as a human being.  IQ tests don’t tell how engaging or charismatic you are, they can’t say if you can run a company, they don’t test your ingenuity or your perseverance or your ability to empathize with your fellow man.  Those traits are all at least as important as your “intelligence”.  Some of the most amazing people I've ever met had an IQ less than 70.  So who really cares how many points you can rack up with the click of a mouse?


#1. Some people are “left-brained” and some people are “right-brained”.

How can I put this simply?  THERE IS NO SUCH THING AS LEFT-BRAINED AND RIGHT-BRAINED. 

Oh sure, I know what people mean when they say that.  They mean that some people (“right-brained” people) have brains that make them very creative and intuitive and free-thinking and whatnot – while other people (“left-brained” people) have brains that are more analytical and logical and objective.  Generally the people saying this are the ones who happily label themselves “right-brained” and despise “left-brained” people for being stiffs.

Before I explain the problem I’m going to repeat myself, because you can’t lose sight of this: THERE IS NO SUCH THING AS LEFT-BRAINED AND RIGHT-BRAINED.

This ill-conceived notion came about in the wake of observations that certain brain functions tend to be lateralized to either the left or the right brain hemisphere.  For instance, your right brain hemisphere receives sensory input from and delivers motor commands to the left side of your body, while your left hemisphere controls the right side of your body.

Also, in most right-handed people, language is supported predominantly by left brain regions.  For lefties like me, on the other hand (pun not intended), language is more often spread across both brain hemispheres or dominant on the right – which makes sense, since we write out our language using our left hands and the left hand is controlled by the right hemisphere.

That said, the vast majority of brain functions don’t fall cleanly into one brain hemisphere or the other.  In fact, even for the ones in which one side of the brain does most of the work under normal operating conditions, if that side gets damaged the other side can usually pick up the slack.  (Practically the only cases in which this doesn’t occur very smoothly are in the above-mentioned sensation, motor control, and language production functions).  It is likely that each of the two hemispheres is processing somewhat unique aspects of the same information, because there are two of them and what’s the point of doing the exact same thing twice when you could be getting more out of what you’ve got?  (Ah, anthropomorphizing is so easy!)

The “X-brained” problem really started when poorly-informed people began making up a whole bunch of junk about all sorts of brain functions supposedly being fully lateralized when really they aren’t.   Then another layer was added when people started pigeonholing these constellations of “lateralized” functions into personality types, even though those “types” are inconsistently described and don’t match known lateralization patterns and have never been substantiated by any actual science (quite the opposite, in fact).  Finally and without any supporting evidence, people started arguing that they were “right-brained” or “left-brained” because they never understood math or because they were artistic geniuses or because they were ridiculously super-geeky, and all because we needed yet another label to attach to ourselves.

Whether we realize it or not, we seem to like calling ourselves names.  “Right–” and “left-brained” are particularly ugly to me, because self-labeling in this way is a blatant unwarranted dismissal of half of one’s potential merits.  It’s like saying “I’m just not good at math.”  I hate that phrase.  I’m not saying it’s not sometimes true.  I’m saying that by stating it, you are giving yourself a bye on having to apply any mathematical effort.  If you call yourself “left-brained”, you’re giving yourself an undeserved escape route off the creative path.  Math is not easy.  Creativity is not easy.  Saying you’ve got a certain type of brain or that you’re just not good at something is fabricating an untrue “unconquerable” biological obstacle that you don’t have a right to give yourself.

So what if you’re not good at math?  All that means is you’ve got to work harder, not that you get to quit.  If you’re not good at sports, or you’ve never been artistic, or you just can’t dance, work harder and actually test the bounds of what you can and can’t do.  There’s a difference between recognizing your limitations and giving up before you’ve even started, and “I’m not very good at X” and “I’ve always been X-brained” are both just ways of artificially limiting yourself.

In short, both “right-brained” and “left-brained” are nothing but meaningless self-deprecating insults, and now you should know better.

21 April 2012

The Top Five Brain Misunderstandings That Will Drive a Neuroscientist to the Brink

I was reminded by a comment on my last post that there are a lot of common misunderstandings out there about the brain.  So here they are, the top five TOTALLY RIDICULOUS things said about the brain that make me want to fly into a homicidal rage when I see them being propagated in popular media:


#5. To grow old is to grow senile.

This one is kind of personal because my research is on human aging.  My grand design when I was fifteen was to cure Alzheimer’s disease, so I’ve been cogitating on it for a while.  The truth is that there is a normal, healthy aging process, and there are a host of separate pathological aging processes that unfortunately tend to get lumped in with healthy aging as What Happens To You When You Get Old. 

Diseases like Alzheimer’s and Parkinson’s aren’t normal.  Dementia is not – I repeat NOT – something that happens to everyone.

This isn’t to say that healthy aging doesn’t involve some declines in certain cognitive functions.  First and foremost, the speed of information processing decreases (meaning older adults are simply slower to do things than they used to be, and we young things must cultivate some patience).  Older adults have also been reported to show declines in focused attention, meaning they can be more easily distracted.  Finally, older adults tend to show moderate declines in some aspects of memory – for example in remembering certain names, or where the keys have run off to, or coming up with the right word in a sentence.  Healthy aging is NOT, however, associated with the profound impairments of memory seen in diseases like Alzheimer’s, which at its worst robs people of the ability even to remember who they are.  See the difference?

Other cognitive abilities actually improve with age.  Vocabulary and other crystallized knowledge (i.e. knowledge of facts) increase throughout the lifespan.  Empathy and the ability to reason emotionally and socially also come far more easily to older adults than to young adults.

We are a society that fears aging and death, so much so that even words like “old” and “elderly” have acquired a negative connotation.  And misunderstandings like #5 here serve only to help propagate this fearful, antagonistic sentiment toward older people.  News flash – unless you plan to jump off a bridge at age forty, you’re probably going to reach old age someday.  It’s a stage of your life, just like childhood or adolescence or middle age.  And, according to a lot of the people I’ve interviewed at my job, it can be totally awesome if you let it.


#4. You are born with all the brain cells you will ever have.

First off, I just want to tackle this notion of being born with all of anything.  In and of itself, that idea is kind of silly, because I think we all recognize that the version of you that existed a day before the miraculous moment you were born is just about the same as the version of you that existed the day after – minus that whole breathing and eating through your mouth instead of your bellybutton thing.  It’s not like your fetal body is busily building and building right up until you pop out of your mother’s vagina and then all of a sudden you’re in decay mode, you know?

So with that aside, I’m here to tell you that not only are you not done making neurons when you’re born – you’re not done when you hit adolescence, or adulthood, or even that dreaded old age.  You are probably done when you’re dead (but then again, that’s leaving aside the whole philosophical argument about exactly when that other mysterious life-capping moment actually happens and whether your cells still do their thing for a few hours after you’ve reportedly kicked the bucket).

The word we’re looking to investigate here is neurogenesis, or the creation of new neurons.  For a long time we couldn’t find any brain regions that continued to make new neurons on into adulthood, but this was mostly a problem of detection – we didn’t have the right tools to find neurogenesis, so of course we assumed it never happened.  Now that we can properly search for it, neurogenesis is cropping up everywhere – in our friend the hippocampus, the cerebellum, any number of cortical regions – and it appears to be a very important part of continued brain maintenance and function.  (Actually, there’s a really cool story here about hippocampal neurogenesis that was still sort of hand-wave-y last time I checked, but even my two fellow brain researchers fell asleep during the talk we went to about it, so I’ve elected not to share it with you fine people.  Count your blessings.)

Here are some fun facts about your brain growth after birth.  Your head is disproportionately big when you’re born, but it’s not as big as it’s going to be when you grow up (otherwise your mum would be complaining a hell of a lot more than she already is).  That increase in brain size from birth to adulthood comes with a slight increase in the number of neurons you have, but more importantly a large increase in the number of connections they make.  In childhood your neurons shoot out all sorts of projections all over the place, and then during adolescence your brain goes through this massive pruning binge to take out the connections that aren’t doing you any good.  Throughout your teenage years your neurons continue to gain efficiency as they get wrapped in layers of cell membrane called myelin, which is required to allow neurons to effectively propagate their signals.  Therefore, your brain isn’t even fully developed until your early twenties.  You heard me right – your early twenties.  And boys, your brains take a couple years longer to get there than girls’ do.  Explains a lot, doesn’t it?

You can tell this is an old chart because the “cell birth” line (among others) doesn’t keep going out through adulthood.  Also, note that it shows we’ve known for a long time that myelination and synaptic elimination (pruning) continue into adulthood.

The upshot is that at this very moment, your brain is still making brand-new neurons.  It’s a very modest amount compared with the total number of neurons in your whole brain, true, but new neurons are still being born.  It’s happening right now.  And it’s going to keep on doing that for a very long time, hopefully.


#3. Smoking copious amounts of pot has no lasting detrimental effect on your brain.

Seriously?  Are you high?  First off, if you’ve ever met a chronic pot smoker, you already know this isn’t true.  You know it.  I shouldn’t even have to back this up with studies, but I’m going to anyway so we never have to have this argument again.

Here’s a very select smattering of the results from papers published since 2010 alone: (A) Chronic pot users perform more poorly on measures of executive function (that means planning, reasoning and decision-making) than non-users, and this effect is worse for those who started before age 16; (B) rats given cannabinoids in adolescence showed reversible impairments on many cognitive tasks but irreversible deficits in short-term memory measures; (C) chronic pot use has been associated with reduced hippocampal volume (although, to be fair, having skimmed that paper I have some concerns about the methods – for a better review of the effects of pot on brain metabolism and structure I encourage you to look at this paper); (D) and luckily, treatment for pot addiction with gabapentin significantly reduced pot use and increased performance on cognitive tests.  Hell, if you want a recent review of cognitive deficits associated with pot use, just read this paper.

People make similar arguments about alcohol, and I’m here to say that large enough quantities of alcohol will totally mess with your head over time.  (This one I have some personal experience in!  I swear I used to be quicker than I am now, and I’m blaming vodka-tonics.)  I won’t take you through a lit review here; I’ll just leave it at Korsakoff’s syndrome.  If you think that’s unfair, fine, I’ll tell you all about run-of-the-mill alcoholics’ impairments in perceptual-motor skills, visual-spatial function, learning and memory, abstract reasoning, and problem solving.  Don’t make me go there.

Please don’t try and justify your habits by saying they don’t affect you.  It’s a no-brainer (pardon the expression) that excessive amounts of intentionally mind-altering substances will, over time, affect the brain and its function.  I’m sorry, but that’s just the way it is.  Just embrace the fact that you’re willingly damaging yourself and move on.


#2. Pretty much anything movies ever say about brain disorders and treatments.

As a writer, I struggle with convincingly portraying anything about which I know absolutely nothing.  I understand the plight.  But let me give you three of my favorite examples that will let you know just how disastrously misguided most movies and shows (and news outlets) are about the brain:

(A) One episode of Boston Legal starts with William Shatner’s character animatedly yammering away in an MRI machine spouting off baseball statistics for a pair of scientists who are quizzing him while watching “brain activation” blobs flit across a still image of his brain.  The scene cuts to a doctor’s office where the neurologist (?) informs him he has Alzheimer’s disease.

PROBLEMS (well, some of them): You cannot move more than a few millimeters in an MRI scanner or the image will be totally messed up; no one watches brain activation patterns in real time because (a) it’s not feasible and (b) some guy eyeballing a bunch of blobs is nowhere near as valid as running actual statistics on the data; THIS IS NOT THE WAY YOU DIAGNOSE ALZHEIMER’S, and it’s a disgrace to imply that it is; using MRI to detect Alzheimer’s is only now becoming a realistic possibility, it hasn’t reached clinicians yet, and the focus there is on brain structure rather than function; semantic information (like baseball statistics) is one of the few things that’s preserved in the early-to-moderate stages of Alzheimer’s; and you would never flatly tell someone, “the tests show you have Alzheimer’s disease,” because Alzheimer’s can’t actually be confirmed until after death (while you’re still alive you have probable Alzheimer’s), and a good doctor would break that news more gently, with a caregiver present to help the patient absorb it.

(B) In my favorite example from House, House has been in a bad situation he can’t remember because he was blitzed out on substances at the time.  He gets his buddy to perform deep brain stimulation to jog his memory.  The first jolt allows him to see a fuzzy, silent, black-and-white still image or two, so he says to crank up the juice – and then suddenly he can see the full memory playing like it’s a proper modern movie.

PROBLEMS: I don’t even know where to start.  House always pulls out a single term like deep brain stimulation (DBS) and makes a total mockery of it, because apparently not a single writer on that show has ever taken a class in medical school.  So, to summarize a bare minimum of points: first off, your brain simply does not behave this way when you electrify it.  When electrically stimulated, you will probably perceive some anomalous sensations, but you will not replay the one memory you’re looking for, and it will not be a 1920s silent film at low voltage and a modern movie at high voltage.  Also, memories never play back perfectly like that anyway, and you all know it.  Finally, DBS would never be used for such a purpose – mostly it’s an extreme therapeutic option for people with Parkinson’s disease, severe depression, and certain other disorders, and no one has a good handle on why it works.

(C) Any movie in which “getting amnesia” means “losing all knowledge of yourself and sense of who you are.”

PROBLEMS: This phenomenon of losing yourself is real, and is called a dissociative fugue.  It is psychological in nature and usually occurs in response to a deeply traumatic situation (not a blow to the head).  It’s also transient, lasting a few days or weeks at most.  Amnesia is a totally different thing, best characterized by the movie Memento.  It can be caused by head injury, surgical resection of temporal lobe areas, oxygen starvation, and a number of other insults.  When a neuroscientist (or even Wikipedia) talks about amnesia, by default they mean anterograde (forward-looking) amnesia, in which patients cannot form new memories (remember HM…?).  These patients often, but not always, also show retrograde (backward-looking) amnesia, meaning that they cannot remember past events.  This kind of amnesia is not transient.  These patients will live out their whole lives in never-ending cycles of few-minute increments.  Also, they will never forget facts about themselves like their name and birth date and where they grew up, because that’s a different type of memory, one amnesia leaves untouched.  This whole movie problem is really one of semantics – how freaking hard would it have been for that first major movie exec to do a little research and say, “Hey, this isn’t amnesia, it’s a fugue – let’s say this guy has a fugue instead”?  They could have avoided decades of ridiculous misunderstanding!

And let me just make this clear: You will never – under any circumstances – get hit on the head, lose all memory of yourself and your past, and then regain it miraculously a few weeks later.  Never.

(Update: Dammit, okay, I thought of a circumstance.  If the blow to the head is part of a deeply traumatic situation, you could enter a fugue state - but that's not due to the head bump! My point stands.)


#1. You only use 10% of your brain – imagine how much smarter you would be if you used 100%!

You know what happens when you use 100% of your brain?  EPILEPSY.  Freaking epilepsy.  That’s basically the definition of a seizure.  If you want to have a grand mal seizure, go right ahead and “use 100% of your brain” all at once.

This “10%” notion gained credence because when we first started imaging human brain activity (using methods like functional MRI), researchers found that the little colored blobs lighting up during any given task covered only a small portion of the total brain.  Why?  Because different parts of our brains do different things.  During verbal tasks, language areas light up; during reasoning tasks, reasoning areas light up… you get my point.

Suggesting that we should find a way to use 100% of our brains is like suggesting we should use 100% of our cars when we drive them.  It would obviously be totally efficient if your car drove forward and backward at the same time in all gears while also honking the horn and wiping the windshield and flashing the lights and blaring every possible radio station and opening and shutting the doors.  That’s clearly the best way to get to your local grocery store.  

Our brains are, in some ways, no more than a conglomeration of specialized little parts which each do their own mental tasks.  At any one time, you “use” about 10% of your brain to—

No, you know what?  I can’t even say that oversimplification in good conscience.  This whole “10%” thing is 100% bullshit.  What does “using 10% of your brain” even mean?  I dare anyone to quantify with our current methods (A) how much of our (B) brains we (C) use at any one moment.  All three of those points need a better definition before you even start down that road, and anyone who takes the first step on it is asking exactly the wrong question based on a host of incorrect assumptions – for one, that the brain is a singular computational entity operating at X capacity like a damn CPU.  The idea is utter crap on every possible level.

Actually, that CPU bit brings me to an important historical point – people have always likened the brain to whatever technology is hottest at the moment.  These days, we equate it part and parcel with a computer.


It’s such a handy metaphor, right?  Like they were made for each other.  You know what people said your brain was like in the old days?  A switchboard.

Extra points to the kids who know what a switchboard is.

We don’t have a good sense for what the brain is really doing, so we associate it with metaphors and then generate silly notions based on what we know about those metaphorical devices.  The point is this: Our brains are well optimized just the way they are, using different bits for different jobs.  Just know that you do NOT want all your neurons to fire all at once.  You will immediately die.