Tuesday, October 25, 2011

Surviving Fright Night

  Who's the most likely to get hit by a car?

Check your children's candy before they eat it.  Can they see through their eye holes?  Better yet, don't let them wear a mask at all.  Make sure that costume isn't flammable.  Now, make sure it isn't inflammable.

We get the same advice from so-called experts every year on Halloween safety.  How much of it, though, is evidence-based?  Let's run through the potential dangers of Fright Night, and see which ones you should really be worried about:

Tampered treats.  FALSE (mostly).  I think this one has been thoroughly debunked, but for those of you who have never wasted an afternoon on Snopes: A criminal justice professor concluded in a 2008 review of Halloween sadism that no child has ever been killed or seriously injured by a stranger's contaminated trick-or-treat sweet.  The one child who did die of Halloween candy poisoning was poisoned by his own father.

That’s not to say that sharp objects haven’t found their way into candy bars and apples.  There is exactly one published case report of an adult whose stomach was perforated by a needle thought to be hidden in a Halloween caramel apple.  Most tamperings end up being hoaxes, though, perpetrated by kids who want to freak out their parents.  Some hospitals go so far as to offer free x-rays of Halloween treats, but two published studies discovered no cases of tampered treats in over a thousand bags of candy.  Alarmingly, the authors of one study hid a needle in an apple as a quality control measure, and one out of the five hospitals tested missed the needle.
 
Drunken teenagers in costume.  TRUE.  I always feel like I’m being shaken down when surly Goth teenagers appear at my door on Halloween night, thrusting empty pillowcases into my face.  Assuming the Goth get-up isn’t a costume, I should really be thankful that they’re not wearing disguises.  One survey of Halloween behavior in college students found that wearing a costume is significantly associated with alcohol use.  Not only that, but Halloween is one of the hardest-drinking times of the year for college freshmen, outstripping even spring break.
 
Sex offenders lying in wait.  FALSE.  Many states have laws against registered sex offenders passing out treats on Halloween night.  There’s no evidence, though, that molesters use this holiday as an opportunity to prey on kids.  One analysis of over 67,000 nonfamilial child sex crimes found that there was no increase on Halloween, even before the advent of these restrictive policies. These sickos are just as likely to strike on Arbor Day as they are on Halloween.

Getting hit by a car.  TRUE.  The Centers for Disease Control reported in 1997 that pedestrian deaths quadruple on Halloween night.  It’s a good idea to have your kid carry a flashlight or glow stick.  Just beware of….

Glow stick injuries.  TRUE (but minor).  My husband experienced this firsthand when my daughter’s glow stick exploded in his face.  He ran screaming to the sink to rinse out his eyes, which were red and painful for the next eight hours. Witnessing their dad’s chemical burn didn’t traumatize my kids in the least.  On the contrary, they were mesmerized by the glowing splatter on the rug, which resembled an alien crime scene.

The number of glow product exposures reported to poison control centers has been increasing over the years, and the largest spike always occurs around Halloween.  Fortunately, no one has ever been seriously injured, including the twelve misguided individuals who swallowed intact glow sticks.

Sporotrichosis from hay bales.  TRUE.  Sporotric--what?  It’s a rare but ugly fungal skin infection transmitted from contaminated plant material. 

Source: Dermatlas.org
One outbreak of sporotrichosis was traced to hay bales from a Halloween haunted house.  And you thought it was just the scratchy hay from the wagon ride making your butt itch!

Pumpkin carving injuries.  TRUE -- unless you use Pumpkin Masters™ tools.  While there aren’t any epidemiologic reports related to these injuries, there was a controlled study of kitchen knives vs. specially designed pumpkin carving tools, performed on cadavers who raised their hands to volunteer:

Kitchen knives caused tendon lacerations in 4 out of the 6 fingers tested, while Pumpkin Masters™ caused none.  (Pumpkin Kutter™ severed one finger tendon, much to the company’s dismay, as it had donated its tools for the study.)

Pumpkin seed bezoars.  TRUE.  A bezoar is a collection of ingested, undigested material that causes gastrointestinal obstruction.  Bezoars are most commonly made of hair (usually the patient’s own), but come October, these poor disturbed souls switch to pumpkin seeds.  There are multiple case reports of unshelled seeds getting stuck in traffic somewhere along the GI tract, including the rectum.*  I'll spare you the photo, but for those of you who get a kick out of that sort of thing, you can purchase the article itself.

The undead.  TRUE.  You think I’m kidding?  Then why would the CDC post guidelines on how to survive a zombie attack? Some of their tips include, “Make a list of local contacts like the police, fire department and local zombie response team.”  They also recommend having a first aid kit on hand, though they concede that “you’re a goner if a zombie bites you.”  Reassuringly, the CDC has a plan to investigate and contain any outbreak of what they term “Ataxic Neurodegenerative Satiety Deficiency Disorder.”


Now if only they could teach us how to handle drunken, costumed teenagers.

Stay safe, everyone.

*There are also case reports of rectal bezoars due to watermelon seeds, sunflower seeds, popcorn and prickly pear cactus.  No glow stick bezoars, thankfully.

Friday, October 21, 2011

Mattel, Inkorporated

Parents are up in arms over the latest "Gold Label Collector" Barbie, who sports permanent tattoos and leopard-skin leggings, a la Peg Bundy:


She finally did it.  Barbie has out-skanked the Bratz dolls. 

Barbie has dabbled in body art before, but the tattoos packaged with the 2009 Totally Stylin' model were just temporary heart and rainbow designs.  Tattoos are becoming increasingly mainstream, with about 13% of adolescents getting inked.  It's a perennial rite of passage for teens to adopt outrageous trends just to piss off their parents.  The obvious difference, though, between a tattoo and a midriff-baring top is that one is permanent and the other isn't.  (Except in cyberspace.  Aren't you glad the Internet didn't exist back when you had a mullet?)  Sure, your kid could have the tattoo removed in the future, but removal is expensive and not always successful.  The health risks of tattoos, particularly hepatitis B and C, are generally well known.  Some parents, though, feel that if they let their teenager go to a reputable parlor (an oxymoron, if there ever was one), what harm could there be?

Plenty, it turns out.  Several studies have shown a strong correlation between adolescent tattoos and high-risk behaviors.  The largest was a survey of over 6,000 adolescents.  At baseline, tattooed kids were more likely to live in a single-parent household with lower levels of parental education and income.  After controlling for these risk factors, tattooed kids still had higher levels of substance use, violent behavior, early sexual involvement and school truancy and failure.  Another survey found not only higher rates of drug use, violence and sexual activity in tattooed adolescents but also increased risks of disordered eating and suicidal behavior.

Wait a minute, you say, having a tattoo is simply associated with these high-risk behaviors -- surely getting a tattoo doesn't cause them.  That may be true, but there are multiple studies showing that having a visible tattoo affects others' perceptions in a negative way.  In one study, 286 people were asked to describe the personality characteristics of virtual avatars who were identical, except for the presence or absence of tattoos.  Tattooed avatars, especially the female ones, were described as having significantly more sexual partners.  Another study asked college students, a third of whom had tattoos, to describe the personality characteristics of a photographed female model with a large dragon tattoo, compared to the same model with the tattoo Photoshopped out.  The students described the tattooed model as being less intelligent, caring, attractive, fashionable and athletic.  (On the flip side, she was deemed to be "more creative.")  Interestingly, the results were much less impressive when the experiment was repeated with a model who had a small dolphin tattoo.

I've never wanted a tattoo, even as a teenager.  But when I was a college freshman, I had my ears double-pierced.  My conservative, first-generation immigrant parents made me remove the posts so the holes would close up, since "only bad girls" would mutilate their ears like that.  When I tearfully tried to explain that the piercing would in no way make me misbehave, they said, "No, but everyone will think you're a bad girl, and you'll attract only the bad boys" -- precisely the same argument I'm now making about tattoos.  Double earlobe piercings are so mainstream now, even passé, that I don't think their argument holds water any more.  Maybe the same will be true of tattoos in the next 20 years.  But you can rest assured that my kids won't be getting inked as long as they're under my roof.

Unless they want to write "Mother."

Tuesday, October 18, 2011

The AAP Scolds Us -- Again

The American Academy of Pediatrics released an updated policy statement this week on media use in children under 2, and no surprise, they continue to "discourage" it.  This, despite their acknowledgement that "no longitudinal study has determined the long-term effects of media use" in this age group.  (Read my earlier post on the supposed harms of baby videos.)  I read the statement carefully, and it is, shall we say, a quixotic document.  Here's one helpful tidbit on how to watch a toddler without resorting to television: "Simply having a young child play with nesting cups on the kitchen floor while a parent prepares dinner is useful playtime."

The last time I tried that with JoJo, he pulled a bottle of beer out of the minifridge and smashed it on the floor.  So even if T.V. is decreasing the number of folds in his brain, at least I don't have to worry about an episode of Yo Gabba Gabba ending in a trip to the E.R.  Sometimes I think that the AAP just needs to get off its high horse and live the life of a real parent.

Speaking of high horses, some years ago, my 3-year-old son ran up to me as I walked through the front door and gabbled, "Mommy Mommy!  Daddy was watching this T.V. show and a man was in bed and the man looked at his hand and his hand was wet and his hand was covered in blood and he looked up and there was a HORSE'S HEAD!"  I rounded on Rick, "You let J.J. watch The Godfather?!"  He shrugged and said, "I didn't know he was paying attention."



No more Mafia movies for my kids.

Just adorable, knitted horses' heads.

That, my readers, is an example of background television, and the AAP statement devoted a special section to this form of media exposure.  Foreground television, such as Baby Einstein, is designed with children in mind.  Background T.V. is not designed for kids, but they might be exposed to it for many more hours than foreground T.V.  The question is whether background T.V. affects child development in any way.  The AAP highlighted two studies suggesting negative outcomes, so let's take a close look at these.

In one study, investigators looked at whether the game show Jeopardy affected the solitary play behavior of 1- to 3-year-olds.  (I guess the Netflix queue for Godfather was too long.*)  As expected, when the T.V. was on, the toddlers didn't pay much attention to it, spending only 5% of their time with their eyes on the screen.  Correspondingly, the amount of play decreased by 5% with the T.V. on.  The authors made much of the fact that although the quantity of play didn't decrease substantially, play was interrupted more frequently, with play episodes lasting about 30 seconds less with the T.V. on.  They also noticed that "focused" play episodes (defined among other things as the child having a "serious facial expression with furrowed brow") were also shorter -- by 5 seconds.

A similar study, performed by the same group, looked at the influence of background T.V. on parent-child interactions.  When the T.V. was on, parents interacted less with their kids, and the quality of their interactions suffered, with more absent-minded behaviors, such as grooming, and less imaginative play.  Finally, having the T.V. on also resulted in parents responding less to their toddlers' bids for attention.  I always knew my kids tune me out when the T.V. is on, but I guess I'm doing the same to them.

So what's the take home message about background T.V. in young children?  From a purely anecdotal point of view, I wouldn't recommend watching anything with Al Pacino in it while your kids are still awake.  From the evidence-based standpoint, the research is scant but generally shows negative short-term effects of uncertain importance.  How much T.V. is too much? Thirty-nine percent of parents say that the T.V. is on "constantly" in their household.  What the heck, I'm willing to go out on a limb** and say that's too much. 

*Although probably not anymore.  R.I.P., Qwikster.  Yours was a short, sad, badly named life.
**I was going to write "stick out my neck," but I've already gone overboard with the decapitation allusions.  I don't want to beat a dead horse. 

Ears Lookin' at You, Kid!*


Last week, JoJo spiked a temp to 102.7.  Since he was already scheduled for his 18-month visit, I had the pediatrician give him the once-over.  She diagnosed him with otitis media, or middle ear infection, but her diagnosis was essentially just an FYI.  Since ear infections often get better on their own, the doc recommended antibiotics only if he didn't improve in the next 48 hours.

I knew she was right.  But dadgammit, I wanted to start those antibiotics so he could get back to daycare ASAP, and I could get a good night's sleep!  It's times like these when it's tough to be an Evidence-Based Mommy.

Here's the argument against routine antibiotics:  Some ear infections are caused by viruses, and as you know, antibiotics don't kill viruses.  Even mild bacterial infections self-resolve at times.  Antibiotics can shorten the course of illness and prevent rare complications such as bone infection and hearing loss, but at the cost of increasing drug resistance.  So what may help a patient on the individual level may hurt patients on a societal level.

A recently updated meta-analysis of eleven randomized, double-blinded trials of over 2000 children with ear infections found only a small benefit with antibiotics: 78% in the placebo groups recovered spontaneously within 2 to 7 days, compared to 84% in the antibiotic groups.  There was no difference in serious complications.  Those on antibiotics also had a 4% absolute increase in vomiting, diarrhea and rash.  So taking the 6% benefit and subtracting the 4% detriment, you get, on balance, a measly 2% absolute benefit from taking antibiotics.  A review of four other trials found no difference between starting antibiotics immediately and waiting 48 hours for spontaneous improvement.  So even from the standpoint of an individual child, you could make a strong argument to watch and wait.  There were certain subgroups that benefited more from immediate antibiotics: children under 2 who had infections on both sides, or those with pus pouring out of their ears.  In general, though, antibiotics were almost a wash.
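
For the number-lovers, here's that same arithmetic expressed as a number needed to treat (NNT) and a number needed to harm (NNH).  This is my own back-of-the-envelope calculation from the percentages above, not a figure quoted by the meta-analysis:

\[
\mathrm{NNT} = \frac{1}{0.84 - 0.78} \approx 17, \qquad \mathrm{NNH} = \frac{1}{0.04} = 25
\]

In plain English: you'd have to treat about 17 kids for one extra child to recover faster, and for every 25 kids treated, one extra gets vomiting, diarrhea or a rash.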

Contrast these results with those of a study published this year.  In this randomized trial of children with ear infections, treatment failed in 45% of the placebo group, compared to 19% of the antibiotic group, for an absolute benefit of 26% -- much better than the 6% reported in the past.  Not only that, but antibiotics led to resolution of fever within 6 hours, as well as fewer days of missed work for parents whose kids were in daycare.  Sadly, these benefits still came with a price: a 25% increase in diarrhea in the antibiotic group.  So pick your poison: irritable baby or irritable, diaper-changing parent?

Still, you might wonder why the results of this trial were so much more impressive than those of previous studies.  The 2011 study had very strict criteria for middle ear infection, which included a pneumatic otoscopic exam (basically, blowing air into the canal and looking for decreased movement of the eardrum).  Although medical students learn this technique, I can tell you that it's rarely used in the primary care setting.  Most of the time, pediatricians are just trying to grab a 1-second peek into a screaming baby's ear, and crying itself pinks up the eardrums by dilating blood vessels.  Studies have shown that these infections tend to be overdiagnosed, especially when earwax is obstructing the view, so it's no wonder that antibiotics are often of marginal benefit.

So what's a parent to do with all this conflicting information?  If your kid's doctor wants to prescribe antibiotics for a middle ear infection, ask her two questions:

1)  Is this a pretty clear-cut diagnosis?  When doctors are "certain" of their diagnosis, the probability of a "real" otitis media is actually 76% -- which, believe it or not, is pretty accurate for a diagnosis based purely on history and physical exam.

2)  Do you think it's safe to wait 48 hours to see if my child gets better on his own?

In JoJo's case, I waited, as his doctor thought it would be safe to observe him off antibiotics.  He was better within two days.  This time at least, it looks like I picked my poison wisely.**

*Sorry, I know this is a groaner of a title.  As Fred said to George after his ear was Sectumsempra'd off, "Pathetic!  With the whole wide world of ear-related humor before you....?"


**Which hasn't always been the case.  I decided not to give my oldest son the antibiotics prescribed to him for an ear infection.  He seemed to improve, but on a routine visit a few weeks later, his pediatrician found he had a persistent infection.  When he puzzled aloud over why my son "didn't respond" to the antibiotics, I was forced to come clean.  My son was deemed cured after a week's worth of treatment/diarrhea, but I've always wondered if my poor maternal decision-making was to blame for his current, frequent refrains of "But Mom, I didn't hear you!"

Wednesday, October 12, 2011

The New Merchants of Death

Why buy your carcinogens when you can get them for free?

Imagine a place where a child is allowed to buy cigarettes for herself, with only a permission slip from her parents.  In many instances, she may smoke her first pack with her mom -- a popular mother-daughter bonding activity.  The tobacco industry is completely unregulated, minimizing the risks and touting the health benefits of cigarettes.  A popular T.V. celebrity extols the virtues of the GTL lifestyle -- Gym, Tobacco, Laundry.

Sounds like a scene from a developing country?  Substitute "indoor tanning" for "cigarettes," and what I've described takes place in all 50 states in the U.S.  The comparison of tanning salons to smoking may sound like hyperbole, but consider the similarities:

Indoor tanning causes cancer. 

A 2007 meta-analysis of 19 studies in over 7,000 patients found that indoor tanning is associated with a 15% increased rate of melanoma, the deadliest of all skin cancers.  That may not sound like much, but when the analysis looked specifically at indoor tanning in those under 35 years old, there was a 75% increased rate of melanoma.  There was also a higher risk of squamous cell carcinoma, which is not as lethal, but more common.

Most of these studies were "case-control," meaning they looked at the rates of indoor tanning in those who had been diagnosed with melanoma versus those who hadn't.  Sure, you could argue that these findings weren't based on randomized, controlled trials, and that people who go to tanning salons are also more likely to sunbathe (just as smokers are more likely to drink and overeat).  But there is also a large body of laboratory evidence that UV radiation, whether natural or artificial, induces skin cell mutations, the first step in carcinogenesis. 

The tanning industry minimizes risks and promotes questionable health benefits.

Just as the tobacco companies marketed filters for "safer cigarettes," so the tanning industry pushes the concept of the "safe tan."  Many companies claim to use only UVA, which is less likely to cause sunburns than UVB.  The problem is that both forms of radiation are carcinogenic.  Moreover, one can still get burned in a tanning booth, and there are even case reports of patients requiring treatment in a burn unit following indoor tanning. 

The other argument for a "controlled" indoor tan is that the increase in melanin protects against burns from natural sunlight.  Many people use tanning booths to prep themselves for sunbathing.  A tan is indeed protective against a sunburn, with a whopping SPF level of 3.  Do they even make sunscreen with that SPF level?  Increased use of tanning beds has also been associated with more frequent sunburns, so any so-called protective effect is a myth.

One fascinating review compared the advertising tactics of the tobacco and tanning industries.  Both, for example, use physicians in their ads:


Text: "After working 16-hour shifts for my residency,
I tan because it recharges me for work tomorrow."

More recently, the industry has trumpeted the benefits of tanning on raising vitamin D levels:

This claim is wrong on so many levels.  Vitamin D deficiency is most frequently seen in the elderly and housebound, not exactly the GTL demographic.  You need only 15 to 45 minutes a week of sunlight in order to stimulate adequate vitamin D production; a 15- to 30-minute indoor tanning session is equivalent to a day on the beach.  Finally, only UVB stimulates vitamin D production.  If a tanning salon advertises that it uses only UVA, then the vitamin D argument is patently false.

The tanning industry targets youth. 


A recent study showed that 10% of kids ages 12 to 18 have used a tanning bed at least once in the previous year.  The figure is highest in teenage girls ages 15 to 18, with a 25% indoor tanning rate.  The incidence of melanoma is increasing more rapidly than that of any other cancer in the U.S.  Take a wild guess which population is fueling this rise: young females, with a 2.7% annual increase.

Everyone in my generation remembers Joe Camel.  But even old Joe wouldn't stick his oversized proboscis into a kids' magazine.  Not so with tanning salons.  A survey of high school newspapers in the Denver area showed that almost half carried ads for indoor tanning:



Indoor tanning can be addictive.  


In a survey of 229 college students who used sunbeds, almost 40% met psychiatric criteria for addiction to indoor tanning.  When we're exposed to light, our brain produces melanocyte-stimulating hormone (MSH) in order to ramp up production of skin melanin.  A byproduct of MSH production is beta-endorphin, a natural opioid.  Many users report a sense of relaxation and well-being following a round of indoor tanning, and you can actually block this euphoric response by administering an opiate antagonist.

Daylight saving time is ending soon, and many will be tempted to catch their rays indoors.  This week, California became the first state to ban indoor tanning for minors (starting in 2012), even with parental permission.  For those of you in the remaining 49, ask yourself: Would I buy cigarettes for my teenager?  If the answer is no, then you know what to do with that tanning permission slip.

Monday, October 10, 2011

Banishing the Boo-Boos

When J.J. was four, he had a psychotic break at the doctor's office when they tried to give him his shots.  It took four adults to hold him down, and for years afterward, at every visit the nurse would chirp, "Oh, here's the strong one again!" Now that flu season is approaching, trypanophobia* is setting in again.

Then I saw a blurb in a parenting magazine about Buzzy, a vibrating ice pack used to reduce the pain of needlesticks:

"Getting shots is fun with Buzzy!"

There have been two small, randomized studies on Buzzy showing positive results, the first performed on adults using a crude but charming prototype:

Buzzy reduced the pain of IV insertion by a modest 1 point on a 10-point scale.  I wanted to try it on my own kids until I saw the $35 price tag.  For that amount, I could buy them a bottle of whisky and a bullet to bite on.

The science behind Buzzy is that applying alternative sensations to the poked arm keeps the nerve fibers occupied, reducing conduction of pain impulses.  It's an often-used strategy in pain control, though the efficacy varies widely according to the population being studied.  Another study applied this "vibrating instrument" to the opposite extremity to distract from the pain of immunization:

Combined with other measures, this contraption reduced pain in young children.  So go ahead, moms, feel free to bring your "personal handheld massagers" to the doctor's office!

You might wonder how researchers measure pain.  Pain is difficult to study, and even more so in children.  Pain scales using happy and sad faces can be used in older kids and their parents, but you have to get creative in infants and toddlers.  Scientists can measure duration of crying, heart rate and oxygen levels.  (You know that prolonged silence before a baby launches into The Serious Wail?  Well, he's holding his breath, and his oxygen levels are dropping.)  My personal favorite is the Neonatal Facial Coding System, which looks at 9 facial features to assess if an infant is in pain:

Though it's probably sensitive, I have reasons to doubt its specificity.  Here's JoJo after getting doused with a hose:

His score is a full 9, but he's not in pain -- only cold, wet and humiliated.

There are hundreds of randomized, controlled trials looking at ways to reduce needlestick pain.  The best validated is sugar water in infants, with at least 44 studies enrolling nearly 4000 babies.  No one knows for sure why this works.  It's not due to sucking, as sugar water is superior to breastmilk, formula or water, and it doesn't even have to be given during the shot.  Two milliliters given right before the needle is effective.  One theory is that the sweet taste results in a release of endogenous opioids, the same mechanism by which chocolate is thought to work.  But when one group of doctors gave Narcan, the heroin antidote, to babies receiving sugar water, they still cried less than babies who didn't get the sugar.  Anecdotally, sweet solutions are amazing. I gave J.J. a juice bottle throughout his circumcision, and he didn't make a peep. (That's right, I assisted in his back-alley circumcision -- but that's a subject for another blog post.)

The data on sweet solutions are more mixed in children over the age of 1.  Browsing through the abstracts on PubMed, here are some of the strategies that have generally been shown to reduce needle pain, based on randomized, controlled trials:**
Others have shown more mixed results:
And these are the useless ones:

Finally, have you ever considered leaving the room while your child is being poked?  My husband does it all the time, since he gets faint at the sight of a needle.  One randomized, controlled trial of parental presence vs. absence found that having a parent in the room during immunization increases the level of "behavioral distress," particularly crying, in kids 4 and over.  There was no difference between the two groups in the average heart rate, suggesting there was no true difference in their levels of acute pain.  Despite the subjective signs of increased distress, when asked afterwards whether they would want their parent with them during future shots, 86% of the kids said that they would.  I love that there's proof that kids play to their audience, and yes, I admit, it does make me feel needed.

*Fear of needles
**Unfortunately, all of these RCTs are small and prone to publication bias (i.e., a small positive study is more likely to be published than a small negative one).  They are also plagued by lack of blinding, and placebos reduce pain an average of 30-40%.  You could argue, though, that since most of these interventions aren't harmful, you might as well use them for their placebo effect.

Monday, October 3, 2011

Searching for Bobbie Fischer (or, Why Can't Girls Play Chess?)


My daughter Sarah started playing chess at age 4, after being taught by her older brother and nanny.  Within months, she was trouncing 11-year-olds, and J.J. quit playing in frustration.  She's since joined a chess club and started private lessons.  Though I'm proud of her, I secretly wonder whether this will end up being a waste of time and money.  After all, only 5% of registered tournament players are women, and there is only one woman ranked among the top 100 players in the world.  The average rating based upon tournament wins is significantly lower in women than in men.  Are girls destined to fail at chess?

Let's start with the most discomfiting argument -- that men are better at chess because of their superior intellect.  In reality, the average IQ in men is the same as in women.  However, boys have a wider natural variability in intelligence than girls; there are more males clustered at the top and bottom of the spectrum.  The gender that produced the stars of Jackass is just as capable of producing a Fischer or a Kasparov.

And then there's a purely statistical reason why elite chess players are typically men.  (Math phobes, feel free to skip this paragraph.)  Even if men and women have the same average ability, with the same inherent variability, the larger group will contain the very highest- and very lowest-performing individuals.  For those of you who are graphically minded, the bell curve for men would have longer "tails" on both sides than it would for women.  (Presumably, though, the lowest performers would get sick of losing and quit.)  Some number crunchers studied 120,000 registered players (113,000 of whom were men) and found that the higher rating of the top 100 men, compared to the top 100 women, was almost completely explained by this statistical phenomenon.  ("Participation rates and gender differences in intellectual domains.")
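
For the graphically minded who'd rather see it than take my word for it, here's a quick simulation sketch in Python.  It's purely illustrative: the rating mean of 1500 and spread of 300 are made-up numbers, and only the 113,000-vs.-7,000 split echoes the study.

```python
import numpy as np

rng = np.random.default_rng(42)

# Both groups are drawn from the SAME ability distribution; the only
# difference is group size (mirroring the study's 113,000 vs. 7,000 split).
# The mean (1500) and spread (300) are arbitrary, made-up numbers.
men = rng.normal(1500, 300, size=113_000)
women = rng.normal(1500, 300, size=7_000)

# Compare the average rating of each group's top 100 players.
top_men = np.sort(men)[-100:].mean()
top_women = np.sort(women)[-100:].mean()

print(f"Average of top 100, large group: {top_men:.0f}")
print(f"Average of top 100, small group: {top_women:.0f}")
```

Run it and the larger group's top 100 comes out roughly a couple hundred points higher -- not because its members are any better, but because a bigger sample reaches further into the tail of the same bell curve.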

Still, this study of the highest end of the spectrum can't explain why the average tournament rating is higher in men.  There are other areas in which males outperform females.  Boys consistently score better in tests of visual-spatial abilities, including "mental rotation," and they score higher on measures of aggression.  All of these traits are vital in chess.  Performance in chess is also closely linked to the amount of practice and the number of tournaments played.  Perhaps girls are not as monomaniacal as boys?

What about the possibility of gender stereotyping and discrimination?  It doesn't help when superstars like Kasparov say things like, "there is real chess and there is women's chess....chess does not fit women properly."  Still, most chess clubs don't exclude girls, and tournament ratings are objective measures.

One fascinating study paired 42 female tournament players with 42 male players, matched for skill.  ("Checkmate? The role of gender stereotypes in the ultimate intellectual sport.")  The pairs played each other online.  In the first scenario, the men were assigned gender-neutral screen names.  Women won 50% of the games, as one would predict.  In the next scenario, the women were told they were playing other women, even though they were actually playing the same male partners.  Again, they won 50% of the games.  The women were then told they were playing against men -- again the same male partners -- but this time, they won only 25% of the games!  Not only that, but the women played more defensively and less aggressively when told they were playing men.

So are girls being undermined by their preconceptions that they will play poorly against boys?  The lack of female role models and peers must have some psychological impact.  One study seems to support this theory:  In chess clubs in which at least 50% of the players are girls, boys and girls performed equally well.  In clubs in which girls are the minority, boys had higher ratings than the girls.  ("Sex differences in intellectual performance.")

The good news is that the number of girls playing chess is creeping upward, and interestingly, the gender gap in average tournament ratings is starting to shrink.  When surveyed, younger women are also less likely than older ones to believe the male dominance stereotype, which could only serve to raise their confidence.  Naiveté on their part?  Perhaps.  But I don't plan on telling my daughter otherwise.