Thursday, March 8, 2012

It's All Fun and Games, Until Someone Loses...

My husband did that to our son.  Before you call CPS, allow me to explain: JJ was hit in the eye by his dad's fly ball.  Fortunately, he didn't suffer a serious injury, and by the next day, he was proudly showing off his shiner at school.

With opening day approaching and JJ now playing AAA ball, I'm getting more nervous than a mole in a Chuck-E-Cheese arcade. (I'm the kind of gal who ducks when a frisbee is thrown at her).  Those balls are being thrown faster and wilder, now that it's 100% kid pitch.  My one consolation is that he's not playing football.  In fact, baseball has one of the lowest rates of injury among youth sports.  The problem is that it has the highest rates of facial injuries, including fractures and eye and dental injuries.

This is what happens to an eyeball when it's struck head-on by a 66-mph ball:

Most kids my son's age can't pitch faster than 50 mph, but 66 mph is well within the range of 13- and 14-year-old pitchers.  While this degree of deformation rarely leads to globe rupture (think of a squished grape), it can cause retinal detachment and vision loss.

There are two ways to reduce the risk of baseball facial injuries: safety balls and faceguards.  Safety balls include the reduced-impact balls, which contain a polyurethane core instead of yarn wrapped around cork.  There haven't been any randomized, controlled trials of this preventive equipment.  In fact, one group of researchers approached a youth league in Indiana about performing an RCT of faceguards, but the league "refused to cooperate," so they ran a nonrandomized study instead.  The 136 coaches who chose to make faceguards mandatory reported fewer facial impacts or injuries (12.3%) than the 102 who made them voluntary (15.7%).  Most of these injuries were minor, as only 10 children (8 of whom were on the control teams) sought medical treatment.

USA Baseball commissioned a more rigorous observational study looking at the rates of ball-related injuries and the effectiveness of safety balls and faceguards.  The authors used the data on injuries compensated by Little League's insurance for their calculations, so by design, these were more serious injuries.  The overall risk of ball-related injury resulting in compensation was extremely low -- 28 per 100,000 players per season.  As expected, the risk of injury increased with the level of competition, while the use of safety balls and faceguards decreased.  After adjustment for level of competition, safety balls were found to decrease the rate of injury by 23%, and faceguards by 35%.

Despite these data, safety balls and faceguards aren't mandated by most youth leagues.  Safety balls just don't bounce like hardballs, and the most common argument against faceguards is the reduction in visibility.  Recommending faceguards won't significantly increase their use; leagues would have to mandate them to level the playing field.  I would love to see a randomized, controlled trial of faceguards, not for safety reasons, but to see whether they affect one's batting average.  For younger kids, faceguards may have the advantage of reducing their fear of the ball, and possibly increasing their chances of hitting it.  As a team photographer, I have too many shots to count of 6-year-olds swinging with their eyes shut.

The last argument against faceguards is purely from a numbers perspective.  With an injury rate of only 28 per 100,000, and a 35% reduction in risk, you would have to add faceguards to more than 10,000 helmets to prevent one injury.  It would be helpful to know if leagues that require safety balls and faceguards pay less in insurance premiums than those that don't.  And it sure would be nice if insurance companies would cut them a break for making the safety of their players paramount.
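For the curious, the number-needed-to-treat arithmetic is a one-liner.  This is just a back-of-the-envelope sketch using the two figures quoted above (the 28-per-100,000 baseline and the 35% adjusted reduction from the USA Baseball study):

```python
# Number needed to treat (NNT) = 1 / absolute risk reduction.
# Figures assumed from the USA Baseball study discussed above.
baseline_risk = 28 / 100_000      # ball-related injuries per player per season
relative_reduction = 0.35         # faceguards' adjusted effect

absolute_risk_reduction = baseline_risk * relative_reduction  # 9.8 per 100,000
nnt = 1 / absolute_risk_reduction

print(round(nnt))  # roughly 10,200 helmets per injury prevented
```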

Friday, February 10, 2012

When the Zebra is a Horse

Is this a zebra, a horse, or a Photoshop disaster?*

Here's a trivia question for the doctor-folks who follow my blog:  What's the most common cause of encephalitis of uncertain etiology in children and young adults?  Hint: It's not herpes, rabies or even West Nile.

It's anti-NMDA receptor encephalitis.

If you already know what I'm talking about, I'm impressed.  I had never heard of this entity until today, when our chair forwarded everyone in the medicine department a report from the California Encephalitis Project (CEP).  In fact, anti-NMDAR wasn't even discovered until 2007.

Encephalitis is inflammation of the brain, resulting in fever, confusion, seizures and sometimes permanent neurologic damage and death.  It's usually caused by viruses and sometimes bacteria, when it accompanies meningitis.  Anti-NMDAR is not an infection, however.  The NMDA receptor is a ubiquitous brain receptor for the neurotransmitter glutamate, and in this disease, the body produces antibodies that attack the NMDA receptor, leading to gross neurological dysfunction.

Anti-NMDAR can be a devastating illness.  It presents with hallucinations, language problems and vital sign instability, among other signs and symptoms.  In the CEP report, 40% of patients required life support for respiratory failure.  Fortunately, there is effective treatment for anti-NMDAR.  The majority respond to a potent cocktail of immune suppression -- the exact opposite of how you would normally treat infectious encephalitis.

And here's the surprising part of the report:  Out of 761 cases of encephalitis reported over the 3- to 4-year period, 47 were tested for anti-NMDAR antibodies because of clinical suspicion.  Thirty-two, or 68%, tested positive.  In fact, even though only a fraction of cases were tested for anti-NMDAR, it was the most common cause of encephalitis in this case series.  Two-thirds of cases were in children.

Now encephalitis is still a very rare illness, with an incidence rate of 1 out of 200,000 in the U.S., so even if anti-NMDAR is a common cause of encephalitis, it's still an extremely uncommon disease.  (Please do not charge into your pediatrician's office demanding to have little Johnny tested because he has a fever or is moodier than usual.)  The CEP only collected cases of encephalitis of unknown etiology, so many of the diagnosed viral cases were excluded from this analysis, artificially inflating the relative frequency of anti-NMDAR.  On the other hand, because so few doctors are aware of this disease, we are probably undertesting and missing cases.

In medicine, we have a saying:  When you hear hoofbeats, think of horses, not zebras.  In other words, don't go on an expensive hunt for a rare disease unless you have a compelling reason to do so.  When it comes to childhood encephalitis, though, anti-NMDAR may turn out to be the proverbial horse.

*It's a zorse (zebra-horse hybrid).

Wednesday, February 8, 2012

It's Not Easy Eating Green

A reader sent me a recent article about methods used throughout history to get children to take their medicines.  While the piece focused on sweet talk and sweeteners, the biogeek in me was drawn to a comment about a new compound, GIV3616, that blocks the bitter taste receptors on the tongue.  A scientist from Givaudan Flavors, developer of this chemical, noted that for veggie-phobic children, "We’d like to be able to make their diets more enjoyable by masking the off-putting flavors of bitterness. Blocking these flavors we call off-notes could help consumers eat healthier and more varied diets."  I was dying to get a hold of this bitter blocker to sprinkle on my picky daughter's kale, but that would probably alienate those of you who eschew genetically modified, irradiated, non-organic, non-sustainable, non-locally produced Frankenfoods.  Besides, when I did a Google Scholar search on this compound, nothing had been published -- not even animal safety studies.  (I'm sure it's proprietary.)

It turns out there's a cheaper, low tech solution to increasing vegetable consumption in kids.  This week in JAMA, a psychologist, an economist, a marketing professor and two nutritionists walk into a bar... published a study in which they placed photographs of vegetables in elementary school lunch tray compartments.  They measured vegetable consumption on a day when the trays had the photographs and compared it to a day when they didn't have the photographs.  These valiant researchers (or more likely, their undergraduate assistant in charge of "data acquisition") went so far as to scrape off and weigh uneaten vegetables left on the trays, tables and floors.

Does your kid's lunch tray look like this....

...or this?

So what did they find?  On the positive side, the percentage of children scooping green beans and carrots into those compartments went up from 6-12% to 15-37%, and overall consumption increased modestly.  On the downside, a lot of the kids left their veggies uneaten (in fact, more carrots were wasted in the photograph group), and even with the overall increase, consumption still did not meet government recommendations.  The study was also performed over a mere two days.  Kids will figure out in no time that no one's going to punish them if they scoop pudding into a green bean compartment.

As for me, I'll keep waiting for that magic bullet.  To the marketing geniuses at Givaudan:  hurry up, rechristen your license-plate chemical "Flavia," and release it to the general public.  And one more favor, if you please: Publish a study showing your bitter blocker won't make my daughter grow a third eyeball.*

*What's more likely is that she would grow more bitter taste receptors in response to chronic blockade.  (That's the mechanism by which people become tolerant to narcotics or alcohol).  If she were to suddenly stop using the chemical, she would be more sensitive to bitterness than ever before.

Thursday, January 26, 2012

Talking to Your Girls: The Best Vaccine of All

In a previous posting, I noted that there are almost no studies looking at whether giving the human papillomavirus (HPV) vaccine to adolescents increases unsafe sexual behavior -- what some call a passport to promiscuity for girls and a license to drill for boys.  Although there are still no studies looking at behavior post-vaccine, the Archives of Pediatrics & Adolescent Medicine just published a study on the attitudes and beliefs of girls who got the HPV shot.
The authors asked the girls (ages 13-21) to agree or disagree with statements like, "After getting the shot against HPV, I am less worried about getting a sexually transmitted disease other than HPV," and "After getting the shot against HPV, I think that condom use (or having fewer sexual partners) is less necessary."

First, the good news:  Only 4% of the girls felt less of a need to practice safe sex because of the vaccine.  The investigators then looked at what factors were associated with this belief.  Some risk factors were predictable, but a few were surprising.  A lower perceived need for safe sex was associated with:
  • Lower knowledge about HPV and the vaccine
  • Lack of condom use at last intercourse
  • Lack of maternal communication about the HPV vaccine
  • Teacher or physician serving as the source of HPV vaccination information
The last one threw me for a loop.  Surely the most reliable information about the vaccine and STDs comes from doctors and sex education teachers?  The problem is that even if they are dispensing appropriate advice (which may or may not be a correct assumption), they may not be doing so in a way that's understandable to teens and their mothers.  On the flip side, it looks like moms can have a positive impact on their daughters' behavior, particularly if they talk to them about the limitations of the vaccine, including its lack of protection against some HPV strains, other STDs and pregnancy.

Though very few girls agreed that the vaccine would allow them to have more unprotected sex, survey answers don't necessarily predict behavior.  Even before the vaccine, over half of the adolescents in this clinic were sexually experienced, and most were not using condoms reliably.  It's doubtful that the vaccine will decrease the rate of unsafe sexual practices, unless it's accompanied by appropriate counseling.

So does this mean we shouldn't be vaccinating our daughters against HPV, because of the theoretical increased risk of unsafe behavior?  Of course not.  The advent of effective antiretroviral therapy for HIV in the late 1990s was accompanied by an increase in the rate of unprotected sex, and subsequent gonorrhea and syphilis epidemics, in gay and bisexual men.*  Yet it would be completely unethical to withhold effective drug therapy because of its unintended behavioral consequences.  No, it simply means that doctors and, more importantly, parents have our work cut out for us when it comes to educating our kids.

*Despite the increases in other STDs in the San Francisco Bay Area, there was no increase in the HIV incidence in gay men during this time period.  The theory is that HIV-positive men had unsafe sex only with HIV-positive men, and HIV-negative only with HIV-negative.

Tuesday, January 17, 2012

Does Being a Mom = Letting Yourself Go?

I missed my 20th college reunion, having given birth just 6 weeks prior.  I wasn't worried about travelling with a newborn.  I just didn't want to chance a meeting with any old exes, while I was looking puffy, poochy and sleep-deprived. (On the plus side, I was rather buxom). Afterwards, I prevailed upon a friend to dish on all our old classmates.  As he ran the "hot or not" list, I noticed that the former consisted mostly of his single, childless pals, while the latter was made up of couples with kids.  It wasn't an entirely fair comparison, as my friend is gay, and he and his buddies start with a level of fabulousness lacking in us breeders.  But is it a given that having children will make you look older, fatter and less attractive?

This is what having 2.5 kids does to you.*

Let's start with the easiest question to answer:  Do children make you fat?  In a word, yes.  Numerous studies have shown that mothers have a higher body mass index (BMI) than non-mothers.  Those pregnancy pounds don't just convert into postpartum pounds; they translate into post-post-postpartum pounds.  One study looking at 2,000 women over 65 years of age found that the risk of obesity increased 7% with each live birth.  Polishing off those half-eaten chicken nugget meals obviously takes its toll, not to mention the lack of exercise.  Several studies have found that physical inactivity increases with the onset of parenthood.  One study in over 8,000 young women found that when someone becomes a mother, her risk of physical inactivity rises from about 40% to 60%.  Although moms bear the brunt of the weight burden, dads aren't exempt from the curse:  Men also have a 4% increased risk of obesity with each additional child.

What about looking older?  Before my first son was born, I had a grand total of three white hairs; within a year, I was spending 20 minutes a day plucking them.  Nine years later, I've stopped, with the fervent hope that my bald spots will recover.  I couldn't find any studies on whether parenting causes premature greying.  While acute, severe stress can cause diffuse hair loss, there's no evidence that chronic, low-grade stress induces baldness.  What about wrinkles, or other facial signs of aging?

One well-designed study looked at the effect of environmental factors, including parenthood, on perceived age.  Nurses were asked to estimate the age of over 1,800 elderly twins based on facial photographs.  I was relieved to discover that the number of children did not affect perceived age.  Then again, all of these twins were at least 70 years old, so maybe other, stronger environmental factors, such as smoking and sun exposure, obliterated any small effect fecundity might have had on appearance.  (One silver lining, for those of us still working off that baby bump:  thin people in this study looked older than chunky ones.)

A similar study was performed in Shanghai in 250 women, aged 25 to 70 years.  The researchers didn't look specifically at whether the number of children affected one's perceived age, presumably because of China's one-child policy.  However, they did find that having more than 3 members in your household made you look, on average, 2 years older.  Whether this was due to the stress of having that extra, illicit child, or more likely, an aging parent or in-law in the home, wasn't clear, but it might be a good reason to push your kids out of the nest when they turn 18.

Finally, for those of you with babies or young children, don't underestimate the benefit of a good night's sleep.  The research on beauty sleep is scant, but leave it to the Swedes to publish the one controlled study on this topic.  Twenty-three young adults were photographed after 8 hours of normal sleep, and again after a night of only 5 hours of sleep.  The photographs were then rated by observers blinded to the intervention.  Sleep deprivation reduced attractiveness by about 4% and perceived health by about 6%. 

Despite the meager evidence, I'm convinced that having kids makes you look frumpier, but the trade-off is worth it.  My advice is simply to put on some make-up, lose the mommy jeans, eat healthier, exercise more, get eight hours of sleep every night....

Oh, forget it.  Just call the plastic surgeon and be done with it already.

*Actually, this photo came from Faces of Meth.  But who's to say she didn't have a baby in the intervening years?

Tuesday, January 10, 2012

Catching Your Death of Cold

Growing up, many of us were admonished by our moms to wear our coats, or we'd "catch our death of cold."  Chinese grandmas have a particular fondness for bundling kids so tightly that they're splinting their joints.  We know now, of course, that being cold doesn't cause colds; viruses do. 

But like many things in science, the evidence isn't as clear-cut as you might think.  Multiple studies have found a strong link between outdoor temperature, as well as low humidity, and the risk of respiratory infections.  One study, for instance, found a 4% increase in upper respiratory infections with every 1 C decrease in temperature.  Influenza shows strong seasonality in temperate regions, with peak infection rates in winter, but none in tropical areas.  Of course, this doesn't prove cause and effect.  The conventional wisdom is that colds and the flu are more common in winter because of an increase in indoor crowding.

There is another possible explanation for this phenomenon.  Our first line of defense against respiratory viruses is that cozy mucus blanket lining our noses.  Microscopic hairs sweep the germ-laden mucus towards the back of the throat, where it's swallowed and sterilized by stomach acid.  "Nasal mucus velocity" drops significantly in cold weather, preventing viral clearance.  (Sounds like one of the less popular Magic School Bus episodes.)  Cold also impairs the function of macrophages, white blood cells that ingest germs.  Finally, many respiratory viruses replicate best at lower temperatures.

Guinea pig studies do confirm an increased risk of transmission of the flu in cold, low humidity lab conditions.* 

However, there are no studies on the protective properties of the Snuggie, in guinea pigs or humans.

While there are no human trials on the direct effects of low temperature on influenza infection, controlled studies exist for the common cold.  In one 1968 study, forty-nine "volunteers" from the Texas State Department of Corrections were nasally inoculated with "virus-containing fluids" collected from sick Marines.  Half were then subjected to cold conditions, involving, among other things, sitting in a 4 C (39 F) room in shorts and undershirts for a couple of hours.  The two groups showed no difference in the rates of rhinovirus shedding or cold symptoms, and you can bet this study had 100% follow-up.

Another study took the opposite approach, looking at the effects of hot, humidified air in university students who were also experimentally infected with rhinovirus.  This time, the subjects were comfortably ensconced in private hotel rooms and administered either placebo vapor or warm steam.  There was no difference in the primary outcome of viral shedding.  Another randomized, double-blind trial found a reduction in cold symptoms with hot, humidified air, but again, no decrease in viral shedding.

Case closed, right?  Believe it or not, research in this area continues, almost 2,000 years after the Greek physician Galen wrote about the four humors ("phlegmatic" being the "cold and moist" humor).  The latest was a study published by the Common Cold Centre in the UK, which randomized 180 volunteers to place their feet in cold water or an empty bowl for 20 minutes.  Why did the investigators decide to chill feet instead of noses?  Their rationale was that chilling of the body surface decreases the temperature of mucosal surfaces, via reflex constriction of the blood vessels in the nose.   Over the next five days, significantly more of the chilled subjects developed cold symptoms than the control group - 29% vs. 9%.  Despite the relatively large sample size and achievement of "statistical significance," this study sounds like a grade-school science project, and not a winning one at that.  The subjects were aware of the hypothesis of the study -- that chilling might affect the development of cold symptoms -- so the nocebo effect may have been in play.**  None of the volunteers underwent viral cultures to confirm infection.  And it's hard to believe a mere 20-minute foot dip could triple your chances of getting sick.  So is bundling up really going to protect you from infection?  Probably not.
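For the statistically inclined, the 29% vs. 9% gap does clear the conventional significance bar, at least on a back-of-the-envelope two-proportion z-test.  Note the assumption here: I'm guessing an even 90/90 split of the 180 volunteers, which the study summary above doesn't specify.

```python
from math import sqrt

# Two-proportion z-test on the Common Cold Centre results.
# Assumes an even 90/90 randomization (not stated above).
n1 = n2 = 90
x1 = round(0.29 * n1)   # ~26 chilled subjects with cold symptoms
x2 = round(0.09 * n2)   # ~8 control subjects with cold symptoms

p1, p2 = x1 / n1, x2 / n2
p_pooled = (x1 + x2) / (n1 + n2)
se = sqrt(p_pooled * (1 - p_pooled) * (1 / n1 + 1 / n2))
z = (p1 - p2) / se

print(round(z, 2))  # about 3.4 -- well past the 1.96 cutoff for p < 0.05
```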

Then again, would it kill you to listen to your mother?

*Guinea pigs were discovered in 2006 to be an excellent experimental model for the flu.  I have no idea why it took so long to figure this out, when guinea pigs have been, well, guinea pigs since time immemorial.
**A nocebo is the opposite of a placebo: something that makes you feel worse, though it has no actual, independent effect.

Tuesday, January 3, 2012

Game On!

My husband and I caved in and bought handheld Nintendos for our kids this Christmas.  (Thank you, Craigslist!) Though we usually set a 15-minute limit on schooldays, we let our kids play as much as they wanted over the holidays.  When I couldn't stand the peace and quiet anymore, I shooed them out of the house for a little sunshine and exercise.  I quickly learned the downside of handheld gaming devices:

So did those hours of nonstop gaming irreversibly damage our children's vulnerable brains?  Don't worry, I won't rehash the numerous studies linking video games with aggression, attention problems, obesity, depression and suicidality, poor academic performance, addiction, reckless driving, smoking, alcohol and drug use, hypertension and high cholesterol, inadequate sleep, seizures, pulmonary embolism*, ruptured eyeballs, internal bleeding**, five out of the seven deadly sins (greed, sloth, gluttony, lust, avian anger) and the decline of Western civilization.

Because that could get really tedious.

Let's focus instead on the benefits of video games.  As you know, I'm the master of finding evidence to back up how I would parent anyway.  And just to make it a little more challenging, I'm excluding studies of educational games.

Although traditional video games have been linked to obesity and physical inactivity, the newer generation of exergames, like the Wii, have the potential to reverse the trend.  But how much energy is actually expended playing?  A meta-analysis of 18 pediatric studies found that on average, gamers achieve 3.3 "METs," or metabolic equivalents, of activity, which is the equivalent of brisk walking or skipping.  A 60-pound kid would have to play for 90 minutes to burn off one single-serving bag of Doritos (150 calories).  Dance Dance Revolution uses the most energy, followed by Wii Boxing, Wii Tennis, and Wii Bowling.  Only in the virtual world could bowlers be considered reasonably fit.  Unfortunately, exergames did not reduce body mass index.
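If you want to check the Doritos math, the standard (gross) MET formula is calories ≈ METs × weight in kilograms × hours.  Plugging in the figures quoted above gives an answer in the same ballpark:

```python
# Gross energy expenditure via METs: kcal = METs * kg * hours.
# Numbers taken from the meta-analysis figures quoted above.
mets = 3.3                 # average intensity of pediatric exergaming
weight_kg = 60 * 0.4536    # a 60-pound kid, converted to kilograms
snack_kcal = 150           # one single-serving bag of Doritos

kcal_per_hour = mets * weight_kg
minutes_needed = snack_kcal / kcal_per_hour * 60

print(round(minutes_needed))  # about 100 minutes -- close to the 90 above
```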

Traditional video games do seem to improve visuospatial skills.  One trial in 10-year-olds found that playing the game Marble Madness improved tests of spatial skills compared to those who played computer word games.  The benefits were greater in girls, who have lower spatial skills at baseline.  Another randomized study found that undergrads who played a total of 6 hours of Tetris improved in tests of mental rotation and spatial visualization, compared to no improvement in those who were assigned to no video games.  Habitual video game players also test higher in measures of visual attention.  They're better at identifying multiple targets and ignoring visual distractions, and they have faster reaction times. In other words, playing video games makes you better at....playing video games.

So can these skills be translated into something marketable?  Well, if you're thinking about getting your tubes tied, maybe you should let your teenager take a crack at it.  One study found that teenaged expert video gamers outperformed obstetric interns on a laparoscopic simulator.  (Laparoscopy is surgery performed through small incisions, with camera guidance.)  At least 10 other studies have found a positive correlation between surgical skills in doctors-in-training and video game experience.  I didn't find this too surprising.  I've done a few sigmoidoscopies (i.e., partial colonoscopies), and they're a lot like video games, only smellier.

And that's about all I found in my research.  I have to admit that the evidence to support gaming in kids isn't the strongest.  At least there aren't any studies showing that video games cause cancer.


*A 24-year-old died after playing video games for 80 hours straight.
**A woman fell off her sofa while playing Wii Tennis.  You can get anything published these days, as long as the mechanism of injury is novel.