The Coerciveness of Public Health

This morning I awoke to the news that Chris Christie, the governor of New Jersey, thinks parents should have a choice about whether to vaccinate their children. He has since backtracked on his statement and affirmed his support for vaccination. But as a measles outbreak spreads across California, Arizona, and twelve other states, it’s exposing the tension between personal autonomy and community well-being that’s an ever-present part of the doctrine of public health.


The current measles outbreak most likely started when a single infected individual visited Disneyland over the holidays, exposing thousands of vacationers to a highly communicable disease that the CDC declared eliminated from the U.S. in 2000. At another time—say, ten years ago—the outbreak might have been contained to a handful of cases. But as numerous media outlets have reported, immunization rates have been dropping in recent years, particularly in wealthy enclaves where parents still believe in the debunked link between vaccines and autism, aim for a toxin-free lifestyle, or distrust Big Pharma and the vaccine industrial complex.

I am young enough to have benefited from the scientific advances that led to widespread immunization in the 1970s, and old enough to have parents who both had measles as children and can recall the dread surrounding polio when they were growing up. Vaccines are a clear example of how public health is supposed to work. One of the unambiguous public health successes of the twentieth century, vaccines have transformed ailments such as pertussis, diphtheria, and chickenpox from fearsome childhood afflictions that could cause lifelong complications, and even death, to avertible diseases.

The basic premise of public health is the prevention of disease, and public health guidelines have led to increased life expectancy and decreased incidence of communicable illnesses, as well as some chronic ones. Yet public health regulations have always had to balance individual civil liberties with public safety. People are free to make their own choices, as long as they don’t infringe on the public good. For the most part you’re still allowed to smoke in your own home (although your neighbors could sue you for it), but you can’t subject me to your secondhand smoke in restaurants, bars, or office buildings.

I believe in handwashing, USDA inspections, the use of seatbelts, and the pasteurization of milk. I believe in quarantines when they are based on the best available information and are applied evenly. (A quarantine that isolates all travelers from West Africa who have symptoms of Ebola would be reasonable; one that singles out black Africans from anywhere on the continent regardless of health status would not.) In short, I am in favor of a coercive public health apparatus. The problem with the current measles outbreak is that enforcement has become too lax, with too many states allowing parents to opt out of immunizing their children because of ill-conceived beliefs that are incompatible with the public good.

Every parent spends a lifetime making choices about how to raise their child, from environment and lifestyle to moral and ethical guidance. But some choices have a greater capacity to impact the lives of others. If you want to let your child run around with scissors, watch R-rated movies, and eat nothing but pork rinds all day, you can. If you want to home-school your child because you want greater control over the curriculum he or she is being taught, you’re free to do that, too. And if you want to keep your child from getting vaccinated against communicable diseases, then the state won’t step in to force you. Opting out of vaccinations might not make you a bad parent any more than raising a fried-snack fiend might. But unless you’re planning to spend your days in physical isolation from every other human on the planet, it does make you a bad member of the public.

Your Beard Is Full of Tuberculosis


On a crowded L train to Williamsburg one recent evening, I clasped my hand around the subway pole and scanned the multitude of hipster men surrounding me. As I studied the slim trousers ending just above sockless ankles, the plaid shirts encasing concave torsos, and the array of earnest tote bags, I spotted several men with full beards. Apparently these unfortunate hipsters were not aware that the style was on its way out (so 2013!), or maybe they were trying to get maximum benefit from their facial hair transplants. Certainly they were not East Asian, for who has ever met an East Asian man with the ability to grow a thick beard?

The embrace of facial hair by the hipster crowd has a historical precedent in the Victorian era, when full beards served as a symbol of masculinity and a stylistic corollary to the elaborate outfits and ornate home furnishings favored by fashionable contemporaries. Women clad themselves in long dresses with full skirts, bustles, and bodices, their hats topped with flowers, feathers, and, occasionally, entire stuffed birds. Men’s sartorial fashion was somewhat less extravagant, featuring neckties and waistcoats in rich fabrics like silk and brocade. At home, overstuffed sofas and armchairs, heavy drapes, and wall-to-wall carpets filled Victorian parlors. The preference for opulence even extended into bathrooms, which often contained luxuriant carpets and drapes, as well as ornamental wood cabinetry.

But in all of those folds of fabric and lush decorations lurked a hidden danger: germs. At the turn of the twentieth century, the leading causes of death were infectious and communicable diseases, especially tuberculosis, pneumonia, influenza, and diarrheal illnesses. Tuberculosis was particularly feared for the slow, painful death it induced in its victims; it consumed the body from the inside out, provoking a graveyard cough or “death rattle” in its final stages, when the patient’s gaunt appearance indicated that the end was near. In 1900, the disease was responsible for one in every ten deaths in the United States overall, and one in four deaths among young adults. Physicians had been able to diagnose tuberculosis accurately since 1882, when German bacteriologist Robert Koch identified the microorganism responsible for causing it. But this knowledge did nothing to improve a patient’s prognosis, for no cure existed. It wouldn’t be until after World War II, when antibiotics came into general use, that sufferers would finally have an effective remedy.

Koch’s discovery prefaced a new science of bacteriology. Toward the end of the nineteenth century, the lessons of the laboratory began to reach into American homes and public spaces, changing individual behaviors and cultural preferences. Spitting was a particular target of public health authorities. Tuberculosis-laden sputum could travel from the street into the home on women’s trailing skirts; once inside, it dried into deadly dust that imperiled vulnerable infants and children. In cities across the nation, concerned citizens urged women to shorten their hemlines to avoid dragging germs around on their clothing. “Don’t ever spit on any floor, be hopeful and cheerful, keep the window open,” read one pamphlet. The common communion cup, once a familiar sight in Protestant churches, disappeared as it became implicated in the spread of disease. Hoteliers instituted a practice of wrapping woolen blankets in extra-long sheets that were folded over on top; when a hotel guest departed, the sheet was laundered to remove any tubercular germs exhaled during slumber. Homeowners ripped out the overstuffed upholstery and heavy fabrics of their Victorian-era interiors, replacing them with metal and glass. In bathrooms, white porcelain tiles and the white china toilet supplanted carpeted walls and floors. Preferences shifted to materials that could be cleaned of dust and disinfected, slick surfaces where germs would be unable to gain a foothold.

The stripped-down, modern aesthetic extended to personal style, as well. Women’s hemlines grew shorter, their silhouettes more streamlined. Men began to shed their full beards and moustaches in favor of a clean-shaven look. In 1903, an editorial in Harper’s Weekly commented on the “passing of the beard,” noting that “the theory of science is that the beard is infected with the germs of tuberculosis.” Writing in the same magazine four years later, an observer remarked upon the “revolt against the whisker” that “has run like wild-fire over the land.” By the 1920s, the elaborate fashions of the Victorian era were nowhere in evidence. Picture, for instance, a flapper-era female wearing a cropped hairstyle and a calf-length shift. Or the neatly trimmed moustaches of Teddy Roosevelt and William Howard Taft, the last two presidents to sport facial hair in their official portraits.

A century ago, men with full beards would have felt cultural pressure to shave to protect themselves and their families from the dangerous germs concealed within. It’s a sign of how much our understanding of bacteriology has changed that today’s hipsters harbor no such worries; indeed, few are probably even aware of the historical precedent of disease-laden facial hair. I was never a fan of the look to begin with, and now I can’t help thinking back to earlier fears of contagion whenever I see these beards. But short of a tuberculosis epidemic, which of course I don’t wish for, I’ll have to hope for some other imperative that will bring about a contemporary “revolt against the whisker.”

 

Sources:

Nancy Tomes, The Gospel of Germs. Cambridge, MA: Harvard University Press, 1998.

HPV Testing and the History of the Pap Smear

Several weeks ago, the U.S. Food and Drug Administration approved the Cobas HPV test as a primary screening method for cervical cancer. The test is the first alternative to the familiar Pap smear ever to be green-lighted by the agency, and that is big news. If gynecologists and other health practitioners adopt the FDA’s recommendations, it could change women’s experience of and relationship to cancer screening, a process we undergo throughout our adult lives. The HPV test probably won’t replace the Pap smear anytime soon, but it could pose a challenge to the diagnostic’s sixty-year standing as the undisputed first-line defense against cervical cancer.

The Cobas HPV test, manufactured by Roche, works by detecting fourteen high-risk strains of the human papilloma virus, including HPV 16 and HPV 18, the pair responsible for 70% of all cervical cancers. (The Centers for Disease Control estimates that 90% of cervical cancers are caused by a strain of HPV.) If a patient tests positive for HPV 16 or 18, the new FDA guidelines recommend a colposcopy to check for cervical cell abnormalities. If she tests positive for one of the other twelve high-risk HPV strains, the recommended follow-up is a Pap smear to determine the need for a colposcopy. But critics fear that the new guidelines will lead to overtesting and unnecessary procedures, especially in younger women, many of whom have HPV but will clear the virus on their own within a year or two. Biopsies and colposcopies are more invasive, painful, and expensive than Pap testing, and might increase the risk of problems with fertility and pre-term labor down the road.
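To make that branching explicit, here is a minimal sketch in Python of the follow-up logic as described above. The function name, flag names, and return strings are my own shorthand for the guidelines, not anything published by the FDA or Roche.

```python
def recommended_followup(hpv16_or_18_positive: bool, other_high_risk_positive: bool) -> str:
    """Suggested next step after a Cobas HPV result, per the FDA guidelines as
    summarized above. The two flags mirror the result channels described in the
    post: positive for HPV 16 or 18, or positive for one of the other twelve
    high-risk strains the test detects."""
    if hpv16_or_18_positive:
        # HPV 16 and 18 are responsible for about 70% of cervical cancers,
        # so a positive result goes straight to colposcopy.
        return "colposcopy"
    if other_high_risk_positive:
        # For the other high-risk strains, a Pap smear comes first,
        # to determine whether a colposcopy is needed.
        return "Pap smear, then colposcopy if results are abnormal"
    # Negative for all fourteen high-risk strains covered by the test.
    return "routine screening"

print(recommended_followup(True, False))   # colposcopy
print(recommended_followup(False, True))   # Pap smear, then colposcopy if results are abnormal
print(recommended_followup(False, False))  # routine screening
```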


When George Papanicolaou began the experiments in the 1920s that would lead to the development of his namesake test, cervical cancer was among the most widespread forms of the disease in women, by many accounts the most common. It was also deadly. With no routine method to detect early-stage cancers, many patients weren’t diagnosed until the disease had already metastasized. Even for those who heeded the symptoms of irregular bleeding and discharge, medicine offered little by way of treatment or cure. As Joseph Colt Bloodgood, a prominent surgeon at Johns Hopkins Hospital, grimly observed in 1931, cervical cancer “is today predominantly a hopeless disease.”

Papanicolaou, a Greek-born zoologist and physician, spent his days studying the menstrual cycle of guinea pigs at Cornell University Medical College in New York City. Using a nasal speculum and a cotton swab, he extracted and examined cervical cells from the diminutive animals. Eventually he extended his work to “human females,” using his wife, Mary, as a research subject. He discovered that his technique allowed for the identification of abnormal, precancerous cells shed by the cervix. After a few false starts—his first presentation of his work was at a eugenics conference in 1928 and was panned by attendees—he went back to the lab, spending another decade on swabs and slides. By 1941 he had gotten his ducks in a row, and with a collaborator he published his results in a persuasive paper that was quickly embraced by colleagues. Thus was born the Pap smear.

The Pap smear is not an infallible diagnostic. It can’t distinguish between cells that will become invasive and those that will never spread outside the cervix. Results can be ambiguous and slides are sometimes misread. Nonetheless, the Pap smear was a breakthrough at the time because it detected precancerous changes in cervical cells. It upended the customary timeline of cervical cancer, pushing the clock back by enabling diagnosis of the disease at a stage when lesions could be treated with relative ease and success. Since its introduction, it has contributed to a remarkable reduction in American mortality from cervical cancer, from 44 per 100,000 in 1947 to 2.4 per 100,000 in 2010, a roughly eighteenfold decrease in just over sixty years.
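A quick check of that “eighteenfold” figure, using the two rates just cited:

\[
\frac{44 \text{ per } 100{,}000}{2.4 \text{ per } 100{,}000} \approx 18.3
\]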

When women in the U.S. die from cervical cancer today, it’s generally because they have never had a Pap test, haven’t had one within the past five years, or didn’t follow up on abnormal results with appropriate treatment. The problem isn’t with the test itself; it’s with uneven access to screening and follow-up care. These are issues of class, geographic location, insurance status, and health literacy that the HPV test will do nothing to address. The Pap smear may not be perfect, but when utilized correctly it does a pretty good job of detecting cervical cancer. The FDA’s approval of the Cobas HPV test as a first-line defense and its new cervical cancer screening guidelines have the potential to subject millions of women to decades of invasive, expensive procedures, upending six decades of established practice for a protocol with no clear gains in effectiveness. And that is a very big deal.

 

Sources:

Siddhartha Mukherjee, The Emperor of All Maladies: A Biography of Cancer. New York: Scribner, 2010.

Ilana Löwy, A Woman’s Disease: The History of Cervical Cancer. New York: Oxford University Press, 2011.

Monica J. Casper and Adele E. Clarke, “Making the Pap Smear into the ‘Right Tool’ for the Job: Cervical Cancer Screening in the USA, circa 1940-95,” Social Studies of Science 28 (1998): 255-90.

Joseph Colt Bloodgood, “Responsibility of the Medical Profession for Cancer Education, with Special Reference to Cancer of the Cervix,” American Journal of Cancer 15 (1931): 1577-85.

Statistics on cervical cancer from the National Cancer Institute at http://seer.cancer.gov/statfacts/html/cervix.html.

Medicating Normalcy


When I was in elementary school in the 1970s, I was friends with a boy who was considered hyperactive, which I vaguely understood to mean that he had excess energy and was therefore not supposed to eat sugar. He was occasionally disruptive in class and often had trouble focusing on group activities. My friend seemed to be constantly in motion, bouncing up from his chair during spelling tests and sprinting through the playground at recess, unable to keep still or remain quiet for any length of time. Another classmate, a girl, was a year older than the rest of us because she had been held back to repeat a grade for academic reasons. She was “slow,” a term we used at the time to refer to someone with a cognitive developmental disability.

If these two were growing up today, there’s a good chance they would be diagnosed with an attention disorder and medicated with a drug such as Adderall or Concerta. While A.D.H.D. has been around for a while—it’s been listed in the Diagnostic and Statistical Manual of Mental Disorders in some form since at least 1968—its incidence in children has skyrocketed over the past few decades. As Alan Schwarz reported several months ago in the New York Times, the number of children taking medication for A.D.H.D. has increased from 600,000 in 1990 to 3.5 million today, while sales of stimulants prescribed for the condition rose more than fivefold in just one decade, from $1.7 billion in 2002 to nearly $9 billion in 2012. And researchers recently identified a new form of attention disorder in young people. Called “sluggish cognitive tempo,” it’s characterized by daydreaming, lethargy, and slow mental processing. It may affect as many as two million American children and could be treated by the same medications currently used for A.D.H.D.

This apparent epidemic of behavioral disorders in children highlights the convergence of a number of factors. In the late 1990s, changes in federal guidelines eased restrictions on the direct marketing of drugs to consumers, prompting increased awareness of disordered behaviors such as those that characterize A.D.H.D. Pharmaceutical companies routinely fund research into illnesses for which they manufacture drug therapies. As Schwarz (again) found, some of the chief supporters of sluggish cognitive tempo have financial ties to Eli Lilly; the company’s drug Strattera is one of the main medications prescribed for A.D.H.D. At the same time, overworked teachers in underfunded school districts lack the capacity to give special attention to rambunctious students, and instead urge parents to medicate them to reduce conflict in the classroom. Most important, the definition of what constitutes “normal” has narrowed. Thirty years ago, my unruly friend who wanted to run around during reading time and my absentminded classmate who forgot to write her name on her tests fell toward the extremes on the spectrum of normal behavior. Today they might be diagnosed with A.D.H.D. or sluggish cognitive tempo and given medication to make them less rowdy or more focused.

Normal childhood behavior these days means paying attention in class, answering all the questions on tests, turning in homework on time, and participating in classroom activities in a non-disruptive way. Children today, in short, are expected to be compliant. There will always be those who lack the ability to conform to an ever-constricting range of what constitutes normal behavior. For families with the access and the interest, pharmaceutical companies offer drugs designed to bring these young people within the bounds of what we consider acceptable.