Tag Archives: Medicine

Medicine, Technology, and the Ever-Changing Human Person

Though we often take for granted that humans are persons, they are not exempt from questions surrounding personhood. Indeed, what it means to be a person is largely an unsettled question, even though we often speak of “people” and “persons.” Just as it’s important to ask if other beings might ever be persons, it is also important to ask if humans are ever not persons. In this pursuit, it’s crucial to separate the concept of personhood from notions of respect, love, and importance. That is to say, while personhood may entail respect, love, and importance, something need not be a person to demand respect, love, or importance.

When the concept of personhood in humans comes into discussion, it is inevitably punted to the medical community, often in the context of abortion and end of life. When does the heart first beat? When can a fetus feel pain? When does the brain begin/stop producing electrical activity? There is no doubt that advancements in our understanding of human physiology have enlightened discourse on what it means to be both a human and a person. However, the question of personhood is all too often debated solely in light of Western medical contexts. This conflation of physiology and personhood is the same issue that was discussed in my previous post on primate personhood and will be revisited in my next post on artificial intelligence. To escape this quandary we need to consider factors outside of physiology that are important to the concept of personhood, such as the social.

Continue reading

Food Allergies and Modern Life

Twenty years ago, I knew hardly anyone with a food allergy. Shellfish and strawberries were the only foods I’d ever heard of someone being allergic to. Then, suddenly, airlines were replacing peanuts with pretzels because of food allergies, and food started being labeled “Processed in a facility that also processes tree nuts.” A few years later, I met someone who was allergic to wheat. Pretty soon, it seemed like everyone I knew was allergic to something – gluten, lactose, chocolate, and a gazillion other things.

How can we explain this epidemic of food allergies? The radical shift from hunting and gathering finally catching up with us? Radical advances in medical technology that allow us to identify conditions that went unnoticed a generation ago? A build-up of environmental toxins in common foods? Interaction of foods with strange new food-like products like high fructose corn syrup and artificial flavors?

Or maybe we’re imagining the whole thing.

That’s the conclusion suggested by a recent study in the UK that found that only 2% of people who claimed to suffer from food allergies were actually allergic. The rest are suffering from something else, namely, the belief that they suffer from food allergies.

Now, I don’t know much about medicine and physiology, but I do know a thing or two about belief, and when millions of people believe something that isn’t empirically verifiable (1 in 5 Britons, according to the article above), we’ve got some ‘splaining to do.

Now, my first reaction is what I think many food allergy sufferers will share: that the study is flawed, not in its procedure, but in its very medical-ness. That is, there’s a strain of anti-modernism in the recent explosion of food allergy awareness that simply doesn’t trust the mainstream medical industry to recognize and treat food allergies. So when you get a bunch of mainstream medical researchers to study the issue, it’s no surprise that they don’t find anything.

I doubt that’s true, but here’s the thing: the belief that it’s true is part and parcel of the food allergy… can I call it a “movement”? In their rejection of modern medical knowledge and modern food processing technologies, as well as their yearning for a more “natural” diet and a greater connection to their bodily functions, food allergy advocates (if not food allergy sufferers) certainly have at least some of the hallmarks of a social movement. And they’ve certainly created social change, as well – modern supermarket shelves are packed with (ironically) high-tech allergen-free foods: gluten-free beer, bread made of spelt, soy milk and ice cream, and so on.

But leave aside the political aspects of today’s food allergies; what intrigues me is the almost religious asceticism imposed by many food allergies. A vast number of foods contain wheat, for instance, so the wheat allergy sufferer is constrained to a diet that eliminates a great many common foods – much like a Jew during Passover, when most wheat-containing foods must be avoided as “leavened”.

The author of the Telegraph piece above notes the similarities between food allergies and food taboos, drawing on Mary Douglas’ understanding of the way boundaries create meaning and order:

[W]hat we eat not only defines us as people but also helps us to feel control and mastery over an otherwise chaotic and random world. She argued that by ordering foods into those we can consume and those that we can’t, we create meaning, and the boundaries provide order in our lives.

As a set of dietary restrictions, rather than a medical phenomenon, it seems reasonable to see food allergies – along with vegetarianism/veganism, the Slow Food movement, the “buy local” movement, and the $30 billion-plus diet market (in the US) – as an attempt to wrest back control over an aspect of our lives from which we are increasingly and maybe irretrievably disconnected. Few of us have any connection with our food except as consumers at the end of a very long and complicated production chain. Food allergies allow us to assert control – on pain of death – over what we ingest, and demand an attentiveness – again, on pain of death – to what’s in the foods that we buy.

But this fussiness is part of a larger yearning for control, which is where the anti-modernism comes in. Food has long been not only a means of forging and asserting cultural identity but also of resisting the onslaught of a homogenizing, enervating modernity that threatens to dissolve not just cultural identities but individual identities. From the health spa/retreats of the Kellogg brothers and their peers (which gave us corn flakes and granola) to the popularity of Sweet-n-Low in the ‘50s and ‘60s to the communes of the hippie era to the herbal remedies of today, food has been seen as a way to “get back” to a more “natural” way of life – as opposed to the high-stress, low-community, detached and distracted way of life that is modernity.

None of this is to suggest that there are not very real food allergies – it’s hard to argue with anaphylactic shock. Nor, more importantly, is it to say that the 98% of food allergy sufferers in the study with no medically detectable food allergies do not, in a very real way, suffer. The bodily manifestations of the most obviously social disorders can still drastically limit a person’s quality of life.

What it does suggest is that treatment of food allergies needs to go much further than antihistamines and food avoidance to encompass the cultural and psychological. If control is a central issue – as it is already recognized to be in anorexia nervosa and other eating disorders, which strike bright, ambitious young women with overbearing parents hardest precisely because they are the least in control of their lives and the most aware of it – then a) developing non-food strategies for regaining control, and b) developing a realistic relationship with the demands and pressures of daily life are also important to individual adjustment.

On a social level, food allergies and other dietary restrictions join a range of other control-seeking phenomena – pop psychology, personal productivity, conspiracy theorizing, and religious fundamentalism – all of which attempt to throw a lasso around the neck of our stampeding lives. As a critique of modernity, there’s nothing original here; Georg Simmel’s The Metropolis and Mental Life addressed similar concerns about the loss of autonomy in 1903, and Emile Durkheim had done the same a decade earlier, noting the anomie inherent in industrial/commercial society in The Division of Labor in Society.

But over a century of social critique has done little to alleviate the real suffering of real people. The question is, do we have the resources and will to take on these challenges at a social level today? Or are food allergies, in fact, an adequate collective response to dehumanizing social conditions? Do food allergies, like, say, spirit possession on Chinese factory floors, provide the relief people need to cope with the impacts of modernity, even as they suffer?

Pain in the Back

When I took time off from college to backpack around Asia, I heard at least a dozen versions of the following story:

The victim — we’ll call him “Bob” — was on a business trip alone somewhere in Europe, and went out to a bar one night to have a cocktail. Wouldn’t you know it, he woke up the next morning in an unfamiliar hotel room with severe pain in his lower back. He was taken to the emergency room, where doctors determined that, unbeknownst to him, Bob had undergone major surgery the night before. One of his kidneys had been removed, cleanly and professionally.

That story is an urban legend, and it has been so widely reported and debunked that FBI and hospital officials would not believe anthropologist Nancy Scheper-Hughes when she told them of a real traffic in organs run by organized crime back in 2002. Eventually she got support from law enforcement in Brazil and South Africa, and the FBI finally came around. The recent FBI bust in NY and NJ, which was initially reported as a “money laundering” operation, turns out to have also involved some of the organ traffickers reported by Scheper-Hughes. Newsweek has an in-depth story, and Somatosphere explores some of the more anthropological aspects of the story.

Pandemic Anthropology

For those looking for a place to read more about the politics surrounding the swine flu pre-pandemic, Carlo Caduff, Lyle Fearnley, Andrew Lakoff, Stephen Collier and others at “Vital Systems Security” are madly, and intelligently, covering the unfolding events. Several posts in the last few days have addressed the issue of vaccine creation, the WHO, and New York City’s public health surveillance of the disease. I also recommend Nick Shapiro’s posts on Bio-Agent Sentinels and Animal Biosecurity, which preceded the outbreak. All good stuff.

Viagra soup: a photo essay

In an earlier post, I wondered: Why are there a dozen local brands of sildenafil (the generic name for what’s in Viagra) available in Egyptian pharmacies, and only one brand of emergency contraceptive pill (ECP)? I’m not sure that I have a wholly convincing answer to this question, but I’ll lay out some parts of the puzzle. Jump in with a comment if you have other ideas.

Local brands of sildenafil available in Egypt, including: Viagra, Virecta, Erec, Kemagra, Vigorama, Vigoran, Phragra, and Vigorex. Photo by Lisa Wynn

First, Americans might think of erectile dysfunction drugs (EDDs) as somewhat shameful (think about mocking attitudes towards Bob Dole’s decision to do Viagra ads), but they have a more positive connotation in Egypt. Two reasons:

  1. As I’ve written elsewhere, in Egypt these drugs seem to be associated more with the promise of exuberant, excessive sexuality than with a shameful lack of erection. Maybe it would be more accurate to call them erection enhancement drugs rather than erectile dysfunction drugs. Continue reading

Why is there no official EC fatwa in Egypt?

Now, in the last post on the topic, I mentioned the EC website that Princeton runs, http://ec.princeton.edu. There’s an NGO in Cambridge, MA called Ibis Reproductive Health that got a grant to make EC information and educational materials available in Arabic. A significant chunk of that grant was dedicated to creating an Arabic language version of the EC website. At Ibis, Angel Foster led this project, and I took on the job of putting the Arabic text that she created (with translator Aida Rouhana) online.

These days it’s not that hard to do websites in Arabic, but six years ago, it was a real puzzle. I couldn’t find any Arabic language plug-ins for DreamWeaver or FrontPage, so when I cut and pasted the Arabic text into those HTML editors, it wouldn’t display properly, which made it really hard to put links on specific words. The Arabic phrase for emergency contraception, which looks like this in Arabic:

منع الحمل الطارئ

looks like this in HTML code:

&#1605;&#1606;&#1593; &#1575;&#1604;&#1581;&#1605;&#1604; &#1575;&#1604;&#1591;&#1575;&#1585;&#1574;

So I just had to muck around, highlighting different phrases, counting off letters or doing searches for strings of HTML code like that above, putting in links and then seeing where the links showed up in the Arabic texts, and then shifting the links around accordingly. It was a stupidly slow process. There was probably a better way to do it, but I wasn’t able to figure it out, so I slogged through the slow way.
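
For comparison, here is a minimal Python sketch of how the same entity-encoding and link-wrapping could be scripted today. This is illustrative only – it is not the workflow described above, and the URL is a placeholder.

```python
# Minimal sketch (hypothetical, not the tool actually used): convert an Arabic
# phrase to HTML numeric character references and wrap it in a link -- the step
# that had to be done by hand in DreamWeaver/FrontPage at the time.

def to_numeric_entities(text: str) -> str:
    """Encode every non-ASCII character as an HTML numeric reference (&#NNNN;)."""
    return "".join(ch if ord(ch) < 128 else f"&#{ord(ch)};" for ch in text)

def link_phrase(html: str, phrase: str, url: str) -> str:
    """Wrap the entity-encoded form of the phrase in an <a> tag wherever it appears."""
    encoded = to_numeric_entities(phrase)
    return html.replace(encoded, f'<a href="{url}">{encoded}</a>')

if __name__ == "__main__":
    phrase = "منع الحمل الطارئ"  # "emergency contraception" in Arabic
    page = "<p>" + to_numeric_entities(phrase) + "</p>"
    # The URL below is a placeholder, not the real Arabic-language EC site.
    print(link_phrase(page, phrase, "https://example.org/ec-arabic"))
```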

Translation vs adaptation
I’m getting off the topic. Angel had decided that we couldn’t simply translate the existing website into Arabic. It had to be adapted to fit the social and cultural context of the Arabic speaking world and meet users’ needs. So, for example, she decided to include specific questions in the FAQs section on the interpretation and acceptability of EC in Orthodox Christianity and in Islamic jurisprudence. We hunted around for any fatwas on EC, both in published compendia of fatawa and in online databases, but we couldn’t find any. In fact, in the past five years, I have found only one fatwa on EC across the many online fatwa databases.

That’s where my interest in this Egypt research project came from. What did it mean that there were no fatwas on EC? Either it meant that EC wasn’t on anyone’s radar screen and was so totally unknown that nobody was asking about its status in Islam – hard to believe, since there were dedicated products available in several Middle Eastern countries (including Yemen, Egypt, Tunisia, and Lebanon) – OR it meant that EC was just wholly uncontroversial and subsumed under jurisprudential discussions about pre-coital hormonal contraceptives. Continue reading

Why is emergency contraception interesting to think with?

I promised that the next post would be about emergency contraception in Egypt, but I couldn’t resist first writing about EC more generally and describing debates about EC in the U.S.

From rape treatment to mainstream contraception

For more than four decades, medical researchers have known that there are methods you can use after sex to prevent – not terminate – pregnancy. Emergency contraception (EC) was first researched in the 1960s by physician-researchers trying to find a way to prevent pregnancies in survivors of sexual assault. They experimented with giving rape survivors high doses of regular oral contraceptive pills (OCPs). Later it was established that inserting a copper-bearing IUD after sex was even more effective at reducing pregnancy risk.

Remember that this was the pre-Roe v. Wade era, so there were political reasons for seeking a way to prevent pregnancy in women who had been sexually assaulted, rather than expecting to be able to resort to abortion. But of course there are also enduring religious and public health reasons for wanting to find ways to prevent pregnancy rather than end it with abortion.

Gradually, knowledge about this contraceptive technique filtered out to a wider public, and from the 1970s through the 1990s there was an underground movement of women and doctors spreading the word about do-it-yourself emergency contraception: you just take several pills from a regular pack of birth control pills within five days after sex.

(There’s a website run by Princeton University’s Office of Population Research that tells you exactly how many pills to take depending on what brand of Pill you’ve got, and as far as I can tell, this website was actually the first health information website on the Internet.)

Even though this form of contraception has been known for decades, it’s only in the past ten years or so that emergency contraceptive pills (ECPs) have become more widely known and marketed as a contraceptive option for all women, not just rape survivors. There’s been a global movement to introduce “dedicated products” worldwide and to lobby for them to be made available without prescription. (A “dedicated product” is a package of emergency contraceptive pills marketed specifically for that purpose. Activists have long argued that this is an important improvement on the DIY culture of cutting up packets of pills because it increases awareness of EC and lends the method popular legitimacy.)

Continue reading

New Reproductive Health Technologies in Egypt

Thanks to Kerim and Savage Minds for inviting me to contribute. I thought I’d write something about a new research project I’ve recently started on new and emerging reproductive health technologies in Egypt. This project looks at Egyptian interpretations of four technologies: emergency contraception, medication abortion, hymenoplasty, and erectile dysfunction drugs.

Some interesting paradoxes to contemplate:

  • Why are there at least a dozen local brands of sildenafil available in Egyptian pharmacies, and “Viagra sandwiches” or “Viagra soup” on the menu at almost every restaurant that specializes in seafood, while there is only one brand of emergency contraceptive pill in Egypt – sold by an NGO because mainstream pharmaceutical companies don’t consider it commercially viable enough to bother with?

The tap in the bathroom of the apartment where I stay when I’m doing research in Egypt. My roommate and I have often wondered where these came from. Was it a marketing campaign by Pfizer during the era when they weren’t allowed to engage in direct-to-consumer advertising for their product? Or did some sink manufacturer just think it would be cool to put Viagra on the handles?

Continue reading

Mmm… brains (and culture!)

Our friends at Culture Matters have spawned. Leave them alone and you never know what they’ll get up to. In this case, a new blog on “neuroanthropology.” This is the kind of thing I really like to see, for a couple of reasons. One is that it is precisely the kind of place where there is room to move anthropology and biology forward together. As Greg puts it, it allows us to “think much more seriously about how culture might shape development, allowing us to think seriously about a kind of deep enculturation of the brain, senses, endocrine system, and the like. Researchers in fields that specialize in these topics are increasingly aware of the degree to which developmental variables affect developmental outcomes, creating opportunities for anthropological research to influence a host of other fields.” There is room for a new kind of medical and bio-cultural anthropology for people willing to connect — though it does depend on finding the brain scientists willing to meet the cultural scientists halfway, which is no mean feat.

The other thing I like about it is that it is a specialized scholarly blog; that’s something I’d really like to see more of, because it gives me hope for the future of the field to see people openly and enthusiastically sharing ideas, research, new finds and new theories, rather than squirreling them away in the hopes of being first, an honor that seems increasingly less important.

Joy.
http://neuroanthropology.wordpress.com/

New Research on Death Rates of Overweight People

A report published today in the Journal of the American Medical Association and reported on by the NY Times adds weight to my “thin hypothesis” of well over a year ago: in 2004 there were some 100,000 fewer deaths among overweight people than there would have been had they experienced the death rates of “normal”-weight people.

Linking, for the first time, causes of death to specific weights, they report that overweight people have a lower death rate because they are much less likely to die from a grab bag of diseases that includes Alzheimer’s and Parkinson’s, infections and lung disease. And that lower risk is not counteracted by increased risks of dying from any other disease, including cancer, diabetes or heart disease.

As a consequence, the group from the Centers for Disease Control and Prevention and the National Cancer Institute reports, there were more than 100,000 fewer deaths among the overweight in 2004, the most recent year for which data were available, than would have been expected if those people had been of normal weight.

One expert, a Dr. JoAnn Manson from Brigham and Women’s Hospital in Boston, comments critically that “Health extends far beyond mortality rates… [The public needs to look at] the big picture in terms of health outcomes.” However, that’s what Health at Any Size advocates have been saying for years, rather than the simple-minded focus on BMI sorting people into “overweight” and “underweight” categories and automatically assuming these were “unhealthy” — and that the “normals” were “healthier”.
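
For readers who haven’t run the numbers, here is a minimal sketch of what that BMI “sorting” amounts to: the standard formula and the conventional WHO-style cutoffs, added for illustration only – none of this is taken from the JAMA study itself.

```python
# Illustrative only: the standard BMI formula (weight in kg divided by height
# in metres, squared) and the conventional WHO-style cutoffs used to "sort"
# people into categories. Values are not drawn from the JAMA study above.

def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / (height_m ** 2)

def bmi_category(value: float) -> str:
    if value < 18.5:
        return "underweight"
    elif value < 25:
        return "normal"
    elif value < 30:
        return "overweight"
    return "obese"

print(bmi_category(bmi(80, 1.75)))  # 80 kg at 1.75 m -> BMI ~26.1 -> "overweight"
```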

This new report gnaws at the seams of this construction, calling into question the meaning of normalcy and healthiness; although Dr. Manson and her “fat is bad” family are correct that some people experience quality of life issues (another huge construction), many don’t, other than people — including doctors — pointing at them and yelling “fat bad, skinny good, you ugly and lazy and nasty!” Meanwhile, I think most people would rather not die this year, and would consider dying to be a sign of poor health (and something that also has some quality of life issues…).

Hacking Riffs on Rabinow

Ian Hacking has a very nice essay, which you can now download for free here, in the Fall 2006 issue of Daedalus. The essay sketches some recent trends in the new genetics, mostly taking its cue from Rabinow’s coinage of the term ‘biosociality.’

‘Biosocial’ is a new word, but its pedigree, although brief, is the best. Paul Rabinow, the anthropologist of the genome industry, wrote about ‘biosociality’ in 1992. He invented the word partly as a joke, to counter the sociobiology that had been fashionable for some time.

Hacking’s piece is an essay, and something of an exchange (Rabinow has put Hacking’s memorable phrase ‘representation and intervention’ to good use over the years) — so it doesn’t get bogged down in too many details. The main gist is that while sociobiology is out, the social fact of biology is in: reflexive genetic knowledge is more and more shaping the way that people imagine themselves and their relations. He touches on new developments in the science of ‘race,’ developments that my friend Duana Fullwiley calls ‘the molecularization of race.’ And he mentions Beck (‘risk’) and Fukuyama (‘transhumanism’) on the human future. The essay ends with a thought-provoking vignette: Continue reading

J.I. Staley Prize Winner Announced

Charles L. Briggs and Clara Mantini-Briggs are the winners of this year’s J.I. Staley Prize, for their book Stories in the Time of Cholera: Racial Profiling During a Medical Nightmare.

The book recounts the 1992-1993 cholera outbreak that killed some 500 people, mostly indigenous, in eastern Venezuela’s Orinoco River Delta. The disease had been absent from Latin America for nearly a century. Cholera can kill healthy adults in as little as 12 hours and can make a 15-year-old appear geriatric, Briggs and Mantini-Briggs note in the book, but it is easily prevented by the provision of uncontaminated food and water and easily treated.

… The book draws from hundreds of interviews conducted from 1992 to 1999 with people from a cross-section of ages, occupations, social positions and degrees of bilingualism in the delta region and in the Venezuelan capital, Caracas. The authors recorded the stories of medical personnel, journalists, families of those killed by cholera, disease survivors, community leaders and government officials, traditional healers, missionaries, and others.

… In November 2006, [Charles] Briggs won the Edward Sapir Book Prize, the highest award in linguistic anthropology, for co-authoring [with Richard Bauman] Voices of Modernity: Language Ideologies and the Politics of Inequality.

Addressing Publics Positively: Some Developments in HIV Prevention

Images: a “Serosorting” social marketing campaign (left) and an “Enjoy AZT” protest poster (right).

Earlier on Savage Minds, I asked about contemporary shifts in the symbolism and sociality of HIV/AIDS — a global epidemic. The question concerns me as someone who found himself, along with other members of ACT UP in the early-to-mid ’90s, in places like the parking lot of the Astrodome yelling at delegates to the Republican National Convention about funding for healthcare. It concerns me as someone who, in the mid-to-late ’90s, was employed as a professional ethnographer (!) tracking social knowledge related to sexual risk in San Francisco. These days, I am interested in the meaning of HIV and the ways in which that meaning is mediated and manifested specifically through what might be called technologies of public persuasion, whether they are relatively complex, such as social marketing campaigns (on the left above), or fairly simple, such as political protest posters (on the right).

A pointed exchange of sorts in the pages of Anthropology News last fall highlights the role that anthropologists are playing in ongoing efforts to respond to – and shape – the HIV/AIDS epidemic and its meaning today. An initial article (read: puff piece) lauded the research of Ted Green, who has worked closely with the Bush administration on its AIDS strategy. Green has embraced ‘risk elimination’ programs for HIV prevention — especially those that (according to Green) prioritize abstinence and partner reduction over condom use and education. By his own account, this represents a paradigm shift in thinking on HIV prevention:

Green believes that the transformation of his maverick and unorthodox ideas into official US policy has been nothing short of groundbreaking.

The article works hard to place Green gingerly in between ‘fashionable academic anthropology’ and the conservative government he apparently works with quite closely despite being a Democrat (we read of him on a private trip to Africa with the CEOs of major pharmaceutical concerns and top Bush administration officials). Green sees Uganda’s famous ‘ABC’ approach as reflecting an ‘indigenous’ Ugandan response to AIDS, and apparently he emphasizes the need for HIV/AIDS agencies to take into account local perspective(s). His political party affiliation notwithstanding, Green’s research is embraced by the right wing of the political spectrum.

Douglas Feldman and Tom Boellstorff each published sharp letters in response to the AN piece. Continue reading

Identification Overload

This strikes me as a rather silly/heavy photo: Gwyneth Paltrow with face paint embracing the cause of AIDS in Africa. When I first encountered the “I AM AFRICAN” campaign, it was Brazilian supermodel Giselle making the statement. There is a news story about the campaign — designed to promote a charity that will pay for anti-retrovirals in poor countries — here. Yesterday, thumbing through GQ magazine, I saw Sting proclaiming “I AM AFRICAN,” with golden dust scattered across his face. It’s around.

Earlier on SM, I asked what kinds of persons and publics HIV (as a virus laden with meaning) summons. Here’s one: the sympathetic celebrity in ‘cross-cultural,’ possibly ‘cross-racial,’ drag. To me, this campaign echoes Kenneth Cole’s “We All Have AIDS” awareness campaign. Some kind of shift has occurred; we’re not in the ’80s anymore, or even the ’90s. People now identify with the virus, people who may or may not have it. Clearly the goal of these campaigns is to combat the stigma associated with HIV so that people might more readily get tested and seek treatment. These days, as one friend reported to me, people without HIV are even wearing “HIV+” t-shirts at international conferences. Is HIV fashionable? And what configuration of fashion/celebrity/global concern has yielded this image? What has made HIV safe for this sort of identification? What does it mean? A few more questions here: to what extent does this imagery hail a public that already has HIV in it? Is the infected public imagined to be elsewhere or is it imagined to be ‘here’? How are people with HIV addressed? To my mind these are important questions for thinking about HIV prevention, because campaigns and images like this one exist in a field of messages that also includes calls to get tested and to use condoms, among other things.