Sunday, July 31, 2011

Try Hiding Veggies in Kids' Food to Improve Nutrition: Study

From Reuters Health Information

By Genevra Pittman
NEW YORK (Reuters Health) Jul 26 - When preschoolers reject broccoli, zucchini, and everything else green, parents can try hiding the veggies in the spaghetti, researchers suggest in a new report.
They found that youngsters got more of their daily greens when pureed veggies were secretly added to main dishes at breakfast, lunch, and dinner. And the kids didn't seem to notice that anything was different -- or to like the meals any less.
"We think of it as not deception, but recipe improvement," said coauthor Dr. Barbara Rolls from Pennsylvania State University in University Park.
"In this group of kids, we got most of them meeting their daily vegetable requirements -- that's pretty amazing," she told Reuters Health.
The study was done in daycare centers where kids were fed pre-made and measured meals, but Dr. Maureen Spill, lead author on the paper, said parents could easily doctor recipes at home. All they need is a blender.
And the technique could work for the whole family, including older kids, Dr. Rolls said. She previously found that adding pureed vegetables into adults' meals meant they ate more veggies and fewer total calories. Most of them couldn't taste the extra veggies, either. (See Reuters Health story of March 3, 2011.)
For the new study, the researchers fed prepared meals to 40 kids, age 3 to 5, one day a week for three weeks. The meals looked the same on each of the three days -- zucchini bread at breakfast, pasta with tomato sauce at lunch, and a chicken noodle casserole at dinner.
One day's worth of meals was prepared normally, with a typical veggie content in each entree. On the other two days, researchers added pureed cauliflower, broccoli, squash, zucchini and tomatoes to triple or quadruple the dose of vegetables.
After each meal, researchers weighed the food to determine how much kids ate. The preschoolers were also allowed to eat non-doctored side dishes and snacks during the day, including fruit, cheese, and crackers.
Compared to the day when they ate standard meals, tots almost doubled their total vegetable intake on the day when presented with high-veggie dishes.
And more hidden vegetables in the main dish didn't mean the kids ate any less of the fruit and vegetable side dishes.
Kids also ate about 140 fewer calories on days when their meals were most veggie-packed -- which may be a particularly important finding for overweight and obese kids, the researchers wrote online July 20th in the American Journal of Clinical Nutrition.
And the kids seemed to like the doctored recipes as well as the standard ones.
The researchers emphasized that recipe-doctoring should not be the only way to get kids to eat their veggies.
"I would urge parents to try to get vegetables into their kids' meals wherever they can," Dr. Rolls said. "This is an additional strategy that you put on top of exposing kids to real vegetables, eating the vegetables with the kids, (and) being persistent in exposing them" to vegetables.
She hopes more companies will start selling pre-pureed vegetables to make it easier for parents to add them into family meals.
Kraft has taken up a similar strategy by adding a blend of powdered cauliflower to some of its boxed macaroni and cheese, and other companies have promoted veggie-packed tomato sauces.
But the researchers encouraged parents to get involved in making the dishes themselves, and not just for their little kids. "Almost nobody is eating enough vegetables," Dr. Rolls said.

SOURCE: http://bit.ly/oXXIHz
Am J Clin Nutr 2011.

Thursday, July 28, 2011

Third of the World Infected With Hepatitis: WHO

From Reuters Health Information

GENEVA (Reuters) Jul 26 - Around one third of the global population, or 2 billion people, are infected with hepatitis, the World Health Organization said on Tuesday.
"This is a chronic disease across the whole world, but unfortunately there is very little awareness, even among health policy-makers, of its extent," WHO hepatitis specialist Steven Wiersma told a news conference.
The conference marked the first U.N. World Hepatitis Day, called by the world body to raise awareness of the viral disease.
Wiersma said the disease takes a "staggering toll" on health care systems around the globe and has the potential to spark epidemics, as well as being the main cause of liver cirrhosis and cancer.
A new WHO document says hepatitis B is the most common form and can be transmitted from mother to infant at birth or in early childhood, as well as through contaminated injections or injection drug use.
Hepatitis E, transmitted through infected water or food, is a common cause of outbreaks of the disease in developing countries and is increasingly observed in developed economies, according to the WHO.
The WHO says effective vaccines have been developed against hepatitis A and B and could also be used against D. A vaccine for hepatitis E has been developed but is not widely available, while there is none for the C virus.
Vaccination campaigns have scored considerable success in many countries, with about 180 of the WHO's 193 member states now including the hepatitis B vaccine in infant immunization programs, the agency said.
But more needs to be done to prevent or control the disease. It is vital to ensure that people already infected can receive quality care and treatment without delay, the WHO document declared.
 

Wednesday, July 27, 2011

Are Anaesthetics Toxic to the Brain?

From British Journal of Anaesthesia

A.E. Hudson; H.C. Hemmings Jr
Posted: 07/20/2011; Br J Anaesth. 2011;107(1):30-37. © 2011

Abstract and Introduction

Abstract

It has been assumed that anaesthetics have minimal or no persistent effects after emergence from anaesthesia. However, general anaesthetics act on multiple ion channels, receptors, and cell signalling systems in the central nervous system to produce anaesthesia, so it should come as no surprise that they also have non-anaesthetic actions that range from beneficial to detrimental.
Accumulating evidence is forcing the anaesthesia community to question the safety of general anaesthesia at the extremes of age.
Preclinical data suggest that inhaled anaesthetics can have profound and long-lasting effects during key neurodevelopmental periods in neonatal animals by increasing neuronal cell death (apoptosis) and reducing neurogenesis.
Clinical data remain conflicting on the significance of these laboratory data to the paediatric population.
At the opposite extreme of age, elderly patients are at increased risk of postoperative cognitive dysfunction (POCD), a well-documented decline in cognitive function after surgery.
The underlying mechanisms and the contribution of anaesthesia in particular to POCD remain unclear. Laboratory models suggest anaesthetic interactions with neurodegenerative mechanisms, such as those linked to the onset and progression of Alzheimer's disease, but their clinical relevance remains inconclusive. Prospective randomized clinical trials are underway to address the clinical significance of these findings, but there are major challenges in designing, executing, and interpreting such trials.
It is unlikely that definitive clinical studies absolving general anaesthetics of neurotoxicity will become available in the near future, requiring clinicians to use careful judgement when using these profound neurodepressants in vulnerable patients.

Introduction

General anaesthesia is a complex pharmacological response produced by a chemically heterogeneous class of drugs involving mechanisms that remain incompletely understood.
Current concepts define anaesthesia by its core features of amnesia, unconsciousness, and immobility (in the order of decreasing potency), each mediated by pharmacological effects on specific neuronal networks in different regions of the central nervous system. The molecular targets of these region- and dose-specific actions on neuronal network function have not been defined for most anaesthetics, although likely candidates have been identified and characterized.
This diversity of potential targets increases the probability of both positive and negative non-anaesthetic effects.

While the actions of the i.v. anaesthetics can often be ascribed primarily to one or a few targets, the potent inhaled anaesthetics (ethers and alkanes) appear to be particularly promiscuous, interacting with many functionally important targets, both in the nervous system and in other organs.
As an example of the former, the anaesthetic effects of propofol and etomidate are mediated primarily through the potentiation of GABAA receptors, as demonstrated by the resistance to immobility of a knock-in mouse harbouring a mutant receptor engineered to be insensitive to these drugs.
Analogous experiments have not been as conclusive for the inhaled anaesthetics, which are more than 100-fold less potent than i.v. anaesthetics and consequently are less selective in their target interactions.
Nevertheless, i.v. and inhaled anaesthetics share overlapping effects on many targets, including GABAA and NMDA receptors. Actions on these two targets, which are implicated in the desirable effects of anaesthetics, as well as effects on unrelated targets, have come under renewed scrutiny for their potential roles in mediating long-lasting detrimental effects on the developing and mature brain.
A defining feature of general anaesthetics is their ability to reversibly induce a coma-like state, but recent findings of changes in gene and protein expression persisting beyond emergence from anaesthesia provide a molecular basis for more durable effects.
This brief review highlights some of the critical laboratory findings that have called attention to the neurotoxic effects of anaesthetics, and efforts to establish the clinical significance of potential effects of anaesthetics on neurodevelopmental outcome.

Conclusions

Accumulating evidence from animal studies justifies recent concerns regarding the neurotoxic potential of anaesthetic drugs, particularly at the extremes of age.
In the young brain, neurodevelopmental factors predispose to anaesthetic excitotoxicity and effects on neurogenesis and synaptogenesis that can impair neurocognitive performance after early anaesthetic exposure. In the old brain, progressive neurodegenerative disease pathways can be exacerbated by anaesthetics in laboratory studies.
The clinical impact of these preclinical findings has not been established owing to difficulties in designing definitive studies and the significant delay between exposure and testing. Clearly further investigations, both experimental and epidemiological, are warranted to establish the clinical relevance and possible neuroprotective strategies for these untoward effects.
Alternatives to surgery and general anaesthesia are usually not available and pain itself can cause long-term neurodevelopmental deficits.
As current data do not support significant changes in practice other than avoiding purely elective procedures, anaesthesiologists should strive to minimize unnecessary exposure to general anaesthetic agents and other factors that might potentiate toxicity in susceptible patients.

Cell Phones and Cancer: Is There a Connection?

From Medscape Neurology

Bret Stetka, MD; Nora D. Volkow, MD

Editor's Note:
 
On May 31, 2011, the World Health Organization (WHO) announced its classification of radiofrequency electromagnetic fields emitted from cell phones as "possibly carcinogenic," and more recently published the evidence and rationale supporting that conclusion. Medscape recently spoke with National Institute on Drug Abuse Director (and BlackBerry® user) Nora D. Volkow, MD, about the implications of both the WHO statement and her own research showing that cell phone use directly affects brain glucose metabolism.

Cell Phones and Cancer: Introduction

Medscape: Hello Dr. Volkow. What was your reaction to the WHO report concluding that electromagnetic fields from cell phones are "possibly carcinogenic?" Do you believe that the available data support this conclusion?
Dr. Volkow: I think that the report was justified on the basis of results that are inconsistent but which cannot be ignored. It seems prudent in this situation -- in which there are some results [linking cell phone use with malignancy] -- to be cautionary. I think that is why they came up with this recommendation.
It wasn't strong evidence, which the authors of The Lancet paper discuss. However, they couldn't just dismiss and ignore the findings.
Medscape: I found it interesting that the INTERPHONE study showed that in all exposure groups except that with the highest cell phone exposure, there was actually a reduced or equal incidence of glioma compared with those who'd never used a cell phone. What do you make of this finding?
Dr. Volkow: One could interpret this as implying that cell phone exposure at lower levels is actually protecting against glioma, whereas others would say that it means that long-term exposure is required to induce cancer. So you have both sides of the coin. It highlights how important it is to properly address this question -- to do a study that will be able to answer it definitively.
As The Lancet paper discussed, the effects that the researchers are looking for here may not be observable for 20 or 30 years. It could be similar to what was seen with cigarettes and cancer in which several decades of smoking behavior in patients were often necessary to uncover the linkage.
To summarize, there are studies that show [no association between cell phones and cancers] and there are some studies that do show an association.

Time Will Tell

Medscape: Cell phones weren't widely used until the last 10 or 15 years, so you're saying that it might just be too early to tell whether there's an association?
Dr. Volkow: It is clear that even though cell phones have now been out for the past 25 years, the rate of use then was limited to a few people and the amount of use was also limited. It wasn't until much more recently that their use became massively widespread.
In my view, by coming up with a conservative statement, the WHO is saying that we need to be observant and not become too complacent.

Cell Phones as Therapy?

Medscape: Can you give our readers a summary of your recent study linking cell phone use with altered glucose metabolism in the brain?
Dr. Volkow: Yes. We showed a correlation between exposure to electromagnetic radiation from a cell phone (the sound was muted) and increased brain glucose metabolism, which is a marker of brain function, in the areas of the brain closest to the antenna of the cell phone. So there is definitely a physiologic response in the brain to cell phone exposure, which we're attributing to electromagnetic radiation.
Medscape: Your data have shown no specific connection, and you've made no claims about a connection between this increased activity and malignancy. Correct?
Dr. Volkow: No, nothing. I wish! I would love for our data to illuminate this issue, but they don't; no matter how much imagination I use, there is no way of relating our finding to the issue of carcinogenicity. Our data just show that the brain is sensitive to the electromagnetic radiation emitted from cell phones.
Medscape: Your participants did not actually speak or listen on cell phones in order to control for potentially confounding brain activity, but could some of the increased activity have been caused by the brain anticipating and preparing for speech?
Dr. Volkow: Actually, in order to avoid that confounder, each participant had 2 cell phones -- one on each side of the head. Therefore, participants could not know where the signal was coming from. This was because expectations can profoundly affect brain processes. Indeed, that is at the basis of the placebo effect. We are very sensitive to the influence of expectation.
So is cell phone use harmful? We cannot say on the basis of our initial study. We need to look into whether there are long-lasting effects on the brain due to cell phone use and whether these have deleterious consequences. That is a question that remains unresolved.
If there are no negative effects, this could then be a very interesting technology to evaluate as a potential therapy, for example, in cases in which you need to rehabilitate an area of the brain and want to stimulate it.
Medscape: That's a good point because psychiatric and neurologic therapies such as transcranial magnetic stimulation work by applying an electromagnetic field to the brain.
Dr. Volkow: Yes, and this is a very accessible, low-cost technology. For me, the most important thing is to determine whether this type of stimulation is linked with any long-lasting negative consequences, and if it is not, evaluate the potential of this type of electromagnetic stimulation for therapeutic applications.

So What's Next?

Medscape: Do you and your colleagues have any follow-up studies planned?
Dr. Volkow: First, we want to replicate our finding and extend it to determine whether there is any evidence that there may be long-lasting effects. Although the ideal study would be to evaluate participants prospectively, that is very costly and we don't have the resources to do such a study; we will need to address it retrospectively on the basis of previous cell phone exposure.
We will control for exposure on the basis of cell phone records and behavioral questionnaires, but we also want to use different markers in brain imaging beyond just brain glucose metabolism. Among others we want to assess the effects of cell phone exposure on brain functional connectivity.
To do this we will obtain a functional MR map of functional connections before and after cell phone exposure to see whether the areas in which we are observing increases in metabolism are linked to changes in the way that the brain is transmitting information.

Should We Throw Out Our Cell Phones?

Medscape: On the basis of your findings and the WHO report, how do you think clinicians should approach this subject with patients? Should they cut back on cell phone usage -- or better yet, just throw them out?
Dr. Volkow: No, no! I haven't thrown mine out. That would be too deleterious to my life.
An important point that some people have made is that the most dangerous aspect of cell phones is using them while driving or in other inappropriate situations. The other day I found myself texting while walking across the street. I said to myself, "No, I should not be doing this." Regardless of what the data end up showing, this type of behavior is likely the greatest cause of mortality and morbidity related to cell phones.
Therefore, I would explain that although some studies reported an increased risk for malignancy associated with cell phones, others did not, so we just don't know. I think it's important to bring knowledge to people in a way that is comprehensive and accurate and let them make the decisions about how they are going to use the technology. Even if there is an association, it's not the end of the world, because you can still use your cell phone -- just change the way you use it. Don't bring the phone close to your head, and use a headset or the speakerphone option -- anything that keeps the phone away from your head.
However, I would feel confident saying to parents in particular that they should educate their children to avoid using cell phones close to their ears. Because of children's smaller size, the amount of energy deposited is higher than in an adult. The Lancet paper discusses this.
The exposure also has greater effects on the bone marrow of children. Another aspect that they don't discuss but which is relevant is that the brains of children and adolescents are undergoing very fast developmental changes. Their brains are much more neuroplastic and susceptible to changes triggered by environmental stimuli.
Basically, we don't have sufficient knowledge to know how cell phone signals affect brain processes. Why not play it safe? How you modify your behavior depends in part on how you handle uncertainty: I am very bad at dealing with uncertainty in regard to potential effects on my brain, so I choose to be conservative.
Medscape: Has your cell phone usage changed in light of the current evidence?
Dr. Volkow: I haven't decreased the amount of time that I use my cell phone. However, I do try to avoid placing the cell phone to my ear whenever possible. I can't always put it on speakerphone, though, because then everyone would hear my conversations!
 

Antibiotic Better Than Cranberries for UTI Prevention

From Medscape Medical News

Laurie Barclay, MD

July 26, 2011 — Trimethoprim-sulfamethoxazole (TMP-SMX) is more effective than cranberry capsules for prevention of recurrent urinary tract infection (UTI) in premenopausal women, according to the results of a double-blind, double-dummy noninferiority trial reported in the July 25 issue of the Archives of Internal Medicine.
"The increasing prevalence of uropathogens resistant to antimicrobial agents has stimulated interest in cranberries to prevent recurrent ...UTIs," write Mariƫlle A. J. Beerepoot, MD, from the Academic Medical Center in Amsterdam, the Netherlands, and colleagues.
"For premenopausal women with more than 2 UTIs per year, low-dose antibiotic prophylaxis is commonly recommended. However, this may lead to drug resistance not only of the causative microorganisms but also of the indigenous flora."
In this study, 221 premenopausal women with recurrent UTIs were randomly assigned to receive prophylaxis with TMP-SMX, 480 mg once daily, or cranberry capsules, 500 mg twice daily, for 12 months.
The main study outcomes were the mean number of symptomatic UTIs during the 12-month period of prophylaxis, the proportion of women who had 1 or more symptomatic UTIs, the median time to first UTI, and development of antibiotic resistance in indigenous Escherichia coli.
Compared with the TMP-SMX group, the cranberry group had a higher mean number of symptomatic UTIs over the 12 months (4.0 vs 1.8; P = .02) and a higher proportion of patients with at least 1 symptomatic UTI (78.2% vs 71.1%). In the cranberry group, median time to the first symptomatic UTI was 4 months, compared with 8 months in the TMP-SMX group.
TMP-SMX resistance after 1 month was present in 23.7% of fecal and 28.1% of asymptomatic bacteriuria E coli isolates in the cranberry group, compared with 86.3% and 90.5%, respectively, in the TMP-SMX group. Resistance rates for trimethoprim, amoxicillin, and ciprofloxacin in these E coli isolates after 1 month were also increased in the TMP-SMX group. When TMP-SMX was discontinued, resistance returned to baseline levels after 3 months.
In the cranberry group, antibiotic resistance did not increase. Participants tolerated cranberries and TMP-SMX equally well.
"In premenopausal women, TMP-SMX, 480 mg once daily, is more effective than cranberry capsules, 500 mg twice daily, to prevent recurrent UTIs, at the expense of emerging antibiotic resistance," the study authors write.
Limitations of this study include high withdrawal rates, lack of microbiological confirmation of all recurrent UTIs, inability to confirm that all women took the cranberry prophylaxis, and unclear optimal dosage of cranberries.
"From clinical practice and during the recruitment phase of this study, we learned that many women are afraid of contracting drug-resistant bacteria using long-term antibiotic prophylaxis and preferred either no or nonantibiotic prophylaxis," the study authors concluded.
"In those women, cranberry prophylaxis may be a useful alternative despite its lower effectiveness."
An invited commentary by Bill J. Gurley, PhD, from the University of Arkansas for Medical Sciences, Little Rock, notes that the comparison may not have been fair regarding dose and bioavailability of active ingredients.
"To date, few botanical dietary supplements have lived up to their claims as effective 'alternative' medicines, and until more is known about phytochemical disposition in humans, efficacy concerns will continue to plague these products," Dr. Gurley writes.
"Uncertainty regarding mechanisms of action and adequate dosing regimens underscore many of these concerns. It would appear, however, that cranberry has the potential to dispel some of this uncertainty."


Arch Intern Med. 2011;171:1270-1278, 1279-1280.

Monday, July 25, 2011

Unsafe Injection Practices: Outbreaks, Incidents, and Root Causes

From Medscape Education Infectious Diseases

David Pegues, MD; Karen Hoffman, RN, MS; Joseph Perz, DrPH, MA; Robin Stackhouse, MD
Another example of an outbreak resulting from direct syringe reuse occurred in a hospital-based pain clinic in Oklahoma in 2002. The staff prefilled syringes with fentanyl and propofol to treat multiple patients. It seemed like a good idea to them, but they then reused the syringes to inject through heparin locks into patients' intravenous lines, believing that the heparin lock provided a margin of safety. This breach resulted in a large number of infections: 71 cases of HCV infection and 31 cases of HBV infection.

It's not just the number of patients infected, but the hundreds, sometimes tens of thousands, of patients who require notification and who must live with uncertainty while awaiting their test results.
Imagine as a parent being notified that your child was potentially exposed to blood-borne viruses while receiving routine medical care.
It's the nature of hepatitis as well as HIV infections. There is an incubation period with the hepatitis viruses and HIV. If someone was exposed last week, it will take many months to rule out an infection. Healthcare workers go through this when they suffer a needle stick. This weighs heavily on patients who receive letters of notification about potential exposures.

Let's talk about contamination of a shared medication vial that occurs as a result of indirect syringe reuse. Indirect syringe reuse occurs when the same syringe used to administer medication to a patient is used to withdraw additional medications from vials for the same patient.
Those medication vials can become contaminated and serve as a source of infection to any patients who are administered medications from those vials. We also refer to this as "double dipping."

The outbreak in an outpatient endoscopy center in Nevada is a good example. This involved administration of a sedative in an outpatient setting. Anesthesia providers reused syringes to reenter vials of propofol. It's interesting that there was some perception of risk because these providers changed the needles on the syringes. A common misperception is that contamination is limited to the needle, representing a failure to appreciate that the needle and syringe are a contiguous unit and that the syringe can be contaminated in the process of injecting a patient with medication.
It's also important to note that contamination of the needle, and subsequently the syringe, can occur in the absence of obvious blood contamination. Another misperception is that if you can't see blood in the syringe, it doesn't contain a blood-borne pathogen.
Single-dose vials do not routinely contain a preservative or antiseptic, and although they're not intended for use in multiple patients, contamination does occur and contributes to the transmission of bacterial pathogens.

The fundamental take-home points about safe injection practices are:

  • First, it's important for providers to remember that needles and syringes are single-use devices. This isn't new, but it's something that we all need to understand and practice. Needles and syringes should not be used for more than 1 patient or be reused to draw up additional medication. Changing needles does not offer additional protection. If anything, it puts providers at risk.
  • Second, if we limit the sharing of medications, we can better achieve the double layer of protection. To that end, it's important to remind providers that we must not administer medications from a single-dose vial or bag of intravenous solution to multiple patients.
  • Third, we should attempt to limit the use of all shared medications, including vials that are approved and labeled by the US Food and Drug Administration as multidose vials. Ideally, we should use vials in the smallest quantity appropriate for a given clinical application.

CDC Issues Revised Guidelines for Postpartum Contraceptive Use

From Medscape Education Clinical Briefs

News Author: Laurie Barclay, MD
CME Author: Charles P. Vega, MD
07/12/2011;

Study Highlights


  • The CDC convened a group of 13 ad hoc reviewers to evaluate current recommendations for postpartum contraception from the WHO.
  • Ovulation can occur as early as 25 days postpartum among non–breast-feeding women, although fertile ovulation will not usually occur until at least 42 days after delivery.
  • Short interpregnancy interval is associated with a higher risk for low birth weight and preterm birth.
  • Competing with these issues is that the risk for VTE is elevated by 22-fold to 84-fold in women during the first 42 days of the postpartum period vs control participants. This risk is particularly high in the 3 weeks after delivery.
  • Women with other risk factors for VTE, such as advanced maternal age, smoking, or cesarean delivery, carry an even higher risk for postpartum VTE.
  • Nonetheless, there is no direct evidence examining the risk for VTE among postpartum women using combined hormonal contraceptives.
  • The current recommendations strongly discourage the use of combined oral contraceptives during the first 21 days after delivery. The risk for pregnancy is low during this period, but the risk for VTE is significantly elevated.
  • Combined hormonal contraceptives may be initiated between 21 and 42 days postpartum among women without other risk factors for VTE. However, among women with a risk factor, including age 35 years or older, previous VTE, immobility, transfusion after delivery, obesity, smoking, or postpartum hemorrhage, the use of combined hormonal contraceptives should be avoided until at least 42 days postpartum.
  • After 42 days postpartum, there are no restrictions on the use of combined hormonal contraceptives based on postpartum status.
  • Clinicians should also bear in mind that combined hormonal contraceptives can interfere with successful breast-feeding.

Clinical Implications


  • Progestin-only hormonal contraceptives and the IUD may be initiated immediately after delivery, whereas women should not initiate contraception with a diaphragm or cervical cap until 6 weeks postpartum.
  • The current recommendations strongly discourage the initiation of combined hormonal contraceptives in the first 21 days postpartum among all women. Women without any risk factor for VTE may initiate combined hormonal contraceptives between 21 and 42 days postpartum, but women with a risk factor for VTE should wait until more than 42 days postpartum.

Friday, July 22, 2011

Vitamin E and Dementia: An Update

From Medscape Neurology > Viewpoints

Laurie Barclay, MD
Posted: 07/28/2010
Dietary Antioxidants and Long-term Risk of Dementia

Devore EE, Grodstein F, van Rooij FJ, et al
Arch Neurol. 2010;67:819-825

Summary

Higher dietary consumption of vitamins E and C was previously associated with lower risk for dementia and Alzheimer disease (AD), according to 6-year findings from the Rotterdam Study.
The goal of the present population-based, prospective study was to further examine the relationship between intake of major dietary antioxidants and long-term risk of dementia in the same Dutch cohort.
The study sample consisted of 5395 participants at least 55 years old who had no dementia and who provided dietary information at study baseline.
Mean duration of follow-up was 9.6 years.

Of 465 participants in whom dementia developed during follow-up, 365 were diagnosed with AD.
Higher baseline intake of vitamin E was associated with lower long-term risk of dementia (P=.02 for trend), after adjustment for age, education, apolipoprotein E4 genotype, total energy intake, alcohol intake, smoking, body mass index, and use of supplements.
Dementia was 25% less likely to develop in participants in the highest tertile vs the lowest tertile of vitamin E intake (adjusted hazard ratio, 0.75; 95% confidence interval [CI], 0.59-0.95).
After multivariate adjustment, dietary intake of vitamin C, beta carotene, and flavonoids was not linked to dementia risk. Findings were similar for AD risk.

Viewpoint

Compared with previous research, important contributions of this study are population-based estimates of incident dementia risk over a decade, evaluation of food-based antioxidants typically found in a Western-type diet, and assessment of various antioxidants and total vitamin E, including all 8 forms.

Despite study limitations of observational design with possible residual confounding, the findings suggest a modest reduction in long-term risk for dementia and AD associated with increased consumption of vitamin E-rich foods. Although an earlier study suggested a possible inverse association between vitamin C intake and dementia risk, this was not observed in this study.
A possible explanation is that intake of different antioxidants may affect risk at different time points in the course of dementia, supporting the need for additional research to address this question.


Thursday, July 21, 2011

Early ART Reduces the Risk of Acquiring HIV Infection

July 19, 2011 (Rome, Italy) — Early initiation of antiretroviral therapy (ART) reduced rates of sexual transmission of HIV-1 and clinical events in couples in which one partner was HIV-1-positive and the other was HIV-1-negative, according to new findings from the HIV Prevention Trials Network (HPTN) 052 Study Team.
The findings coincide with reports from the TDF2 and Partners PrEP studies suggesting that ART can dramatically reduce transmission of HIV-1 from infected to uninfected partners.
Myron Cohen, MD, from the University of North Carolina at Chapel Hill, presented the findings for the HPTN 052 study here at the 6th International AIDS Society (IAS) Conference on HIV Pathogenesis, Treatment and Prevention.
Dr. Cohen's presentation was met with a rare standing ovation from audience members. The findings were also published online July 18 in The New England Journal of Medicine.

HPTN 052 researchers sought to compare early vs delayed antiretroviral therapy in patients with HIV-1 infection who had CD4 counts between 350 and 550 cells/mm3 and who were in a stable sexual relationship with a partner who was not infected.
A total of 1763 HIV-1 serodiscordant couples at 13 sites in 9 countries were included in the analysis. Just more than half of the participants were from Africa, and 50% of infected partners were men. Patients were randomly assigned to receive therapy either immediately or after CD4 cell counts had dropped to 250 cells/mm3 or less or HIV-1-related symptoms had occurred. Study drugs included lamivudine/zidovudine, efavirenz, atazanavir, nevirapine, tenofovir, lamivudine, zidovudine, didanosine, stavudine, lopinavir/ritonavir, ritonavir, and emtricitabine/tenofovir.

A total of 39 HIV-1 transmissions were observed during the study, indicating an incidence rate of 1.2/100 person-years (95% confidence interval [CI], 0.9 - 1.7). Of the 39 HIV-1 transmissions, 28 were virologically linked to the infected partner, indicating an incidence rate of 0.9/100 person-years (95% CI, 0.6 - 1.3). The remaining 11 infections were not linked to the study partner.
Of the linked transmissions, only 1 occurred in the early-therapy group compared with 27 in the late-therapy group (hazard ratio, 0.04; 95% CI, 0.01 - 0.27; P < .001), suggesting a strong benefit for the early initiation of antiretroviral therapy. In addition, participants receiving early therapy had fewer treatment endpoints (hazard ratio, 0.59; 95% CI, 0.40 - 0.88; P = .01).

"We now know that early initiation of antiretroviral therapy prevents the linked transmission of HIV," Dr. Cohen said in a summary session after his talk. "We know also that early antiretroviral therapy reduced the number of clinical events observed, and this further strengthens the argument for when discussion of therapy should be initiated."
According to Dr. Cohen, this study is definitive proof of the concept that early therapy can help prevent transmission. "We proved that, and we're pleased with that," he said.
Beatrice Grinsztejn, MD, from the Oswaldo Cruz Foundation, Instituto de Pesquisa ClĆ­nica Evandro Chagas, in Rio de Janeiro, Brazil, presented data on the trial's clinical outcomes at the same session, reporting a 41% reduction in HIV-1-related clinical events.
"ART not only prevented transmissions but also benefited the person taking the drugs," Dr. Grinsztejn told Medscape Medical News. According to Dr. Grinsztejn, the incidence rates of tuberculosis were almost double in the delayed-therapy group compared with in the immediate-treatment group (1.9/100 person-years vs 1.0/100 person-years).
She added that ART was well tolerated in this broad population of patients with high CD4 cell counts, with low rates of serious laboratory abnormalities and adverse events. "No apparent difference was observed in the time to death in the delayed arm compared to the immediate arm," she said.
In a related editorial, also published online July 18 in The New England Journal of Medicine, Scott M. Hammer, MD, from the Division of Infectious Diseases, Columbia University Medical Center, New York–Presbyterian Hospital, New York City, notes that drugs to prevent HIV-1 transmission are being investigated in both infected and uninfected persons.
"In HIV-1–negative persons, drugs can be used before or after high-risk exposure (or both). The use of 1% tenofovir topical gel as a microbicide in women and of oral combination therapy with tenofovir and emtricitabine in men who have sex with men has reduced rates of HIV-1 acquisition by 39% and 44%, respectively, findings that have provided strong encouragement for these approaches," he writes.
Last week, results from 2 additional clinical trials of preexposure prophylaxis for HIV prevention among heterosexuals were presented at a press conference held by the US Centers for Disease Control and Prevention (CDC).

Michael Thigpen, MD, of the CDC, principal investigator of the TDF2 study conducted in partnership with the Botswana Ministry of Health, reported that a once-daily tablet containing tenofovir/emtricitabine reduced the risk of acquiring HIV infection by 63% overall among heterosexual men and women. Results from the study were also presented in today's session at IAS.
Also at the CDC press conference, the University of Washington's Partners PrEP study found that 2 separate regimens, tenofovir and tenofovir/emtricitabine, significantly reduced HIV transmission (by 62% and 73%, respectively) among heterosexual couples in which one partner is infected with HIV and the other is not.


6th International AIDS Society (IAS) Conference on HIV Pathogenesis, Treatment and Prevention: Abstracts MOAX0102, MOAX0105. Presented July 18, 2011.
N Engl J Med. Published online July 18, 2011. Article full text, Editorial full text

Family History of Cancer Important in Screening Assessment

From Medscape Medical News

Pauline Anderson

July 19, 2011 — A family history of cancer, particularly breast and colorectal cancer, substantial enough to affect the need for preventive interventions becomes more common with age, and keeping that history up to date is particularly important in early adulthood, a new study suggests.
Cancer-related family history changes up to 3-fold between the ages of 30 and 50 years, according to the authors of the study. They recommend that, for purposes of determining cancer screening, clinicians update a patient's family history every 5 to 10 years.
"If a patient's family history is not updated during early and middle adulthood, the opportunity may be missed to intervene with earlier or more intensive screening that maximizes the likelihood of detecting cancer at an early, treatable stage," write the study authors, led by Argyrios Ziogas, PhD, from the University of California at Irvine.
The study was published in the July 13 issue of the Journal of the American Medical Association.
The aim of the study was to determine how often clinically important changes in cancer family history occur with time that would render a patient at increased risk and therefore a candidate for earlier or intensive screening.
To quantify how often significant changes in family history of breast, colorectal, or prostate cancer occur in adulthood, researchers used data from the Cancer Genetics Network (CGN), a US national registry of people with a personal or family history of cancer. They assessed changes in self-reported family history retrospectively (from birth to enrollment in the CGN) and prospectively (from enrollment to time of last follow-up).
The analysis included 11,129 participants who were included in 1 or more of the retrospective cancer screening specific analyses; those included in the prospective analyses were a subset of this population.
 
Retrospective Analyses
For colorectal cancer, the analysis found that at age 30 years, 2.1% of participants would have met criteria for early colonoscopy screening. By age 50 years, this percentage had increased to 7.1%, and it peaked at approximately 11% at age 70 years. The 10-year rate for newly meeting high-risk screening criteria peaked at 3 additional people per 100 during ages 40 to 49 years.

For breast cancer, the study found a similar pattern, with steady increases in the percentage of participants who would have met criteria for magnetic resonance imaging screening through early and middle age, from 7.2% of women at age 30 years to 11.4% at age 50 years. After age 60 years, the percentage leveled off at approximately 13%. The 10-year rates for newly meeting high-risk screening criteria were fairly even from ages 20 to 50 years, at 3, 2, and 3 additional persons per 100 during each successive decade.
Although prostate cancer analysis had similar findings of increasing family history until age 60 years, the overall percentage of men who would have met criteria for early prostate-specific antigen screening was much lower, at 0.9% of men at age 30 years, a rate that increased to only 2.0% by age 50 years. The 10-year rates for newly meeting high-risk screening criteria were also relatively low, remaining constant at 1 additional person who met high-risk screening criteria per 100 followed up for 10 years for men aged 20 to 70 years.
 
Prospective Analyses
The analysis for colorectal cancer found a rate of 1 additional person becoming eligible for enhanced screening per 100 participants followed up for 10 years. The age-specific results suggest that more family history changes occur during participants' 30s (10-year rate, 2 per 100) than during their 40s (10-year rate, 1 per 100).
Analysis of breast magnetic resonance imaging showed that the overall rate of newly meeting criteria for more intensive screening was 3 additional women per 100 followed up for 10 years, with 10-year rates of 0 per 100 among women aged 35 to 39 years, 4 per 100 for women aged 40 to 49 years, and 3 per 100 for women aged 50 to 59 years.
For prostate cancer, the 10-year rate of newly meeting criteria for more intensive prostate-specific antigen screening was 7 per 100 for men younger than 30 years, 5 per 100 for men aged 30 to 39 years, and 3 per 100 for men aged 40 to 49 years. These results may not be in complete agreement with those of the retrospective analysis because of the limited data available, the study authors note.
 
Study Limitations
A limitation of the study was that it relied on reported changes in family history of cancer with time and did not consider an individual's personal medical history or prior cancer screening results, which may change during adulthood and need to be assessed by clinicians, the study authors state. The study also did not evaluate indications for genetic risk assessment and did not assess whether participants who met criteria for high-risk screening actually had this screening.
In an accompanying editorial, Louise S. Acheson, MD, MS, from Case Western Reserve University School of Medicine, Cleveland, Ohio, said that although a family history of cancer is an integral part of medical history and decision making, this information has not, until recently, been standardized or widely recorded in a structured fashion.
"To optimize the use of clinical time and resources, it is important to know when (at what ages and how often) to update the family history of common cancers," she wrote. "Knowing this, health information systems can be designed to accomplish this task. Furthermore, estimating the age-specific prevalence of increased familial risk is important for planning risk-appropriate cancer prevention services."
Having clinicians and patients participate in reviewing and recording family history will become more feasible as electronic health records compile information from all patient caregivers, writes Dr. Acheson.
 
Cost Considerations
However, Dr. Acheson said, increased levels of screening come with some risks and increased costs from, for example, more false-positive test results and test-associated complications. She pointed out that, based on American Cancer Society guidelines and estimates of the current study, approximately 7.2% of women 30 to 40 years old and 8.9% of those 40 to 50 years old would be candidates for both annual breast magnetic resonance imaging and screening mammograms, which together cost roughly 10 times more than a mammogram alone.
"It's possible that if family cancer risk status was updated from ages 40-50 years, many lower risk women in that age group may forgo mammography screening," added Dr. Acheson. "The benefits, harms, and empirical results of such an approach are ripe for investigation."
 
The CGN is supported by the National Cancer Institute. The study authors and Dr. Acheson have disclosed no relevant financial relationships.
JAMA. 2011;306:172-178, 208-210.

An Overview of Conservative Treatment for Lower Back Pain

From International Journal of Clinical Rheumatology

Federico Balagué; Jean Dudler

Abstract and Introduction

Abstract

This article summarizes the available evidence on the management of patients with subacute or chronic low back pain. The largest part is devoted to nonspecific low back pain but the models of spinal stenosis and disk herniation/sciatica are also specifically addressed. The authors point out the limited evidence available and the importance of a tailored approach for the individual patient. As the effect sizes of most therapies are rather small (close to that of a placebo), patients' preferences and other variables important for individualized management are highlighted. The task for the practitioner is difficult and awareness of this is important. Some speculation regarding potential future ways of improving patient care are presented.

Introduction

Low back pain (LBP) remains the most frequent musculoskeletal complaint worldwide and all age groups are affected by these symptoms. They are classically stratified into acute, subacute and chronic, with respective cut-offs of <6 weeks, 6–12 weeks and >12 weeks.
By itself, LBP produces direct and indirect costs of hundreds of billions of dollars for the US alone. Recent studies in adult and elderly populations have shown a significant increase in LBP, both in numbers and costs, in terms of investigations, treatments and disability, an observation at least partially explained by a rise in prevalence.
However, the large differences in the rate of spinal surgical procedures observed between states within the US, as well as between countries worldwide, suggest that decision-making is certainly influenced by regulations and other sociopolitical factors.
As LBP is extremely prevalent, the main problem remains the chronic cases, in particular in terms of investigations and costs. Acute episodes of LBP statistically have quite a good prognosis, more or less independently of the chosen treatment. A recent review confirms that a variety of treatments for acute LBP are effective and supported by the literature. Moreover, there are excellent updated reviews on the management of acute pain, not limited to but including LBP, which the interested reader can download electronically.
While until recently a figure of 8–10% was usually accepted as the proportion of acute LBP episodes evolving into chronic cases, recent studies have shown much more ominous figures, with frequent relapses and persistence of symptoms at 1 year in up to 10–30% of cases, depending on the definitions used. On the other hand, more than a third of patients with LBP for more than 3 months do recover within 12 months.

Predicting whether an individual patient is going to become chronic, or establishing an individual prognosis based on epidemiological studies, is a very difficult task. Certainly, a precise diagnosis would help. However, it is commonly accepted that a specific identifiable etiology is only found in around 15% of cases, including disk herniations, spinal stenosis, osteoporotic fractures, inflammatory diseases and the infrequent (approximately 1%) specific neoplastic or infectious destructive lesions. The largest part of this manuscript is devoted to the 85% of patients seeking medical attention for chronic LBP without any of those specific identifiable etiologies, the so-called nonspecific (NS) LBP. Furthermore, we included spinal stenosis and lumbar disc herniation in the discussion given their frequency in daily practice.
It has already been shown in adolescent populations that psychosocial factors are stronger predictors of incident LBP than mechanical factors. In adult populations, psychosocial factors are risk factors for chronicity that are much more strongly related to outcome than any clinical or mechanical variables, while previous episodes of pain are strong predictors of future ones. Twin cohort studies have shown that NS-LBP is >40% genetically determined, whilst work, leisure time and physical activities play a minor role.
Since the natural history of acute episodes of LBP is favorable independently of the chosen treatment, our daily concern remains chronic LBP, and we have focused this review on the management of chronic and subacute cases.
Finally, there is an overwhelming amount of literature on the subject, as well as numerous guidelines and recommendations. This short overview is based, for practical reasons, on the latest guideline we were aware of, an English guideline with several major strengths, such as including among its criteria for implementation the likelihood of impact on patients' outcomes and efficient use of NHS resources, supplemented with relevant randomized controlled trials (RCTs) or meta-analyses published more recently.

Management of LBP

All of us, including patients, would prefer to prevent rather than to treat. Primary prevention would ideally prevent the occurrence of LBP, while secondary preventive measures are aimed at preventing the recurrence of acute LBP episodes with their risk of chronification, which is the most relevant problem.

Primary Prevention

During the last few decades it has been shown that adolescents report NS spine pain with a frequency close to that of their adult counterparts. These figures indirectly preclude any major efficacy of primary prevention techniques and suggest that any preventive measures would have to be implemented very early in an individual's life to have any chance of preventing the occurrence of LBP.
Nevertheless, numerous interventions have been tested over the years, and the evidence available for primary prevention of back problems has recently been reviewed.
Only exercise interventions, regardless of the type of exercise, have shown effectiveness using the highest quality criteria, with an effect size (ES) ranging from 0.39 to >0.69 (ES computation: ES = [mean1 - mean2]/[pooled SD]; ES interpretation: <0.15 = negligible effect; 0.15–0.40 = small effect; 0.40–0.75 = medium effect; >0.75 = large effect). Other techniques such as stress management, shoe inserts, back supports, ergonomic/back education and reduced lifting programs have not been found to be effective. A worked illustration of this effect-size arithmetic is sketched below.

Secondary Prevention

As stated before, an acute episode of LBP has an intrinsically good prognosis, more or less independent of the chosen treatment. A variety of treatments are effective for the acute episode, but the question is whether any early intervention in this setting could prevent the ominous chronification and persistence of the problem in a significant percentage of patients. Up to 30% of cases will evolve badly, but no specific validated therapy for the isolated acute phase has been shown to prevent this evolution or to work as secondary prevention of chronic NS-LBP, except the physical exercise previously recommended as primary prevention.

Conservative Treatment of Chronic NS-LBP

As no preventive measure has sufficient power to prevent chronic NS-LBP, we are left with managing the problem when it arises, which can be done with an array of approaches, from conservative therapy to surgical intervention. In 2009, Rainville et al. reported on the evidence for conservative treatments of chronic LBP by looking at the nonsurgical arms of several RCTs comparing surgical and conservative management. Clearly, surgery primarily focuses on altering the structures perceived to be the sources of pain, whilst conservative management aims to improve patients' function, with or without simultaneous improvement of pain.
The poor results obtained in terms of public health are not due to a lack of therapeutic possibilities. In a remarkable paper, Haldeman reported that a nonexhaustive search identified more than 200 treatments for LBP. In fact, while a lack of treatments is not a problem, overtreatment may be a more worrisome one, and opinion leaders have even suggested that clinicians back off.
Finally, the lack of a truly efficient and universal therapy remains a problem. A review of the magnitude of the effect of different treatments in acute and chronic LBP shows that the average effects of treatments for NS-LBP are not much greater than those of placebos. For example, NSAIDs and muscle relaxants reduce the intensity of pain by less than 20 points on a 100-point scale in both acute and chronic LBP patients. The very few therapies that have demonstrated larger effect sizes (>30 on a 0–100 pain scale) have only been evaluated in single small studies and have not been reproduced in any larger cohort.
The overwhelming number of available guidelines and recommendations reflects the difficulty of managing a common problem in the absence of any universal efficient treatment. We chose to highlight and comment on the latest UK guidelines for NS-LBP between 6 weeks and 12 months duration, which have the advantage of not only summarizing the main recommendations for patients in seven headings (reused as subheadings below), but also to clearly define therapeutic modalities that should not be prescribed despite the urge to be proactive in front of a suffering patient.

Information, Education & Patient Preferences

Promoting self-management and encouraging physical activity certainly make sense and should be reasonably cheap. The question is how much resource should be invested in this direction. The limits of education and self-management have recently been highlighted for osteoarthritis, and the same caveat certainly applies to LBP.
Along the same lines, offering booklets and stand-alone formal education programs could appear appealing, as they are widely available. However, there is no scientific evidence of their efficacy, and it seems essential to take into account the person's expectations and preferences before using such programs.

Physical Activity & Exercise

Again, advising people with LBP to stay physically active is likely to be beneficial, and advising them to exercise is appropriate. Nevertheless, advice is only part of the solution. Practitioners are often concerned about which type of exercise program should ideally be prescribed, but most types of exercise will be appropriate, including aerobic activity, movement instruction, muscle strengthening, postural control or stretching. The real trick is actually to motivate the patient to exercise, and a structured group exercise program is the recommended first step. A one-to-one supervised exercise program may be offered if a group program appears unsuitable for a particular person, and is certainly more adequate than leaving patients to exercise on their own, regardless of their good resolutions.
Van Middelkoop et al. have recently summarized the evidence for exercise: "Exercise therapy seems to be effective for the prevention of LBP, but only a few recent trials have been conducted. This therapy is not effective for acute LBP, whereas it is effective for chronic LBP; however, there is no evidence that any type of exercise is clearly more effective than others. Subgroups of patients with LBP might respond differently to various types of exercise therapy, but it is still unclear which patients benefit most from what type of exercise. Adherence to exercise prescription is usually poor, so supervision by a therapist is recommended. If home exercises are prescribed, strategies to improve adherence should be used. Patient's preferences and expectations should be considered when deciding which type of exercise to choose". In other words, one can prescribe exercise therapy without fear of not being a specialist, as there is no clear-cut benefit of one type over another; or rather, we are still unable to specify who will benefit from which exercises. Again, the most important and hardest part is securing the patient's adherence to the program; matching the patient's expectations and preferences should help.

Manual Therapy

The UK guideline proposes to consider offering a course of manual therapy, including spinal manipulation. However, not all patients feel comfortable with this type of approach and, again, the patient's expectations and preferences clearly dictate the use of such therapy.

Other Nonpharmacological Therapies

We can only agree with the guideline's authors in their strong recommendation not to offer any of the multiple therapies with no scientific support, including laser therapy, interferential therapy, therapeutic ultrasound, transcutaneous electrical nerve stimulation (TENS), traction or lumbar supports.

Invasive Procedures

Injections and denervations are other fashionable procedures that have gained wide acceptance in some countries. However, a systematic review of injection therapy and denervation procedures for chronic LBP has recently concluded that the evidence supporting these two categories of therapy over placebo is of "low to very low quality". The authors highlight that it cannot be ruled out that some injection therapies or denervation procedures may benefit carefully selected patients; however, it is equally wrong to push these procedures on the majority of patients, and the British guidelines recommend not offering injections of therapeutic substances into the back for NS-LBP.
Acupuncture is a special case that could be considered for a limited number of sessions, and it is probably more beneficial if it matches the patient's preferences.

Combined Physical & Psychological Treatment Program

Combined physical and psychological treatments (including a cognitive behavioral approach and exercise) have been shown to be effective. However, to demonstrate benefits they must be quite substantial, comprising around 100 h over a maximum of 8 weeks. Availability of such programs, and of the patient, is a limiting factor, but cost remains the main limitation; such programs should be reserved for patients with high disability and/or significant psychological distress who have failed at least one less intensive treatment program.
Group cognitive behavioral treatment has been shown to have a statistically significant effect (over 1 year), at much lower cost, on troublesome subacute and chronic LBP in primary care, with effect sizes ranging from 0.1 for SF-12 mental to 0.5 for SF-12 physical and fear-avoidance beliefs. However, the benefits appear limited and are also clearly dependent on the local availability of such programs.
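For readers less familiar with these units: assuming the quoted effect sizes are standardized mean differences (the usual convention for SF-12 comparisons, though this is not stated explicitly in the article), they correspond to the between-group difference in means divided by the pooled standard deviation, with Cohen's conventional benchmarks of 0.2 for a small, 0.5 for a medium, and 0.8 for a large effect:

$$ d = \frac{\bar{x}_{\text{intervention}} - \bar{x}_{\text{control}}}{SD_{\text{pooled}}} $$

On that reading, the 0.1 effect on SF-12 mental scores falls below even the "small" threshold, while the 0.5 effects on SF-12 physical scores and fear-avoidance beliefs are moderate.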

Pharmacological Therapies

As in all pain-related guidelines, regular paracetamol is the first recommended medication option. However, paracetamol is not free of side effects when taken regularly at the recommended dose. NSAIDs and/or weak opioids are the next step, again despite the fact that their benefits are far from established.
NSAIDs are also far from being free of side effects, particularly in the elderly. It is important to take individual risk into account, in particular gastrointestinal risk, and either a standard NSAID coprescribed with a proton pump inhibitor (PPI) or a COX-2 inhibitor is recommended. Again, the patient's profile, preferences, and expectations should not be forgotten. Aspirin cancels the benefits of COX-2 inhibitors, while more than 25% of patients never start their PPI cotherapy. In other words, we often accept considerable risk for a therapy with limited evidence of efficacy.
If these are ineffective, the recommendations suggest offering tricyclic antidepressants for pain relief. However, these are no more effective than the analgesics discussed above. Selective serotonin reuptake inhibitors (SSRIs) are not usually proposed for treating pain, but a recent RCT of the serotonin-norepinephrine reuptake inhibitor duloxetine in patients with non-neuropathic chronic LBP showed a significant reduction in pain and improved function compared with placebo.
Finally, one can consider offering strong opioids for short-term use to people in severe pain. Referral for specialist assessment may be required for prolonged use of strong opioids, given the risk of opioid dependency and side effects. There is also increasing concern about the utilization of opioids for chronic noncancer pain management. The adverse effects and the utilization of these drugs in rheumatology have recently been reviewed. A recent Cochrane review, including, among others, seven studies in LBP patients, highlights the limits of the tolerability and efficacy of these drugs. While opioids are an alternative, they are not magic pills that will solve the problem of pain management in LBP.
If no treatment is universally and totally effective, it certainly appears rational to combine different interventions, a common practice among some LBP healthcare providers. A recent Cochrane review of combined chiropractic interventions reported that combined interventions slightly improved pain and disability in the short term and pain in the medium term, but only for acute and subacute LBP. No difference was demonstrated for chronic LBP or for studies including mixed LBP populations. Even if combining several treatments improves results, that approach is not always cost-effective, as recently shown by Smeets et al.
There is an urge to be proactive and we often use and abuse unproven therapeutics. However, we should at least base our decisions to continue such treatments on the individual response.

Conclusion

The societal burden of LBP keeps increasing despite, or perhaps because of, the ever increasing number of diagnostic and therapeutic procedures performed for this very common ailment. Happily, the natural history of acute episodes of LBP remains favorable in most cases, independently of the chosen treatment. Subacute and chronic cases represent the real challenge and our daily concern.
We are still unable to adequately identify patients at high risk of developing chronic LBP, nor has any universal measure been shown to be useful for primary or secondary prevention. Furthermore, overtreating patients with NS-LBP is probably more deleterious than beneficial, and we should refrain from overenthusiastically deploying one of the hundreds of treatments described for the management of NS-LBP at the first sign of LBP.
Finally, the risk of potential side effects should also be weighed in the balance, and the individual patient's preferences taken into account, before starting any therapy. We should ensure that we have identified the reasons why the patient is sitting in front of us, bearing in mind that among individuals reporting LBP, "consulters" and "nonconsulters" cannot be distinguished in terms of pain intensity. We still too often misunderstand the motivation and/or expectations of the individual patient, a problem compounded by limited knowledge of the psychological profile, patient preferences, CNS participation, and so on, given the meager time available for a clinical appointment.
There is limited evidence for the majority of treatments in chronic LBP, and effect sizes are usually moderate for the few forms of treatment that are statistically significantly effective. We are also faced with difficulties in interpreting the evidence, as review articles may reach significantly different conclusions based on the same literature, and the difficulties in using evidence in clinical practice have recently been highlighted. However, the individual response cannot always be inferred from the limited evidence available, and patients must still be managed despite the absence of a universally efficient treatment. We apply the same treatments, with their limited evidence and small effect sizes, to all chronic LBP patients. More precise diagnosis and subgrouping of NS-LBP for the purposes of treatment might improve the efficacy of therapies; however, a recent review of the topic concluded: "At this point, the bulk of research evidence in defining subgroups of patients with LBP is in the hypothesis generation stage; no classification system is supported by sufficient evidence to recommend implementation into clinical practice". Spinal stenosis and disk herniation with sciatica are good examples of how our subgrouping is still too vague to be really useful.
We should promote exercise and self-management programs for osteoarthritis and back pain. Despite weak evidence for chronic back pain, exercise programs appear to represent the best way forward, and there is also moderate-quality evidence that post-treatment exercise programs can prevent recurrences of back pain.
While patients' self-management and the maintenance of daily physical activity can, and certainly should, be encouraged in all patients at no risk and no cost, there is a multiplicity of treatments for which the risk/cost–benefit ratio is far less clear.
Even simple measures such as the prescription of analgesics or NSAIDs should be monitored by means of validated tools in order to evaluate the outcome. In the absence of clear and established benefits for any therapy, it is essential that any prescribed treatment be evaluated and monitored at the individual level. Although this is more difficult given the urge to be proactive in front of a suffering patient, it is mandatory, particularly in a time of limited healthcare resources, to refrain from using those therapies and procedures for which a clear lack of benefit has been demonstrated.
It is possible that some of these therapies remain valid for individual patients or well-defined subgroups of LBP. However, so far we have been unable to identify and characterize such subgroups well enough for this to be applicable at the individual level. The concept of personalized and individualized healthcare should not be used to promote inadequate therapies, whose evaluation in such settings should be clearly limited to well-designed trials.
More than anything, we should try to demedicalize LBP and promote self-management as much as possible. Promoting exercise that does not require any contact with healthcare providers, such as walking, may be effective for the treatment of LBP (low-to-moderate evidence in a recent review), and, as recently written by Weiner and Nordin, "a large proportion of patients seeking care can manage their short term and even longer term incapacity".
It has been shown that acceptance of pain is significantly associated with quality of life.[62] We still do not know to what extent this variable can be influenced by healthcare providers, but it is all too easy to lure patients into hoping that a specific diagnosis and miracle treatments are available.

New Infection Prevention Guidance for Outpatient Settings

Melissa Schaefer, MD

From CDC Expert Commentary

Hello, I'm Dr. Melissa Schaefer, medical officer in the Division of Healthcare Quality Promotion at the Centers for Disease Control and Prevention. I'm pleased to speak with you today as part of the CDC Expert Video Commentary Series on Medscape about a new Infection Prevention Guide for Outpatient Settings just released by the CDC.
Over the last several decades, we have focused our attention on preventing infections in acute care settings and have seen tremendous successes. However, as healthcare continues to transition to settings outside the hospital, we need to extend our efforts and successes to all settings where patients are receiving care. CDC's new Infection Prevention Guide for Outpatient Settings distills existing evidence-based recommendations from CDC and the Healthcare Infection Control Practices Advisory Committee. The recommendations included in this guide represent the minimum expectations for safe care, both for patients and healthcare personnel, in every outpatient setting.
As a clinician myself, I have helped investigate numerous outbreaks in outpatient settings and am concerned that these likely represent the tip of the iceberg with respect to bad practices happening in our healthcare system. For example, as part of an outbreak investigation at an endoscopy clinic in Nevada, providers were noted to reuse syringes to enter vials of propofol to obtain additional doses during a procedure. This practice contaminated the contents of the vials and those single-dose vials, which should have been discarded at the end of the procedures, were then used for subsequent patients. This unsafe practice led to transmission of hepatitis C virus to at least 7 patients on 2 separate days and more than 40,000 patients were notified that they may have been exposed to a blood-borne virus. Events like this continue to surface and cause us great concern at CDC. No patient should be subjected to these kinds of risks, which are completely preventable through adherence to standard precautions.
Although the recommendations contained in this guidance are not new, the guide provides a clear, concise, evidence-based resource designed specifically for infection prevention in outpatient settings. This new 16-page document outlines the key policies, procedures, and practices that outpatient settings should have in place in order to deliver safe care.
Recommendations highlighted in the guidance focus on key components of standard precautions, including the following:
  1. Good hand hygiene, including use of alcohol-based hand rubs and hand washing with soap and water, is critical to reduce the risk of spreading infections in outpatient settings.
  2. Safe injection practices must always be followed. Healthcare personnel should use aseptic techniques when preparing and administering medications, and syringes and needles should not be reused either from patient to patient or to reenter medication vials.
  3. Procedures for the safe handling of potentially contaminated medical equipment should be established and followed. Reusable medical equipment should be cleaned and reprocessed appropriately prior to use on another patient, and equipment labeled for "single patient use" should be appropriately discarded after use.
Healthcare should provide no avenue for the transmission of potentially life-threatening infections. As healthcare professionals, we must recognize our responsibility to implement safe care practices. By working together, we can ensure that safe practices are understood and followed by all. We urge you to use this guidance document to assess the practices in your facility to ensure that patients are receiving the safe care that they expect and deserve.
To review the complete Guide to Infection Prevention for Outpatient Settings: Minimum Expectations for Safe Care, follow the link or see the resources on this page.

Thursday, July 14, 2011

New Guidelines Issued for Insomnia and Other Sleep Disorders

From Medscape Medical News
Laurie Barclay, MD

September 2, 2010 — The British Association for Psychopharmacology (BAP) has issued a consensus statement on evidence-based treatment of insomnia, parasomnias, and circadian rhythm disorders. The new recommendations, intended to guide psychiatrists and physicians caring for those with sleep problems, are published online September 2 in the Journal of Psychopharmacology.

"Sleep disorders are common in the general population and even more so in clinical practice, yet are relatively poorly understood by doctors and other health care practitioners," write Sue J. Wilson, from the Psychopharmacology Unit, University of Bristol, Bristol, United Kingdom, and colleagues.
"These ...BAP guidelines are designed to address this problem by providing an accessible yet up-to-date and evidence-based outline of the major issues, especially those relating to reliable diagnosis and appropriate treatment.
We limited ourselves to discussion of sleep problems that are not regarded as being secondary to respiratory problems (e.g. sleep apnoea – see NICE Guidance TA139), as these fall outside the remit of the BAP."

These guidelines also do not cover neuropsychiatric disorders, such as narcolepsy and restless legs, for which recent sets of guidelines already exist. The new recommendations were developed after a consensus meeting in London in May 2009 of BAP members, as well as clinicians, experts, and advocates in sleep disorders, based on literature reviews and a description of standard of evidence.

Recommendations for Diagnosis and Treatment

Specific evidence-based recommendations for diagnosis and treatment of insomnia and other sleep disorders, and their accompanying level of evidence rating, are as follows:

The diagnosis of insomnia is primarily based on complaints provided in the clinical interview by the patient, family, and/or caregiver, ideally corroborated by a patient diary.
Referral to a specialist sleep center may be indicated for other tests in some cases, such as actigraphy for differential diagnosis of circadian rhythm disorder, polysomnography for suspected parasomnia or other primary sleep disorder, or in the case of treatment failure.
Insomnia should be treated because it impairs quality of life and many areas of functioning and is associated with an increased risk for depression, anxiety, and possibly cardiovascular disorders. Treatment goals are to reduce distress and to improve daytime function. Choice of treatment modality is based on the particular pattern of the problem, such as sleep-onset or sleep-maintenance insomnia, as well as on the evidence supporting use of specific treatments.
For chronic insomnia, cognitive behavioral therapy (CBT)-based treatment packages are effective and should be offered to patients as a first-line treatment. CBT, which may include sleep restriction and stimulus control, should be made available in more settings.
When prescribing hypnotic drug treatment, clinicians need to consider efficacy, safety, and duration of action. Other issues to consider may include previous efficacy or adverse effects of the drug and history of substance abuse or dependence.
Recommendations for long-term hypnotic drug treatment are to use it as clinically indicated. To discontinue long-term hypnotic drug therapy, intermittent use should first be attempted if feasible. Depending on ongoing life circumstances and patient consent, discontinuation should be attempted every 3 to 6 months or at regular intervals. During taper of long-term hypnotic drug treatment, CBT improves outcome.
When using antidepressants, clinicians should apply their knowledge of pharmacology. When there is a comorbid mood disorder, antidepressants should be used at a therapeutic dose. However, clinicians should be aware that overdose of tricyclic antidepressants can be toxic even when low-unit doses are prescribed.
Because of frequent adverse effects of antipsychotic drugs, as well as a few reports of abuse, there is no indication for use as first-line treatment of insomnia or other sleep disorders.
Antihistamines have a limited role in psychiatric and primary care practice for the management of insomnia.

Recommendations for Certain Populations

Specific evidence-based recommendations for management of insomnia and other sleep disorders in special populations and conditions are as follows:

After menopause, the incidence of sleep-disordered breathing increases, and the clinical presentation is different in women vs men and often includes insomnia. Informed, individualized treatment of symptoms is needed for use of hormone therapy, considering risks and benefits clarified in recent studies.

Behavioral strategies are recommended for children with disturbed sleep.
In children with attention-deficit/hyperactivity disorder not treated with stimulant drugs, melatonin administration may help advance sleep onset to normal values.
For children and adults with learning disabilities, clinical evaluation should describe the sleep disturbance and the triggering and exacerbating factors.
Recommended first-line therapy includes environmental, behavioral, and educational strategies.
Melatonin is effective in improving sleep. The treatment plan should be based on a capacity/best-interests framework.
For management of circadian rhythm disorders, clinical evaluation is essential in delayed sleep-phase syndrome and free-running disorder.
In delayed sleep-phase syndrome, free-running disorder, and jet lag, melatonin may be useful, but other strategies such as behavioral regimens and scheduled light exposure (in sighted individuals) can also be used.

J Psychopharmacol. Published online September 2, 2010.

FDA Strengthens Warnings on Sleep Drugs

From Medscape Medical News > Medscape Alerts

Yael Waknine
Posted: 03/15/2007

March 15, 2007 — Manufacturers of 13 sedative-hypnotic drugs have been asked to strengthen safety label warnings regarding adverse events associated with their use, the US Food and Drug Administration (FDA) announced yesterday. The drugs are used in the treatment of insomnia.

The request, issued in December 2006, addresses the risk for engaging in activities while somnolent with no memory of having taken a pill, according to an alert sent Wednesday from MedWatch, the FDA's safety information and adverse event reporting program. Requested label changes also emphasize the risks for anaphylaxis and angioedema, which can occur the first time the drug is taken.

The FDA notes that sleep behaviors while taking these drugs can be complex and may include sleep-driving, making phone calls, and preparing/eating meals. Because of potential variations in risk levels, makers have also been advised to conduct clinical investigations to assess event incidence rates for each individual product.

Along with these label changes, the FDA also asked manufacturers to develop patient medication guides for distribution with each prescription. These easy-to-read pamphlets are intended to inform patients of the risks associated with therapy and of the appropriate precautions that should be taken, such as avoiding alcohol and contacting their physician before discontinuing therapy.

Affected products include zolpidem tartrate tablets and extended-release tablets (Ambien and Ambien CR, Sanofi-Aventis); butisol sodium (Medpointe Pharm HLC); pentobarbital/carbromal (Carbitral, Parke-Davis); flurazepam HCl capsules (Dalmane, Valeant Pharm); quazepam tablets (Doral, Questcor Pharms); and triazolam tablets (Halcion, Pfizer).

Also included are eszopiclone tablets (Lunesta, Sepracor); ethchlorvynol capsules (Placidyl, Abbott); estazolam (Prosom, Abbott); temazepam capsules (Restoril, Tyco Healthcare); ramelteon tablets (Rozerem, Takeda); seconal sodium capsules (Ranbaxy); and zaleplon capsules (Sonata, King Pharmaceuticals).

Adverse events potentially related to use of these sedative-hypnotic products should be reported to the manufacturer and to the FDA's MedWatch reporting program by phone at 1-800-FDA-1088, by fax at 1-800-FDA-0178, online at http://www.fda.gov/medwatch, or by mail to 5600 Fishers Lane, Rockville, MD 20852-9787.

Chronic NSAID Use Doubles CV Deaths in Elderly

From Heartwire

Lisa Nainggolan
July 14, 2011 (Gainesville, Florida)

Older patients with hypertension and coronary artery disease who use nonsteroidal anti-inflammatory drugs (NSAIDs) chronically for pain are at significantly increased risk of cardiovascular events, a new post hoc analysis from the International Verapamil-Trandolapril Study (INVEST) demonstrates. The research is published in the July 2011 issue of the American Journal of Medicine.

"We found a significant increase in adverse cardiovascular outcomes, primary driven by an increase in cardiovascular mortality," lead author Dr Anthony A Bavry (University of Florida, Gainesville) told heartwire . "This is not the first study to show there is potential harm with these agents, but I think it further solidifies that concern."

He says the observational study, conducted within the hypertension trial INVEST, is particularly relevant to everyday practice because the patients included were typical of those seen in internal-medicine, geriatric, and cardiology clinics--they were older, with hypertension and clinically stable CAD.

Bavry and colleagues were not able to differentiate between NSAIDs in the study--most people were taking ibuprofen, naproxen, or celecoxib--and he says until further work is done, he considers the risks of NSAIDs "a class effect," and their use should be avoided wherever possible.

However, "Patients should not terminate these medicines on their own," he says. "They should have a discussion with their physician. When I see patients like these taking NSAIDs I will have an informed discussion with them and tell them there is evidence that these agents may be associated with harm. I try to get them to switch to an alternative agent, such as acetaminophen, or if that's not possible I at least try to get them to reduce the dose of NSAID or the frequency of dosing. But ultimately, it's up to them if this potential risk is worth taking depending upon the indication for their use."
 
Chronic NSAID Use More Than Doubles CV Mortality
Within the large cohort of more than 22 000 patients in INVEST, Bavry and colleagues identified patients who reported taking NSAIDs at every follow-up visit and termed them chronic users (n=882).
Most often, patients were taking these agents for conditions such as rheumatoid arthritis, osteoarthritis, and lower back pain, Bavry said.
They compared the chronic NSAID users with those who only intermittently (n=7286) or never (n=14 408) used NSAIDs over an average of 2.7 years and adjusted the findings for potential confounders.

The primary outcome--a composite of all-cause death, nonfatal MI, or nonfatal stroke--occurred at a rate of 4.4 events per 100 patient-years in the chronic-NSAID group vs 3.7 events per 100 patient-years in the nonchronic group (adjusted hazard ratio 1.47; p=0.0003).
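To get a feel for what these rates mean in absolute terms, a rough back-of-the-envelope calculation (illustrative only, and assuming the 2.7-year average follow-up applies to the 882 chronic users) gives:

$$ 882 \times 2.7 \approx 2381 \ \text{patient-years}, \qquad \frac{4.4}{100} \times 2381 \approx 105 \ \text{primary-outcome events} $$

The paper itself reports only the per-100-patient-year rates and the adjusted hazard ratio, so these absolute figures are approximations rather than study results.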

As noted by Bavry, the end point was primarily driven by a more than doubling in the risk of death from CV causes in the chronic-NSAID group compared with never or infrequent users (adjusted HR 2.26; p<0.0001).
The association did not appear to be due to elevated blood pressure, the researchers say, because chronic NSAID users actually had slightly lower on-treatment BP over the follow-up period.

They note that a recent American Geriatrics Society panel on the treatment of chronic pain in the elderly recommends acetaminophen as a first-line agent and suggests that nonselective NSAIDs or COX-2 inhibitors be used only with extreme caution. "Our findings support this recommendation," they state.
Bavry added: "We do need more studies to further characterize the risks of these agents, which are widely used and widely available, and perhaps the risks are underappreciated. We are working on the next level of studies to try to identify which are the most harmful agents."
Bavry has no disclosures. Disclosures for the coauthors are listed in the paper.

Thursday, July 7, 2011

Make No Mistake: Vaccine Administration, Storage, and Handling

CDC Expert Commentary
Andrew T. Kroger, MD, MPH

Vaccine Administration

Vaccine administration is a critical component of a successful immunization program. We label the 7 steps to successful immunization the "rights of medication administration." The word "right" implies "correct" -- the correct steps to ensuring successful administration.
  1. Right #1 - the right patient. Make sure you are vaccinating the right person in the room, and also that screening has been performed to identify which vaccines are needed and which vaccines should be avoided because of medical conditions.
  2. Right #2 - the right vaccine. Check your vials 3 times to make sure you have the correct vaccine in hand.
  3. Right #3 - the right time. Make sure the patient is the appropriate age and is being vaccinated at an appropriate interval from other doses of the same or different vaccines. Vaccines and their diluents might expire as well, so check those dates.
  4. Right #4 - the right dosage. Vaccine dosage is based on the age of the patient, not the weight. Vaccines differ from medications in this respect.
  5. Right #5 - the right route. Whether oral, intranasal, subcutaneous, or intramuscular, this varies by the type of vaccine, and requires the appropriate administration technique. Correct needle length is also essential.
  6. Right #6 - the right site. This is partially dependent on the correct route, and is also related to the age of the patient. Resources are available to assist in the determination of route, site, technique, and needle length. For instance, for vaccines administered by the intramuscular route, a table in the January 2011 General Recommendations on Immunization [1] (Table 10) guides administration by this route according to age, gender, site, and technique.
  7. Right #7 - the right documentation. This is critical to ensure not only that your patient receives the correct number of doses to be adequately protected, but that excessive doses are not provided, which can cause mild local reactions and can waste valuable vaccine.
All staff (permanent and temporary) who administer vaccines should receive competency-based training and education on vaccine administration before administering vaccines to patients. Staff knowledge and skills should be validated with a skills checklist. Furthermore, all staff should receive continuing education when there are new schedules, vaccines, or recommendations.

Vaccine Storage and Handling

Each office should develop and maintain a detailed written storage and handling protocol; assign storage and handling responsibilities to a single person; designate a backup person; and provide training on vaccine storage and handling.
It is also important to prevent storage and handling errors. Maintaining vaccines at the correct temperature is critical to maintaining potency and protection. Vaccines must be stored properly from the time they are manufactured until they are administered to your patients. Vaccines stored at incorrect temperatures can cost thousands of dollars in wasted vaccine and revaccination. The cold chain, which is a temperature-controlled supply chain, begins with the manufacturer and continues with the transfer of vaccine to the distributor; transfer from the distributor to the provider's office; and administration to the patient. Proper storage temperatures must be maintained at every link in the chain. These temperatures are defined in the package inserts for each product and in Table 11 of the General Recommendations on Immunization.

Vaccine storage units must be selected carefully and used properly. Refrigerators without freezers, and stand-alone freezers, are preferred because they are better than combination refrigerator-freezer units at maintaining the required temperatures. Any refrigerator or freezer used for vaccine storage must have its own exterior door and must be able to maintain the required temperature range throughout the year. It must be large enough to hold the year's largest vaccine inventory, and must be dedicated to the storage of biologics.
Proper temperature monitoring is vital to proper cold chain management. Check and record the storage temperatures twice a day -- once in the morning and once before you leave at the end of the workday. However, documentation is not enough. Equally important is taking immediate corrective action when temperatures fall outside the recommended ranges. Remember, any mishandled or incorrectly stored vaccine should not be administered. It is especially important that inactivated vaccine that has been exposed to freezing temperatures not be administered.
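As a purely illustrative sketch (not part of the CDC guidance), the twice-daily check amounts to comparing each logged reading against the acceptable range for the storage unit and flagging any excursion for immediate corrective action. The numeric limits and readings below are placeholders based on commonly cited ranges; in practice the limits come from each product's package insert and Table 11.

# Illustrative sketch only -- not CDC-issued software. Flags readings outside an
# acceptable range so staff can set aside the exposed vaccine and seek advice.
# The limits below are placeholders; actual ranges come from each product's
# package insert and Table 11 of the General Recommendations on Immunization.

ACCEPTABLE_RANGES_F = {
    "refrigerator": (36.0, 46.0),  # commonly cited refrigerator range (deg F)
    "freezer": (-58.0, 5.0),       # commonly cited freezer range (deg F)
}

def check_reading(unit: str, temp_f: float) -> str:
    """Return 'OK' or a warning prompting immediate corrective action."""
    low, high = ACCEPTABLE_RANGES_F[unit]
    if low <= temp_f <= high:
        return "OK"
    return (f"EXCURSION in {unit}: {temp_f} F is outside {low} to {high} F. "
            "Set the exposed vaccine aside, label it 'do not use', and contact "
            "the manufacturer or your immunization program.")

# Hypothetical twice-daily log: (storage unit, reading in deg F)
log = [("refrigerator", 38.5), ("refrigerator", 47.2), ("freezer", -4.0)]
for unit, temp in log:
    print(unit, temp, "->", check_reading(unit, temp))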

If you discover that your refrigerated vaccine has been exposed to freezing temperatures -- even if the vaccines do not appear to have been frozen -- you should remove and identify the exposed vaccine so it will not be used. Then contact the manufacturer or your state or local immunization program for advice. You should do the same thing if your freezer temperature rises above 5°F during other than the normal defrost cycle.