From News of the Weird: Wrong-sided surgery


Admittedly, this is not where I usually look for information on medical quality and safety measures, but this case, as presented in this week's News of the Weird, deserves mention:

Neurosurgeon Denise Crute left Colorado in 2005 after admitting to four serious mistakes (including wrong-side surgeries on patients’ brain and spine) and left Illinois several years after that, when the state medical board concluded that she made three more serious mistakes (including another wrong-side spine surgery).

Nonetheless, she was not formally “disciplined” by either state in that she was permitted merely to “surrender” her licenses, which the profession does not regard as “discipline.” In November, Denver’s KMGH-TV reported that Dr. Crute had landed a job at the prestigious Mount Sinai Medical Center in New York, where she treats post-surgery patients (and she informed Illinois officials recently that she is fully licensed in New York to resume performing neurosurgery). [KMGH-TV, 11-4-2012]

This is an excellent example of the importance of the “time-out,” which includes surgical site verification among all members of the surgical team.  It also shows some of the limitations of relying on the health care professions to police themselves.  Does this mean that I can absolutely guarantee that this won’t happen in any of the operating rooms I’ve observed?  No, but it does mean that I can observe and report any irregularities witnessed (or deviations from accepted protocols), such as correct-side verification or failure of the operating surgeon to review medical records and radiographs prior to surgery.

It also goes to show that lengthy credentialing processes and the reputations of some of the United States’ finest institutions are still no guarantee of quality or even competence.

What about Leapfrog?

This comes at the same time as the release of the highly controversial Leapfrog grades, in which medical giants like UCLA and the Cleveland Clinic received failing marks.  (UCLA received an “F” for avoidable patient harm, and the Cleveland Clinic received a “D.”)

Notably, the accuracy of the Leapfrog scoring system has been under fire since its inception, particularly since the organization charges hospitals for the right to promote their scores.

But then, as the linked article points out, so do most of the organizations claiming to have the goods on these facilities, such as U.S. News & World Report and its famed hospital edition.

Guess there aren’t very many people like me who feel that’s a bit of a conflict of interest.

Fundación Santa Fe de Bogotá ranks among the best in Latin America



In the most recent AméricaEconomía rankings, Fundación Santa Fe de Bogotá ranked second in the category of “Capital Humano,” coming in just behind Clínica Alemana in Santiago, Chile.  Fundación Santa Fe ranked #4 overall.

Capital Humano

This category ranks and measures the education, training and research among the staff of each facility, as well as ongoing improvement projects and educational offerings.

Of course, it’s no surprise to readers of Hidden Gem that the surgeons at Santa Fe de Bogotá excel academically.

Now, while we give kudos to Santa Fe de Bogotá, as well as Hospital Israelita Albert Einstein (Brazil) and Clínica Alemana, for their outstanding rankings, we remind readers that rankings aren’t always all they’re cracked up to be.

AméricaEconomía, “The best hospitals and clinics in Latin America.”

Hospital ranks and measures: Medical Tourism edition?


It looks like Consumer Reports is the newest group to add its two cents’ worth about hospital safety and hospital safety ratings.  The magazine has compiled its own listing and ratings for over 1,100 American hospitals.  Surprisingly, just 158 received sixty or more points (out of a possible 100).  This comes on the heels of the most recent release of the Leapfrog results.  (Leapfrog is controversial within American healthcare due to the unequal weight it gives to many of its criteria.  For example, it is heavily weighted in favor of very large institutions versus small facilities with similar outcomes.)

Consumer Reports has a history of providing consumers with independent evaluations and critiques of market products, from cars to toasters, since its inception in the 1930s.  Its entry into healthcare is welcome, as the USA embraces new challenges with ObamaCare, mandated EMRs, and pay-for-performance.

While there is no perfect system, it remains critical to measure outcomes and performance on both an individual (physician) and a facility-wide scale.  That’s why I say: the more scales, scoring systems and measures used to evaluate these issues, the better chance we have of accurately capturing this information.

But with all of the increased scrutiny of American hospitals, can further investigation into the practices and safety of facilities promoting medical tourism overseas be far behind?

Now it looks like James Goldberg, a bioengineer we’ve talked about before, is going to be doing just that.  Mr. Goldberg, who is also an author on the topic of medical tourism safety, recently announced that his firm will begin offering consulting services to consumers interested in knowing more about medical tourism, and in making educated decisions to find the most qualified doctors and hospitals when traveling for care.  He may be one of the first to address this in the medical tourism industry, but you can bet he won’t be the last.

If so, the winners in the international edition will be the providers and facilities that embrace transparency and accountability from the very beginning.

Reputation, Rankings and Objective Measures



The top-10 heart and heart surgery hospitals (according to US News 2011) were as follows:

  1. Cleveland Clinic
  2. Mayo Clinic
  3. Johns Hopkins
  4. Texas Heart Institute at St Luke’s Episcopal
  5. Massachusetts General
  6. New York-Presbyterian University Hospital
  7. Duke University Medical Center
  8. Brigham and Women’s Hospital
  9. Ronald Reagan UCLA Medical Center
  10. Hospital of the University of Pennsylvania

(US News, July 19, 2011)

The First shall be First…

Well, the latest US News hospital rankings are out, and as usual, Johns Hopkins is at the top of the list, as it has been for the last seventeen years.  Or is it at the top of the list because it was ranked #1 for the previous sixteen years?

How much do these or any rankings actually reflect the reality of the health care provided?  What are they really measuring?  These are important questions to consider.  While US News uses these rankings to sell magazines, other people are using these results to plan their medical care.

So, what do these rankings or studies show[1]?  The answer depends on two things:

1. Who you ask.
2. The measure(s) used.

Reed Miller, over at Heartwire.com, reported the results of a 2010 study by Dr. Ashwini Sehgal of Case Western Reserve examining the US News rankings (re-posted below).  Dr. Sehgal explains that much of what US News is measuring is not scientific or objective data; it’s public opinion, which, as we all know, may have little basis in actual fact.  Ask any fifteen-year-old girl who is the most qualified candidate for president, and now imagine Justin Bieber in the White House[2].  An extreme example, to be sure, but one that fully illustrates the pitfalls of relying on this sort of subjective data.

News versus Tabloid

This isn’t the first time that the magazine has come under scrutiny for the methodology of its ‘ranking’ practices.  Teasley (1996) exposed similar flaws in its ranking schemes almost fifteen years ago.  Green, Wintfeld, Krasner & Wells (1997) explained in JAMA that there were additional limitations to the US News approach due to a lack of standardized data, despite the magazine using what they considered to be a strong conceptual design.  They cite the same concern, the weight given to reputation, as a major deficiency.

However, these significant oversights do not prevent the media and hospitals from continuing to present these results as a legitimate measure of performance.  In fact, more people know about these rankings than about government data collected for the same purpose.

Core Measures

Compare this well-known ranking with governmental attempts to quantify and compare American hospitals.  Medicare and the Department of Health and Human Services quantify and rank hospital performance using a ‘score card’ system known as Hospital Compare.

While this government system is far from perfect, since it relies heavily on individual physician documentation, it is an evidence-based measurement tool, making it far more objective.  The government rating system uses a series of specific criteria called Core Measures.  These core measures are used to evaluate adherence to accepted treatment strategies for different conditions such as heart failure, heart attack, and pneumonia.  This data is then published online for consumers.

The advantage of measurement tools such as Core Measures is that they form an easily applied, checklist-type scoring system.

For example, the core measures used to evaluate the appropriateness of treatment for an acute myocardial infarction (heart attack) are pretty clear-cut:

– Amount of time (in minutes) for the patient to receive either cardiac cath or thrombolytic (“clot buster”) drugs

– How long (in minutes) for the patient to receive the first EKG after presenting with complaints consistent with AMI

– Did patient receive aspirin on arrival?

– Did patient receive ACE/ARB for LV dysfunction?

– Did patient receive scripts for beta blockers, ACE/ARB, and aspirin at discharge?

As you can see, all of these measurements are clear, easily defined and objective in nature.  The main problem with core measures in many institutions is getting doctors to clearly document whether or not they instituted these measures.  (But that too reflects on the institution, so hospitals with multiple staff members not adhering to the national guidelines will have lower scores than other facilities.)  In fact, this is the main criticism of this measurement tool, and it often comes from the very doctors who omit this data.  (In recent years, hospitals have tried to address this shortcoming by making documentation an easier, more streamlined process, and by allowing other members of the health care team to participate in this documentation.)  A minimal sketch of how this kind of checklist scoring works appears below.
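To make the checklist idea concrete, here is a minimal sketch in Python of how a core-measure adherence score could be computed for a single patient record.  The field names, the 90-minute door-to-intervention window and the 10-minute door-to-EKG window are illustrative assumptions for this sketch, not the official CMS specifications.

```python
from dataclasses import dataclass

# Hypothetical record of the AMI core measures listed above.
# Field names and time thresholds are illustrative assumptions,
# not the official CMS measure definitions.
@dataclass
class AMIRecord:
    minutes_to_cath_or_lytics: int    # door-to-cath/thrombolytic time
    minutes_to_first_ekg: int         # door-to-first-EKG time
    aspirin_on_arrival: bool
    ace_arb_for_lv_dysfunction: bool  # or documented contraindication
    discharge_scripts_complete: bool  # beta blocker, ACE/ARB, aspirin

def adherence_score(r: AMIRecord) -> float:
    """Return the fraction of core measures met (0.0 to 1.0)."""
    checks = [
        r.minutes_to_cath_or_lytics <= 90,  # assumed 90-minute window
        r.minutes_to_first_ekg <= 10,       # assumed 10-minute window
        r.aspirin_on_arrival,
        r.ace_arb_for_lv_dysfunction,
        r.discharge_scripts_complete,
    ]
    return sum(checks) / len(checks)

patient = AMIRecord(75, 8, True, True, False)
print(f"Core measure adherence: {adherence_score(patient):.0%}")  # 80%
```

The point of the sketch is how mechanical the scoring is: each item is a yes/no check, so the score is objective wherever the underlying documentation exists.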

This data is then compared to other hospitals nationwide, with subsequent percentile ratings and status; i.e., a hospital may rank higher or lower than the national average for death rate or re-admission for heart attack, pneumonia, post-surgical infection or several other diagnoses/conditions.  Consumers can also use this database to compare different facilities to each other (such as several hospitals in a local area).  A rough sketch of this comparison step follows.
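As an illustration of that comparison step, here is a small sketch with invented numbers and a deliberately simplified decision rule; the actual CMS methodology uses risk-adjusted interval estimates rather than this rule of thumb.

```python
import statistics

# Invented national 30-day readmission rates (percent) for demonstration.
national_rates = [18.2, 19.5, 20.1, 21.0, 21.7, 22.3, 23.8]
hospital_rate = 19.0

mean = statistics.mean(national_rates)
stdev = statistics.stdev(national_rates)

# Assumed rule of thumb: flag hospitals more than one standard
# deviation from the national mean in either direction.
if hospital_rate < mean - stdev:
    status = "better than national average"
elif hospital_rate > mean + stdev:
    status = "worse than national average"
else:
    status = "no different from national average"

print(f"{hospital_rate}% vs national mean {mean:.1f}%: {status}")
```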

The accessibility and publication of this data for health care consumers is a very real and meaningful public service.  This allows people to make more informed choices about their care, without relying on third-party anecdotes, or reputation alone.

How does this tie in with surgical tourism?  (Or what does this have to do with Bogotá Surgery?)

As part of my efforts to provide objective, unbiased information on the institutions, physicians and surgical procedures in Bogotá, Colombia, I applied the Core Measures criteria as part of my evaluation.  I used these measures not on an institutional level, but on an individual provider level – to each and every surgeon that participated in this project.

However, core measures (NSQIP) were not the only tool I used during my assessment.  I also used several other measurements to get a fair, well-balanced evaluation of the providers listed in my publication.  (Other criteria used as part of this process will be discussed more fully in a future post.)

Surgical tourism information needs to be clear, objective and meaningful to be of use to potential consumers.  Reputation alone is not sufficient when considering medical treatment either in the United States or abroad – and consumers should seek out this information to help safeguard their health.

Article Re-post from Heartwire.com

Popular best-hospital list tracks subjective reputation, but not quality measures

April 20, 2010 | Reed Miller

Cleveland, OH – US News & World Report‘s list of the top 50 hospitals in the US reflects the subjective reputations of the institutions and not objective measures of hospital quality, according to a new analysis [1].

The magazine’s ranking methodology includes results of a survey of 250 board-certified physicians from across the country, plus various objective data such as availability of specific medical technology, whether the hospital is a teaching institution or not, nurse-to-patient ratios, risk-adjusted mortality index based on Medicare claims, and whether the American Nurses Credentialing Center has designated the center as a nurse magnet.

In his analysis of the US News rankings system, published April 19, 2010 in the Annals of Internal Medicine, Dr Ashwini Sehgal (Case Western Reserve University, Cleveland, OH) points out that previous investigations have compared the US News rankings with external measures and found that highly ranked cardiology hospitals had lower adjusted 30-day mortality among elderly patients with acute MI, but that many of the high-ranked centers scored poorly in providing evidence-based care for patients with MI and heart failure. Also, performance on Medicare’s core measures of MI, congestive heart failure, and community-acquired pneumonia were frequently at odds with US News rankings.

Sehgal sought to examine a broader range of measures internal to the US News system and “found little relationship between rankings and objective quality measures for most specialties.”  He concludes that “users should understand that the relative standings of US News & World Report‘s top 50 hospitals largely indicate national reputation, not objective measures of hospital quality.”

Sehgal performed multiple complementary statistical analyses of the US News & World Report 2009 rankings of the top 50 hospitals in the US, as well as the distribution of reputation scores among 100 randomly selected unranked hospitals.

He examined the association between reputation score and the total score and the connection of objective measures to reputation score. According to Sehgal’s analysis, the statistical association is strong between the total US News score and the reputation score. The association between the total US News score and total objective scores is variable, and there is minimal connection between the reputation score and objective scores.

The majority of rankings based on reputation score alone agreed with US News overall rankings. The top five heart and heart-surgery hospitals based on reputation score alone were the same as those of the US News top five heart hospitals (Cleveland Clinic, Mayo Clinic—Rochester, Johns Hopkins University, Massachusetts General Hospital, and the Texas Heart Institute), and 80% of the 20 heart and heart-surgery hospitals with the best reputation scores were also on the US News top-20 heart and heart-surgery centers.

Objective measures were relatively more influential on cardiology centers’ total scores than in some other categories, but reputation still carried a lot more weight than objective measures. Sehgal used the nonparametric Spearman rank correlation (ρ) value to assess the univariate associations among reputation score, total objective-measures score, and total US News score. The ρ² value indicates the proportion of variation in ranks of one score that is accounted for by the other score.
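To illustrate the kind of statistic the repost describes, here is a toy sketch using scipy’s spearmanr.  The scores below are invented for demonstration; they are not Sehgal’s data.

```python
from scipy.stats import spearmanr

# Invented scores for ten hypothetical hospitals.
reputation  = [9.1, 8.7, 8.2, 7.9, 7.5, 6.8, 6.1, 5.4, 4.9, 4.2]
objective   = [6.5, 7.2, 5.9, 7.8, 6.1, 7.0, 5.5, 6.8, 5.2, 6.0]
total_score = [8.8, 8.5, 7.9, 8.0, 7.2, 6.9, 5.8, 5.6, 4.7, 4.4]

# spearmanr returns (correlation, p-value); we only need rho here.
rho_rep, _ = spearmanr(reputation, total_score)
rho_obj, _ = spearmanr(objective, total_score)

# rho**2 is the proportion of variation in one set of ranks accounted
# for by the other, mirroring the rho-squared statistic cited above.
print(f"reputation vs total: rho={rho_rep:.2f}, rho^2={rho_rep**2:.2f}")
print(f"objective  vs total: rho={rho_obj:.2f}, rho^2={rho_obj**2:.2f}")
```

With data shaped like Sehgal’s findings, the first correlation comes out near 1 and the second much lower, which is exactly the pattern that led him to conclude the rankings largely track reputation.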

Additional Resources and References

1. Teasley, C. E. III (1996). Where’s the best medicine? The hospital rating game. Eval Rev, 20(5), 568-79.

2. Green, J., Wintfeld, N., Krasner, M., & Wells, C. (1997). In search of America’s best hospitals: The promise and reality of quality assessment. JAMA, 277(14), 1152-5.

3. Sehgal, A. R. (2010). The role of reputation in U.S. News & World Report’s rankings of the top 50 American hospitals. Ann Intern Med, 152(8), 521-5.


[1] US News may be the best known and most widely published source, but there are multiple studies and reports attempting to rank facilities and services nationwide.

[2] This is probably not a fair analysis given the current state of American politics.