EDITORIAL  
Year: 2019 | Volume: 53 | Issue: 2 | Page: 221-223
Evidence-based medicine: Hype or reality?


Mandeep S Dhillon
Department of Orthopaedics, PGIMER, Chandigarh, India


Date of Web Publication: 22-Feb-2019
 

How to cite this article:
Dhillon MS. Evidence-based medicine: Hype or reality? Indian J Orthop 2019;53:221-3

How to cite this URL:
Dhillon MS. Evidence-based medicine: Hype or reality? Indian J Orthop [serial online] 2019 [cited 2019 May 23];53:221-3. Available from: http://www.ijoonline.com/text.asp?2019/53/2/221/252678
By the last decade of the 20th century, medical scientists were looking for "tangible evidence" to supplement the protocols used to treat diseases or to justify surgical interventions. This was demanded by physicians, by the public, and of course by the regulators. In 1992, the Evidence-Based Medicine Working Group introduced the term evidence-based medicine (EBM) as a new approach to teaching the practice of medicine.[1] Sackett et al. in 1996 defined what EBM actually was and also clarified what it was not.[2] Unfortunately, as is often the fate of all that is new, only some of the good aspects of the concept were incorporated into practice, while a limited understanding of the overall scenario led to considerable misuse and disillusionment.

Twenty years on, a 2014 editorial in the British Medical Journal carried a very catchy title: "Evidence based medicine is broken." Spence,[3] from Glasgow, vehemently argues that current EBM has been corrupted by the major influence that drug manufacturers and other vested interests exert on modern-day publications. He quotes another article, by Moynihan et al.,[4] which states that the focus on using information based on so-called evidence is "fueling overdiagnosis and overtreatment."

Is this a case of crying wolf, or is there more to this than meets the eye? Current trends in modern medicine demand that doctors base treatment on good evidence; more than that, patients, and often regulators, demand that most diagnostic or therapeutic interventions be supported by hard evidence. This often becomes a problem: the clinical experience and judgment of many "knowledgeable/experienced practitioners" are pushed into the background, and the facilities available at many centers in the underdeveloped world may not correspond to those at Western centers, where most of the data behind current EBM originate. In addition, current EBM practice ignores patient preferences and local economics, which differ from area to area, and even today, factors other than pure "evidence" often come into play when deciding treatment for individual patients.

So is EBM actually a broken science, or are doctors simply not understanding the issues involved? Well, it may be a bit of both. Even at its advent, EBM was clearly defined as "the conscientious, explicit, and judicious use of current best evidence in making decisions about patient care."[2] This meant that physicians had to integrate the best available clinical evidence, stemming from good systematic research, add their individual clinical expertise, and then apply this to the appropriate patient. And that is exactly where the failure lies. The unmanageable volume of published research overwhelmed the users of EBM, especially the so-called "clinical guidelines" developed in the economically advanced countries. Emphasis on strict adherence to these guidelines overlooked physician experience and local practice preferences that had worked for years. In addition, there was a problem on the doctors' side in evaluating the published research; many were not knowledgeable enough to discern good evidence from bad, and debates sometimes arose about whether a concept should be put into practice simply because it was published in a good journal, without understanding whether the evidence dispensed was actually sound. A major issue pointed out by Glasziou et al.[5] and by Godin et al.[6] is this lack of ability to assess the quality of the evidence presented, as well as the research that led to its publication. The evidence pyramid[7] came into being, and systematic reviews and randomized controlled trials have now become the cornerstone for assessing the evidence applicable to particular situations. However, it is important to understand that different types of research may be needed to answer different clinical questions; for this, doctors need efficient search strategies to identify the relevant evidence, and its reliability also has to be understood.

Two other things have also happened over the last 25 years: some vested interests were actually driving research that yielded statistically significant evidence in their favor, and major allegations were raised against drug and implant manufacturers. It also came to light that certain data with negative findings were simply not published. In addition, there was suspicion that even the available published data might not be significant enough to justify changes in practice.

This has led many surgeons to suspect the evidence itself, and EBM has been labeled "hype" by the nonbelievers. Nevertheless, this may be only one extreme position or thought process; it is important for a practitioner to understand that all aspects of a disease or problem may not be addressed by the currently available evidence-based publications, and many lacunae exist. This is where "experience-based medicine" steps into the fray and guides the surgeon. Other issues that are not factored in include the fact that patients with multiple comorbidities may respond differently to some procedures; in addition, economic issues and local circumstances may affect the application of the "best" evidence-based treatment. More importantly, EBM does not account for local patient preferences and physician experience, which differ in different parts of the world.

It is also important to note that current publications propagating EBM originate in the West, where treatment protocols and available modalities are very different from those of underdeveloped countries. Processes and procedures that work well in countries where patient care is subsidized and immediately accessible (such as primary tibial nailing in grade 3 open fractures) may not be appropriate in settings with less than optimal conditions, where the same cases present late. A Canadian study in JBJS[8] presented evidence in favor of fixing clavicle fractures; an experienced surgeon (perhaps in India) may follow this evidence and fix a clavicle fracture in a young adult with associated injuries that demand stability, yet choose not to fix a similar fracture in an elderly woman who has comorbidities and cannot afford the implant. Does this mean he has deviated from EBM practice? In fact, he is using the EBM data where they apply and using his experience to set them aside where they do not. Practitioners also need to understand that the so-called statistical benefits of an intervention proven to work in the Western world may be minimal in clinical practice in the rest of the world.

So where does this leave the average surgeon? In a recently published study, EBM was welcomed by Dutch orthopedic surgeons;[9] no such data are available from the underdeveloped world. The emphasis on EBM is now reflected in increased awareness of EBM publications, better definitions of levels of evidence,[7] and greater knowledge of Cochrane reviews. However, this may be a limitation in underdeveloped countries, where surgeons are neither educated in these concepts nor do they all have access to the relevant literature. Nevertheless, younger orthopedic surgeons worldwide now have a better knowledge of EBM and are better equipped to evaluate the data.

To answer the question of "hype or reality," I offer a simplistic viewpoint. It is a fact that, when utilized properly after appropriate evaluation, the application of good-quality evidence significantly influences treatment. However, the other side of the coin also has to be looked at. The group that screams "EBM is reality" has not understood the limitations of current EBM; even today, good evidence is not available for many focused topics, and many surgeons are unaware of methods of evidence evaluation. Appropriate multicenter research is not universally available for perusal in all parts of the world. Furthermore, such research has to originate from various centers, in different strata of society and from varying population groups, in patient subsets large enough to be relevant. This may change the concept of EBM to "evidence farming," wherein the experience of the surgeon is also factored in and different patient scenarios may be covered,[10] the so-called "patient-centered approach." The group that screams "hype" has also not fully understood the concept behind the science; we must realize that we cannot have guidelines for everything, and looking for answers in publications rather than using our experience and judgment may actually be a disservice to our patients. The education of the surgeon also has to evolve, not only in evaluating the presented facts but also in evaluating the published evidence. This requires a considerable change in perception plus an understanding of statistical applications, and this is wisely being initiated at many forums. In addition, the quality of research has to improve, and the questions it asks have to be relevant worldwide.[11] Once these shortcomings are overcome, the shift in thinking from hype to reality will be spontaneous. Until that time, we have to take everything with a bit of caution, use our experience for individual patients, and employ EBM judiciously in our practice.[12]



 
References

1. Evidence-Based Medicine Working Group. Evidence-based medicine. A new approach to teaching the practice of medicine. JAMA 1992;268:2420-5.
2. Sackett DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS. Evidence based medicine: What it is and what it isn't. BMJ 1996;312:71-2.
3. Spence D. Evidence based medicine is broken. BMJ 2014;348:g22.
4. Moynihan R, Doust J, Henry D. Preventing overdiagnosis: How to stop harming the healthy. BMJ 2012;344:e3502.
5. Glasziou P, Vandenbroucke JP, Chalmers I. Assessing the quality of research. BMJ 2004;328:39-41.
6. Godin K, Dhillon M, Bhandari M. The three-minute appraisal of a randomized trial. Indian J Orthop 2011;45:194-6.
7. Murad MH, Asi N, Alsawas M, Alahdab F. New evidence pyramid. Evid Based Med 2016;21:125-7.
8. Canadian Orthopaedic Trauma Society. Nonoperative treatment compared with plate fixation of displaced midshaft clavicular fractures. A multicenter, randomized clinical trial. J Bone Joint Surg Am 2007;89:1-10.
9. Poolman RW, Sierevelt IN, Farrokhyar F, Mazel JA, Blankevoort L, Bhandari M, et al. Perceptions and competence in evidence-based medicine: Are surgeons getting better? A questionnaire survey of members of the Dutch orthopaedic association. J Bone Joint Surg Am 2007;89:206-15.
10. Gierisch JM, Myers ER, Schmit KM, McCrory DC, Coeytaux RR, Crowley MJ, et al. Prioritization of patient-centered comparative effectiveness research for osteoarthritis. Ann Intern Med 2014;160:836-41.
11. Bhandari M, Sprague S, Schemitsch EH; International Hip Fracture Research Collaborative. Resolving controversies in hip fracture care: The need for large collaborative trials in hip fractures. J Orthop Trauma 2009;23:479-84.
12. Ioannidis JP. Hijacked evidence-based medicine: Stay the course and throw the pirates overboard. J Clin Epidemiol 2017;84:11-3.

Correspondence Address:
Dr. Mandeep S Dhillon
Department of Orthopaedics, PGIMER, Chandigarh, India

Source of Support: None. Conflict of Interest: None.


DOI: 10.4103/ortho.IJOrtho_54_19
