

SYMPOSIUM
Year : 2008  |  Volume : 42  |  Issue : 2  |  Page : 104-110
Evidence-based orthopaedics: A brief history

Division of Orthopaedic Surgery, McMaster University, Hamilton, Ontario, Canada



Evidence-based medicine was recently named one of the 15 most important medical milestones of the past 160 years. Since the term was coined in 1990, EBM has seen unparalleled adoption in medicine and surgery. We discuss the early origins of EBM and its dissemination in medicine, especially orthopaedic surgery.

Keywords: EBM, evidence, evidence-based medicine, history

How to cite this article:
Hoppe DJ, Bhandari M. Evidence-based orthopaedics: A brief history. Indian J Orthop 2008;42:104-10


Introduction

The British Medical Journal compiled a list in 2007 of the 15 most important medical milestones since the journal's inception in 1840. [1] Included were the discovery of DNA, the development of vaccination and of antibiotics, the use of anesthetics for surgery and the emergence of evidence-based medicine. Evidence-based medicine (EBM) is an approach to the practice of medicine whose name was coined by Gordon Guyatt in 1990 [2] and which was described by the Evidence-Based Medicine Working Group at McMaster University in 1992. [3] It was a new paradigm that placed less emphasis on expert opinion and unsystematic clinical observation, instead stressing the impact of evidence derived from clinical research, such as randomized controlled trials, and the need for physicians to acquaint themselves with published results before blindly accepting dogma.

Subsequently, there has been an explosion of research papers extending the boundaries of EBM into many specialties of medicine, including even traditional Chinese medicine. [4] The medical and health communities have embraced this methodology so enthusiastically that one would be hard-pressed to find a physician today who has not heard the term EBM. In orthopaedics, the terminology collectively referred to as evidence-based orthopaedics has likewise become standard language in major journals such as the Indian Journal of Orthopaedics, the Journal of Bone and Joint Surgery, Clinical Orthopaedics and Related Research and Acta Orthopaedica. Furthermore, EBM has evolved from an initial focus on the best available published evidence for a treatment to the present emphasis on the importance of patient values and expected outcomes in the management and treatment of disease. [5]

What is Evidence-Based Medicine?

It is important to begin with an understanding of what we mean by evidence-based medicine. Simply put, it is the integration of the best available research evidence with our clinical circumstances and our patients' values and preferences. It can be described as a partnership between two components of the practice of medicine. One component is the body of knowledge that includes all evidence, whether derived from physiological experimentation, individual observation and expert opinion, randomized controlled trials, systematic reviews or meta-analyses; as described by Sackett et al. [5] in 1996, it can be visualized as a pyramid of evidence, as shown in [Figure - 1]. This is the objective, accumulated scientific and statistical wisdom built up over time; it treats medicine as a scientific endeavour and demands that the user seek out the best available evidence, validated experimentally and statistically. It must also be recognized that some physicians are talented diagnosticians with an acute intuitive ability to diagnose correctly. This ability may represent merely a high level of recall of scientific facts and a capacity to connect knowledge to symptoms without the need of explicit logical steps.

The other component posits that such evidence alone is inadequate for making medical decisions for individual patients: each patient's (and perhaps society's) values must be taken into consideration and the choice of treatment must involve both patient and physician. For instance, Guyatt [2] contrasts two patients, both with pneumococcal pneumonia, for whom scientific evidence points to antibiotics as the best treatment but for whom the patient's own context comes to the forefront as a critical factor in the decision whether or not to treat with antibiotics. In addition to values, this side of EBM also relies on outcomes, namely the weighing of treatments according to how they will translate into improved quality of life for the patient.

The synthesis of this partnership between brain and heart is EBM, which has matured in its fifteenth year into an established set of principles and guidelines for the practice of medicine, as found, for instance, in the Users' Guides to the Medical Literature series in JAMA. [6]

The Early Practice of Orthopaedics: Opinion Over Evidence?

Consider the lead article in the Proceedings of the American Orthopaedic Association in 1889, titled "Hypertrophy of One Lower Extremity". [7] The author, an orthopaedic surgeon, presented at a professional meeting the case of his patient, a six-year-old child with a (diseased) leg three-quarters of an inch longer than the other. He prescribed application of a rubber bandage, but the diseased leg's growth continued to outpace the other and after a year he recommended only that the child wear a high shoe on the better leg for comfort. The surgeon noted small cysts deep beneath the skin but did not observe any inflammation of the skin that might be connected to elephantiasis. The patient was later examined by another specialist, who diagnosed congenital occlusion and dilation of the lymph channels and recommended amputation, which was carried out. Following post-surgery tissue examination, the author thought that the limb growth could be accounted for by retention of lymph caused by a parasite, in the same way that the parasite Filaria and its eggs cause obstruction in elephantiasis.

Essentially, this author presented an unsystematic clinical observation on the subject, [8] which he shared with other physicians. He also attempted to find a cause for his observations through a pathological investigation of post-mortem tissues, in which he was influenced by knowledge of a different condition that manifests similar symptoms.

His presentation, written up for journal publication, was followed with a discussion by other specialists. One described a patient, a twenty-one year-old female with a similar increase in size in one leg for whom he also prescribed a high shoe. He "did not know what diagnosis to make" and referred the patient to another surgeon who also confessed "ignorance of the nature of the problem." A second discussant mentioned a similar case that he treated by stretching the sciatic nerve, which reduced the size of the affected limb. His choice of treatment was motivated by a case of elephantiasis a decade previously in which he removed an inch of the sciatic nerve with a subsequent lessening of the size of the patient's calf.

These surgeons, all from different cities, offered radically divergent treatments for a particular type of affliction: shoe lifts, sciatic nerve stretching and amputation. As this journal was the official organ of the American Orthopaedic Association, one would expect it to provide the best source of information to practitioners. But how could a reader decide which treatment to adopt if he were to encounter a similar case in his own practice? Which expert was he to believe? And what if he didn't subscribe to this journal or attend the meeting? How could the information presented be transmitted?

It is clear that each of the discussants offered their expert opinions; that is why they were discussants. Their knowledge was based on observations in their own practices on a case-by-case basis, which reinforced their understanding of the value of a technique or treatment. Such knowledge would then be transmitted orally to students through mentoring, or to other doctors in clinical rounds or in discussions following pathological examinations. The wisdom imparted this way sometimes results in aphorisms such as "If you hear hoofbeats, think horses, not zebras", although, as Groopman points out in his recent best-seller "How Doctors Think", [9] when a physician's reasoning is unduly influenced by what is thought to be typically true, regardless of the evidence, errors in diagnosis may result. When experts differ, a physician may be swayed by the opinion of whichever colleague he knows better, has the stronger reputation or comes from the more prestigious institution.

Interestingly, one contributor to the discussion recognized the need to reconcile the different opinions and raised the following suggestion: "Would it not be in accordance with the purposes of this Association to appoint a committee to investigate this subject, taking patients… and treating them…" The implication is that anecdotal evidence based on individual cases or experience is insufficient evidence to adjudicate the efficacy of a treatment and instead trials are needed to objectively demonstrate the benefit of one cure over another. Could it be that a century ago the importance of large clinical trials was being recognized?

In addition to expert opinion, a physician searching for understanding would also be influenced by his acquaintance with experimental physiology. Knowledge of how the body functions and reacts to stimuli or foreign matter is derived from laboratory experiments, generally on animals, to validate or invalidate hypotheses. This scientific method is the basis for research in the biological, physical and chemical sciences. The discovery of penicillin by Alexander Fleming in 1929 [10] illustrates how an accidental observation of differential bacterial growth in a Petri dish, pursued with scientific curiosity, led first to the development of penicillin and then to an understanding of the physiological basis of antibiotics. In the first article of the journal cited above, the author did, in fact, also attempt some experimentation, carrying out post-mortem tissue examination in order to find clues to the cause of his patient's condition and thereby increase his understanding of the problem for future cases.

Laying the Foundation for EBM: Early Clinical Trials

The manner in which medical knowledge develops has changed over time. As Geoff Watts has written: "Knowledge doesn't suddenly appear in neat and tidy quanta. Like patches of lichen spreading over a rock face, it accretes over decades". [1] Each key development builds upon earlier ideas.

Clinical trials are not a recent research tool. There is actually evidence of what may be the first clinical trial in the biblical book of Daniel, describing events that occurred over 2600 years ago. Neuhauser and Diaz [11] provide a refreshing look at the "original clinical trial" in their article on the subject in 2004. Basically, King Nebuchadnezzar wanted the Israelite children to eat a diet of the king's meats and wines. The prophet Daniel, believing (hypothesizing) that a diet of beans, lentils and water would be healthier than the king's diet, formed an experimental group of himself and three other children and asked to be compared to the rest of the children after a 10-day trial of his diet. Indeed, after 10 days had passed, the experimental group was compared with the other children (the "control group") and "their countenances appeared fairer and fatter in flesh than all the children which did eat the portion of the king's meat".

Two famous clinical trials were carried out in the 18th century: one on scurvy and one on smallpox. Many physicians are aware of the work by James Lind on the prevention of scurvy in 1747. [12] Lind divided 12 similar patients with scurvy into six groups of two and placed each group on a different diet; one group received oranges and lemons. In effect, Lind was carrying out a one-way analysis of variance. He found that oranges and lemons provided the best treatment.

The discovery by Edward Jenner that cowpox vaccine could immunize against smallpox was preceded by both clinical observations and clinical trials. Inoculation had been practised for centuries, dating back to 10th century China and India, [13] but its use had been justified by the clinical observation that those inoculated were less sick than the infected. In China, inoculation was performed by placing cotton soaked in infected pus into subjects' noses. [14] Cotton Mather, a colonial minister living in Boston during the smallpox epidemics of the 1720s, had learned of the practice from Onesimus, a West African slave in his household. He convinced a local physician to inoculate his patients, although many people in America, including all of the other local physicians, were against it. Indeed, the townspeople were so incensed that a bomb was thrown into Mather's house, though it failed to explode. Mather persevered, tallying and comparing the mortality rate of those who were inoculated with that of the local population. In total, 6 of 287 (2.1%) inoculated patients died, compared with 842 of the 4917 (17.1%) who received no treatment. [14] Such a study is what we would now call a cohort study, in which exposed and non-exposed groups of patients are followed forward in time and monitored for the occurrence of a predicted outcome, in this case mortality. [8]
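Mather's tally can be recast in modern cohort-study terms. A minimal sketch, using only the figures given in the text (6 of 287 inoculated died versus 842 of 4917 untreated), computes the mortality risk in each group and the relative risk:

```python
# Mather's smallpox tally, recast as a modern cohort comparison.
# Figures are taken from the text: 6 of 287 inoculated patients died,
# versus 842 of 4917 who received no treatment.
inoculated_deaths, inoculated_total = 6, 287
control_deaths, control_total = 842, 4917

risk_inoculated = inoculated_deaths / inoculated_total   # ~2.1% mortality
risk_control = control_deaths / control_total            # ~17.1% mortality
relative_risk = risk_inoculated / risk_control           # risk in exposed vs non-exposed

print(f"Inoculated mortality: {risk_inoculated:.1%}")
print(f"Untreated mortality:  {risk_control:.1%}")
print(f"Relative risk:        {relative_risk:.2f}")
```

The relative risk of roughly 0.12 is the kind of summary a modern cohort analysis would report, quantifying the benefit that Mather could only describe by comparing raw counts.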

Another step forward on the path to EBM was the introduction of the randomized controlled trial (RCT) into the medical literature. One of the first truly randomized trials in medicine was published in 1931 in the American Review of Tuberculosis by J. Burns Amberson, a staff physician at the Detroit Municipal Tuberculosis Sanatorium. [15] He divided 24 patients into two groups of 12, based on approximately matched pairs. By the flip of a coin, one group became the control, treated with injections of distilled water, while the other was treated with sanocrysin, a gold preparation. This is an example of a randomized block design and, although the results showed no therapeutic benefit, [16] the methodology was important in setting a standard for how clinical trials should be undertaken.
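The design described above can be sketched in a few lines of code. This is only an illustration of the matched-pair, group-level coin-flip allocation the text attributes to Amberson; the patient labels and pairing are hypothetical:

```python
import random

# Sketch of the 1931 sanocrysin trial design as described in the text:
# 24 patients arranged into 12 approximately matched pairs, one member of
# each pair placed in group A and the other in group B, then a single
# coin flip deciding which whole group receives the treatment.
# Patient labels and pair order here are hypothetical.
random.seed(1931)

patients = [f"patient_{i:02d}" for i in range(24)]
pairs = [patients[i:i + 2] for i in range(0, 24, 2)]  # 12 matched pairs

group_a = [pair[0] for pair in pairs]
group_b = [pair[1] for pair in pairs]

# One flip of a coin assigns treatment at the group level.
if random.random() < 0.5:
    treated, control = group_a, group_b
else:
    treated, control = group_b, group_a

print(len(treated), len(control))  # 12 12
```

Note how this differs from a fully randomized trial: a single coin flip randomizes the group assignment, while the matching of pairs balances the groups on prognostic factors, which is what makes it a randomized block design.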

Two decades later, another randomized controlled trial was carried out by the Streptomycin in Tuberculosis Trials Committee of the British Medical Research Council. [17] This was a multi-centre clinical trial with blinded outcome assessment and served as a model for future designs. Patients were randomized using numbered envelopes and their progress was evaluated through monthly chest X-rays read by three specialists who did not know whether the patients had received streptomycin or the control, which was bed rest. The results showed that the death rate was significantly lower for patients receiving streptomycin.

Perhaps the largest single clinical trial of the 20th century was the Salk polio vaccine field trial of 1954. [18] It was understood at the time how the poliomyelitis virus entered the body and how it affected the central nervous system, causing paralysis and sometimes death. It had also been observed that severe polio was rarer in communities with poor hygiene, leading to the hypothesis that children in those communities were conferred immunity by mild exposures to the virus. Because paralytic polio was so rare, an enormous number of participants (over 400,000 children) was needed in order to observe any significant effect. Roughly half were vaccinated and half received a placebo of salt water. The results confirmed the effectiveness of the Salk vaccine and led to large-scale inoculation of school children. The trial was also double-blinded, so that the examining doctors would not bias their diagnoses; such a design is considered the gold standard.

As randomized controlled trials became more common, studies were published that contradicted conventional wisdom, thus showing the necessity of making clinical decisions based on evidence rather than on observation and physiological principles. An outstanding example of this is the discontinuation of hormone replacement therapy (HRT) after the 2002 Women's Health Initiative (WHI) trial. [19] HRT had been recommended since 1985 for prevention of osteoporosis, dementia and heart disease, as well as to improve the general quality of life of postmenopausal women. The basis of this recommendation was clinical observations that women taking HRT seemed to be healthier than those not taking it. Pharmaceutical advertising influenced physicians in prescribing it for their patients and there were some attempts to explain the benefits physiologically using laboratory experiments. Unfortunately, the WHI trials showed increased risk of heart disease, breast cancer and stroke in women taking HRT and the trials were actually stopped early. The WHI trials took place about 10 years after the introduction of EBM and are a striking example of the effect of EBM on medical thinking.

However, most clinical trials were not of the magnitude of these large trials and involved few patients, which sometimes produced conflicting or inconclusive results. Clearly, something needed to be done to assess the quality of each study (now called critical appraisal) and a quantitative method needed to be developed to draw conclusions from the results of multiple studies on a single topic (statistical reviews and meta-analyses).

Critical Appraisal: The Late 1970s and Early 1980s

The ingredients of EBM that emphasize finding the "best" evidence in the literature were already taking root and being practiced at McMaster University in the late 1970s and early 1980s by David Sackett, who used the term "critical appraisal" to describe the systematic examination of the medical literature to extract evidence. The term "evidence-based medicine" itself was coined by Gordon Guyatt in 1990, in a brochure for internal medicine residency applicants to McMaster University, where EBM was described as an "enlightened skepticism" towards the use of diagnostic, prognostic and therapeutic technologies. The result of this early work was a series of "Readers' Guides" articles by McMaster colleagues in the Canadian Medical Association Journal, [20] followed by several texts. [21],[22]

The initial intention of EBM was educational: to train residents to become better physicians. This was consistent with the philosophy underlying the unique approach to medical education in McMaster's nascent M.D. program and the university's focus on innovation in education, an emphasis that remains today. It also recognized that physicians in a busy practice have limited time to peruse the literature, so part of the training concerned efficient methods for extracting information from the literature in a timely fashion. Soon, faculty became intrigued by what their students were learning and also became interested in understanding this new approach. The advent of microcomputers around this time gave a further impetus to searching, although not to the incredible extent that physicians can locate information today through the internet and the electronic search capabilities of databases, document repositories and the documents themselves. As well, research can now be published online immediately, avoiding the lag that once existed between completion of a paper and its distribution.

Membership in the group of physicians interested in critical appraisal grew to encompass physicians in Clinical Epidemiology and Biostatistics, Medicine, Obstetrics and Gynecology, Pediatrics and Emergency Medicine, not only at McMaster but elsewhere. This group evolved into the Evidence-Based Medicine Working Group, whose work culminated in the adoption of the term EBM and the publication of the fundamental paper [3] announcing this approach as a new paradigm.

Advances in Meta-Analysis and Systematic Reviews: The Mid-1980s

During the period in which the McMaster group was emphasizing the importance of examining the literature to understand the efficacy of medical treatments, a similar revolution was occurring in the social sciences [23] with the development of meta-analysis. It originated as a quantitative research tool that allows researchers to combine and synthesize the results of a large number of separate studies bearing on a particular topic, in the hope that, taken together, the data as a whole would either confirm or dispel a claim (or hypothesis).

Meta-analyses gain power by pooling the results of many small trials. They help determine whether a suggested treatment shows clear evidence of effectiveness; whether the results, though inconclusive, merit additional trials because the treatment appears promising; or whether the treatment should be abandoned altogether.
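The pooling that gives meta-analysis its power can be illustrated with a minimal fixed-effect (inverse-variance) computation. This is a generic textbook method, not one drawn from the studies cited here, and the effect sizes and standard errors below are invented purely for illustration:

```python
import math

# Minimal fixed-effect (inverse-variance) meta-analysis sketch.
# Each study contributes an effect estimate and its standard error;
# the numbers below are invented for illustration only.
studies = [
    (0.40, 0.25),   # (effect size, standard error)
    (0.10, 0.30),
    (0.35, 0.20),
    (0.20, 0.15),
]

# Each study is weighted by the inverse of its variance, so precise
# (small-SE) studies count for more in the pooled estimate.
weights = [1 / se ** 2 for _, se in studies]
pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
pooled_se = math.sqrt(1 / sum(weights))

# 95% confidence interval for the pooled effect
lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se
print(f"pooled effect = {pooled:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

No single study here is individually conclusive, yet the pooled confidence interval is narrower than any one study's, which is exactly the gain in power the text describes.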

The importance of meta-analysis in medicine was clearly identified in a seminal book by Chalmers, Enkin and Keirse. [24] These authors searched the literature to gather an enormous number of randomized clinical trials and then organized teams of physicians to assess the research quality of the trials and to carry out meta-analyses of various suggested treatments. Recommendations for physicians and nurses were described in a less technical publication. [25] Meta-analyses are now increasingly carried out in medicine, for instance in orthopaedics, [26] and provide one of the highest levels in the hierarchy of evidence.

Cochrane Collaboration: The 1990s

Although outside the scope of pre-EBM developments, no discussion of EBM would be complete without mention of the Cochrane Collaboration, a leading source of reviews. The first Cochrane Centre was established by Iain Chalmers at Oxford University in 1992, in response to Archie Cochrane's rebuke of the medical profession for not having established a regularly updated database of published clinical trials organized by specialty. A second centre followed at McMaster University in 1994, leading to the Cochrane Collaboration, which since 1996 has maintained a repository of systematic reviews and critical appraisals of the medical and health literature as part of the Cochrane Library. Jadad et al. published a comparison of Cochrane reviews with published articles, which showed that Cochrane reviews are more rigorous methodologically and more frequently updated than systematic reviews or meta-analyses published in journals. [27]

The Evolution of EBM in Orthopaedics: The 21st Century

We began this paper with a discussion of the first article in the Proceedings of the American Orthopaedic Association. Fast forward from this illustration of 19th century medical learning to the year 2000, a full century later. The same journal, now named the Journal of Bone and Joint Surgery, in recognition of the need to integrate clinical expertise with the best available systematic research, introduced a new section, "Evidence-Based Orthopaedics". [28] In introducing the section, the editors wrote that randomized clinical trials would form the main contribution because they are believed to provide the highest quality of evidence and therefore, when available, should influence clinical decision-making. Many of the articles now published in this section are systematic reviews and meta-analyses.

Among the early papers to appear in this section was a series of four User's Guides to the Orthopaedic Literature [29],[30],[31],[32] by Bhandari, Guyatt and their collaborators, covering prognosis, surgical therapies, diagnostic tests and literature reviews. These articles were meant to teach orthopaedic surgeons how evidence can be evaluated and applied in their practice. Each guide begins with a scenario describing an orthopaedic surgeon's patient with a particular condition, ending with the patient asking a question to which the doctor does not know the answer or with the doctor weighing two or more possible treatments.

For example, in the first User's Guide, [29] the scenario involves a patient with a displaced distal radial fracture. Based on the patient's age and fracture type, the surgeon decides that it warrants a closed reduction. However, a colleague points out that such fractures are prone to instability and suggests using a new type of bone cement. The surgeon decides to search the literature during the five hours before the operating room is free in order to reach a decision. The article then details how the surgeon would perform a literature search and the steps needed to critically appraise the articles found: the quality of the design, whether the results are valid and whether they are applicable to the patient at hand. Many of the points made are not obvious and the authors provide a detailed prescription that can serve not merely for the scenario presented, but as a template for surgeons in general seeking to understand how to use an article about a surgical therapy.

In 2003, the Journal of Bone and Joint Surgery decided that all clinical articles submitted for publication would have to include a level-of-evidence rating classifying the quality of the study. [33] The journal chose five levels, the lowest being expert opinion and the highest being RCTs or systematic reviews of RCTs, and separated articles into four types in order to make an article's purpose clear to the reader. The importance of these measures was two-fold: to facilitate review by the editors, and to enable surgeons to more readily assess the quality of evidence, and the weight to give a study, before incorporating its results into their clinical practice.

Yet, despite these refinements in the presentation of knowledge, many surgeons do not have the time to sift through even well-formulated articles and it is not clear that they would be able to translate current research into better care for their patients. [34] As a result, there is a great need for systematic reviews and meta-analyses. Similar to the User's Guides, Acta Orthopaedica has launched a series of articles on evaluating meta-analyses [35] and on critical appraisal [36] to assist surgeons in this respect.

Finally, in a very recent article published in February 2008, the Osteoarthritis Research Society International (OARSI) group published the second of two articles [37],[38] describing its recommendations for the management of hip and knee osteoarthritis, arrived at through a critical appraisal of existing guidelines, a systematic review of the recent research evidence and the consensus of a multi-disciplinary body of experts in primary care, rheumatology, orthopaedics and EBM. It is interesting that EBM itself was considered a specialty in its own right, alongside the three traditionally accepted specialties. This study represents the highest level of evidence possible in the current state of knowledge, combining expert opinion with clinical trials and systematic overviews as part of a large team effort.

References

1.Watts G. Let's pension off the "major breakthrough." Br Med J 2007;334:4.  Back to cited text no. 1    
2.Guyatt G. Evidence-Based Medicine: Past, Present and Future. McMaster Univ Med J 2003;1:27-32.  Back to cited text no. 2    
3.Evidence-Based Medicine Working Group. Evidence-based medicine: A new approach to teaching the practice of medicine. JAMA 1992;268:2420-5.  Back to cited text no. 3  [PUBMED]  
4.Trinh KV, Phillips SD, Ho E, Damsma K.Acupuncture for the alleviation of lateral epicondyle pain: a systematic review.Rheumatology (Oxford). 2004;43(9):1085-90. Epub 2004 Jun 22.  Back to cited text no. 4    
5.Sackett, DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS. Evidence-based medicine: What it is and what it is not. Br Med J 1996;312:71-2.  Back to cited text no. 5    
6.Oxman AD, Sackett DL, Guyatt GH; for the Evidence-based Medicine Working Group. Users' guides to the medical literature, I: How to get started. JAMA 1993;270:2093-5.  Back to cited text no. 6    
7.Packard GB. Hypertrophy of one lower extremity. Proc Am Orthop Assoc 1889;1:27-37.  Back to cited text no. 7    
8.Guyatt G, Rennie D, editors. The Users' guides to the medical literature: A manual for evidence-based clinical practice. AMA publications; 2002.  Back to cited text no. 8    
9.Groopman J. How doctors think. Boston: Houghton Mifflin Company; 2007.  Back to cited text no. 9    
10.Rosengart MR. Critical care medicine: Landmarks and legends. Surg Clin North Am 2006;86:1305-21.  Back to cited text no. 10  [PUBMED]  [FULLTEXT]
11.Neuhauser D, Daniel DM. Using the Bible to teach quality improvement methods. Qual Saf Health Care 2004;13:153-5.  Back to cited text no. 11    
12.Lind J. A Treatise of the Scurvy in Three Parts: Containing an inquiry into the Nature, Causes and Cure of that Disease, Together with a Critical and Chronological View of What Has Been published on the Subject. London; 1753.  Back to cited text no. 12    
13.Gross CP, Sepkowitz KA. The myth of the medical breakthrough: Smallpox, vaccination and Jenner reconsidered. Int J Infect Dis 1998;3:53-60.  Back to cited text no. 13    
14.Best M, Neuhauser D, Slavin L. Cotton Mather, you dog, damn you! I'll inoculate you with this; with a pox to you: Smallpox inoculation, Boston, 1721. Qual Saf Health Care 2004;13:82-3.  Back to cited text no. 14  [PUBMED]  [FULLTEXT]
15.Neuhauser D, Diaz M. Shuffle the deck, flip that coin: Randomization comes to medicine. Qual Saf Health Care 2004;13:315-6.  Back to cited text no. 15    
16.Amberson J, McMahon B, Pinner M. A clinical trial of sanocrysin in pulmonary tuberculosis. Am Rev Tuber 1931;24:401-35.  Back to cited text no. 16    
17.Medical Research Council. Streptomycin treatment of pulmonary tuberculosis. Br Med J 1948;2:769-82.  Back to cited text no. 17    
18.Silvers MJ, Steptoe MM. Historical overview of vaccines. Primary Care 2001;28:685-95.  Back to cited text no. 18  [PUBMED]  
19.Writing Group for the Women's Health Initiative Investigators. Risks and benefits of estrogen plus progestin in healthy postmenopausal women: Principal results from the Women's Health Initiative randomized controlled trial. JAMA 2002;288:323-33.  Back to cited text no. 19    
20.Department of Clinical Epidemiology and Biostatistics. How to read clinical journals: I, Why to read them and how to start reading them critically. Can Med Assoc J 1981;124:555-8.  Back to cited text no. 20    
21.Sackett DL, Haynes RB, Tugwell P. Clinical epidemiology: A basic science for clinical medicine. Boston; Little, Brown and Co; 1985.  Back to cited text no. 21    
22.Sackett DL, Haynes RB, Tugwell P, Guyatt G. Clinical epidemiology: A basic science for clinical medicine, 2 rd ed. Boston: Little, Brown and Co.; 2005.  Back to cited text no. 22    
23. Smith ML, Glass GV. Meta-analysis of psychotherapy outcome studies. Am Psychol 1977;32:752-60.
24. Chalmers I, Enkin M, Kierse MJ, editors. Effective care in pregnancy and childbirth. New York: Oxford University Press; 1989.
25. Enkin M, Kierse MJ, Chalmers I, editors. A guide to effective care in pregnancy and childbirth. New York: Oxford University Press; 1989.
26. Montori VM, Swiontkowski MF, Cook DJ. Methodologic issues in systematic reviews and meta-analyses. Clin Orthop Relat Res 2003;413:43-54.
27. Jadad AR, Cook DJ, Jones A, Klassen TP, Tugwell P, Moher M, et al. Methodology and reports of systematic reviews and meta-analyses: A comparison of Cochrane reviews with articles published in paper-based journals. JAMA 1998;280:278-80.
28. Swiontkowski MF, Wright JG. Introducing a new journal section: Evidence-based orthopaedics. J Bone Joint Surg Am 2000;82:759.
29. Bhandari M, Guyatt GH, Swiontkowski MF. User's guide to the orthopaedic literature: How to use an article about a surgical therapy. J Bone Joint Surg Am 2001;83:916-26.
30. Bhandari M, Guyatt GH, Swiontkowski MF. User's guide to the orthopaedic literature: How to use an article about prognosis. J Bone Joint Surg Am 2001;83:1555-64.
31. Bhandari M, Guyatt GH, Montori V, Devereaux PJ, Swiontkowski MF. User's guide to the orthopaedic literature: How to use a systematic literature review. J Bone Joint Surg Am 2002;84:1672-82.
32. Bhandari M, Montori VM, Swiontkowski MF, Guyatt GH. User's guide to the surgical literature: How to use an article about a diagnostic test. J Bone Joint Surg Am 2003;85:1133-40.
33. Wright JG, Swiontkowski MF, Heckman JD. Introducing levels of evidence to the journal. J Bone Joint Surg Am 2003;85:1-3.
34. Hurwitz SR, Slawson D, Shaughnessy A. Orthopaedic information mastery: Applying evidence-based information tools to improve patient outcomes while saving orthopaedists' time. J Bone Joint Surg Am 2000;82:888-94.
35. Zlowodzki M, Poolman RW, Kerkhoffs GM, Tornetta P, Bhandari M. How to interpret a meta-analysis and judge its value as a guide for clinical practice. Acta Orthop 2007;78:598-609.
36. Poolman RW, Kerkhoffs GM, Struijs PA, Bhandari M. Don't be misled by the orthopaedic literature: Tips for critical appraisal. Acta Orthop 2007;78:162-71.
37. Zhang W, Moskowitz RW, Nuki G, Abramson S, Altman RD, Arden N, et al. OARSI recommendations for the management of hip and knee osteoarthritis, part I: Critical appraisal of existing treatment guidelines and systematic review of current research evidence. Osteoarthritis Cartilage 2007;15:981-1000.
38. Zhang W, Moskowitz RW, Nuki G, Abramson S, Altman RD, Arden N, et al. OARSI recommendations for the management of hip and knee osteoarthritis, part II: OARSI evidence-based, expert consensus guidelines. Osteoarthritis Cartilage 2008;16:137-62.

Correspondence Address:
Daniel J Hoppe
Michael G. DeGroote School of Medicine, McMaster University Hamilton, Ontario

Source of Support: None, Conflict of Interest: None

DOI: 10.4103/0019-5413.40244



  [Figure - 1]
