Evaluation of Ranking Tools
The strength behind evidence is the first concern when deciding which evidence to follow. As a busy clinician your time is valuable, but strong resources are a virtue. New knowledge in our evolving healthcare system often faces a gap between when it is produced and when clinicians interpret and implement it in a timely manner. Evidence-based ranking tools have been put into place to assist clinicians in choosing the most adequate, precise, and accurate data to use.
While there are many different tools in use today, the Oxford Centre for Evidence-Based Medicine (OCEBM) and the Joanna Briggs Institute model are two that I will compare using the article, Sleep in the Hospitalized Patient: Nurse and Patient Perceptions.
While there are many different ranking tools available, the choice ultimately comes down to what kind of article you want to rank, how much time you have when choosing your articles, and which tool you understand best. Many different tools can also be used to rank the same article, but not all tools will produce the same ranking. Some Level 1 ratings may not carry as much strength as other Level 1 ratings, and vice versa (Kavanagh, 2009). When using these tools, do not let the levels become a source of confusion; simply take each ranking tool on its own strengths.
The two tools are compared below by name of tool with access site, format used with the meaning of each level, purpose of the tool (where it might be used), strengths, and weaknesses.

Oxford Centre for Evidence-Based Medicine 2011 Levels of Evidence
Access site: https://www.cebm.net/wp-content/uploads/2014/06/CEBM-Levels-of-Evidence-2.1.pdf
Format: Five levels reflecting different questions such as diagnosis, prognosis, and therapy.
Where it might be used: Clinical problems, such as in critical care.
Strengths: Easy to navigate; fast and simple; broken down into a natural flow for clinical problems.
Weaknesses: In the name of simplicity, not all filters have been provided, meaning we could be looking at biased articles; could ultimately prevent readers from using judgment (Howick et al., 2011).

Joanna Briggs Institute Model
Access site: http://joannabriggs.org/assets/docs/approach/Levels-of-Evidence-SupportingDocuments.pdf
Format: Broken down into four categories, with a table used to grade each: effectiveness, meaningfulness, feasibility, and appropriateness. Each table has levels that consist of sublevels to help sub-categorize articles.
Where it might be used: When conducting a systematic review.
Strengths: Both qualitative and quantitative data are used; allows articles to be upgraded.
Weaknesses: The upgrading and downgrading of an article; more complicated to navigate through than others; more time consuming (Munn et al., 2014).
Oxford Centre for Evidence-Based Medicine Tool
The OCEBM tool was first used in September of 2000, but after more than a decade of feedback suggesting improvements, the OCEBM underwent a revision that reevaluated each level of the tool (Howick et al., 2011). A few of the revisions made in 2009 were removing levels 1a, 1b, and 1c and replacing them with simple numbers such as 1, 2, and 3, along with switching the columns and rows to allow for a more user-friendly layout (Howick et al., 2011). A change that impacted the usefulness of the tool was a modification made to follow the natural flow of a clinical problem. This helps the busy clinician's eyes navigate through the tool using the everyday approach taken with disease processes. Footnotes and some of the questions were removed in order to make the tool easier to navigate. A glossary was also designed to explain larger concepts in a way that was better understood (Howick et al., 2011).
The new and improved OCEBM tool now has precise and easily accessible questions to help navigate ranking an article. The columns consist of five levels, with the rows consisting of questions such as, "How common is the problem?", "Is this diagnostic or monitoring test accurate?", "What will happen if we do not add a therapy?", "Does this intervention help?", "What are the common harms?", "What are the rare harms?", and "Is this (early detection) test worthwhile?" (Howick et al., 2011). Although the reader might conclude that these questions supply the most accurate form of ranking, there is still inconsistency in these tools, and clinical expertise should still be included.
Joanna Briggs Institute Model
The Joanna Briggs Institute model can be used in evaluating systematic reviews due to the accuracy, directness, and consistency of this tool. This model uses a Grading of Recommendations Assessment, Development and Evaluation (GRADE) approach, allowing the tool to evaluate more factors than just the design of the article (Kavanagh, 2009). According to the article, Supporting Document for the Joanna Briggs Institute Levels of Evidence and Grades of Recommendation, these factors include "critical appraisal/risk of bias, publication bias, inconsistency, indirectness, and imprecision of evidence, effect size, dose-response relationships, and confounders" (Munn et al., 2014). Prior to this, articles are first pre-ranked based on their study design. The article can then be upgraded or downgraded depending on the different ranking factors. Although this ranking tool is used as a first approach in ranking the evidence in a study, critical appraisal and judgment are still the main factors in ranking the study.
Strength and Weakness Comparison
The Joanna Briggs Institute model is a more complex and precise ranking tool compared to the OCEBM. Both tools have their strengths and weaknesses depending on the type of article being ranked: one tool may be more accurate for a given article than the other, yet neither tool is more accurate or more reliable overall. When deciding on the ranking tool to use, always keep in mind the multiple factors included in each one. The OCEBM's overall strengths include its timeliness and easily understood concepts, while its overall weaknesses include being limited to fixed levels of ranking. The Joanna Briggs model's strengths include the complexity of the tool, which allows articles to be upgraded or downgraded for a stronger or weaker rank. Each tool is unique in its own way.
The article, Sleep in the Hospitalized Patient: Nurse and Patient Perceptions, I believe benefits more from the OCEBM tool, mainly because of the type of study that was conducted. I do not believe the article would benefit as much from the Joanna Briggs tool, due to the various factors that can be used in that ranking. If the article were more complex, with different variations of quantitative data, the Joanna Briggs tool would be more beneficial.

When conducting research you may stumble across more articles that are of no benefit compared to ones that are, but with the use of ranking tools you will have a guide for judging the accuracy of an article. While some researchers may have a narrower outlook, wanting only the highest ranked articles, some articles of lower rank have just as much quality of information; they may simply lack the evidence to support their ideas. No limitations should be placed on the amount of resources available for research. Evidence-based research is the only way our evolving healthcare will continue to hold such high-quality continuity of care.
References

Howick, J., Chalmers, I., Glasziou, P., Greenhalgh, T., Heneghan, C., Liberati, A., Moschetti, I., Phillips, B., & Thornton, H. (2011). Explanation of the 2011 Oxford Centre for Evidence-Based Medicine (OCEBM) Levels of Evidence (Background Document). Oxford Centre for Evidence-Based Medicine. Retrieved from http://www.cebm.net/index.aspx?o=5653

Kavanagh, P. (2009). The GRADE system for rating clinical guidelines. PLoS Medicine, 6(9). doi:10.1371/journal.pmed.1000094

Munn, Z., Porritt, K., Aromataris, E., Lockwood, C., & Peters, M. (2014). Supporting Document for the Joanna Briggs Institute Levels of Evidence and Grades of Recommendation. The Joanna Briggs Institute. Retrieved from http://joannabriggs.org/assets/docs/approach/Levels-of-Evidence-SupportingDocuments.pdf

B., Pearce, K., Redding, J., Brandonisio, S., Tzou, S., & Meiusi, E. (2016). Sleep in the hospitalized patient: Nurse and patient perceptions. MEDSURG Nursing, 25(5), 351-356. Retrieved from https://www.highbeam.com/doc/1G1-470159867.html