Does Clinical Education Need a Series of Tools to Assess Success?

Rebecca L Craik
DOI: 10.2522/ptj.2008.88.10.1106 Published 1 October 2008

Those of you who know me know that my middle name should have been “research,” because I push the need for more investigators, more funding to conduct research, and more research reports. A common mantra is the need for sound theory, evidence of mechanisms that underpin interventions, and reports of efficacy or effectiveness. There is evidence that we are making progress in generating new knowledge and that some research findings have immediate relevance to clinical practice. For example, many of the articles that appear in PTJ and other scientific journals challenge the reader to define or redefine best practice. There also is sufficient research to begin to formulate clinical guidelines for some of our practice areas.

The result of these research efforts has trickled into the classroom. In order to adopt evidence-based practice as a framework for our curricula, we needed adequate evidence in the literature to generate a discussion that leads to the choice of one intervention instead of another. We also needed studies that define the psychometric properties of the tools and studies that have used the tools for diagnostic or outcomes assessment.

In this fall academic semester, for instance, students across the country are engaged in practicing examination, evaluation, and intervention skills and searching for evidence to justify a plan of care. They also are being introduced to a variety of tests to assist them in examination and evaluation and in assessment of outcomes. Faculty members in many academic programs have moved beyond providing the general laundry lists of special tests for the students to memorize. Instead, we have begun to guide students in decision-making skills to select the test(s) with the best likelihood of affecting the post-test probability and improving their confidence in the physical therapist diagnosis. In a similar manner, faculty members emphasize the need to select the best outcome measure as a tool to document the effectiveness of the intervention. Although we have “miles to go before we sleep,” at least we now use the vocabulary of evidence-based practice and understand what has to be done in order to define best practice.

I am not sure why we aren’t asking similar “best practice” questions and using similar research methods to determine the most effective model for clinical education. Consider 2 of the papers that appear in this month's issue. APTA President Scott Ward reminded us that “the public expects our graduates to be prepared to skillfully manage their physical therapy care.”1(p1229) How do we demonstrate to the public that our graduates are prepared to “skillfully” manage? Is it sufficient to state that the new employee graduated from an accredited program and passed the licensure examination? Anthony Delitto, in the Thirty-Ninth Mary McMillan Lecture, asked us to “at least consider postprofessional, entry-level residencies as a clinical education model.”2(p1226) How do we engage in a discussion about the value of entry-level residencies or compare this model of clinical education to other models? What is the framework, and what are the tools?

We don’t seem to consider the need to evaluate the outcome of a clinical education experience in the same way that we have begun to quantify the outcomes of a clinical intervention. I am not dismissing the efforts to articulate the clinical performance standards,3 nor am I ignoring all of the effort to validate and standardize the Clinical Performance Instrument (CPI).4 Rather, I am arguing that we need another and very different tool that reports the bottom line.

Isn’t it important to know how many times—and with what types of patients—the student matched the history and impairments with the correct diagnosis?

Isn’t it important to know how many times—and with what types of patients—the student safely and efficiently provided the most effective intervention that led to clinically significant improvement in physical performance?

Answers to these questions require more than self-reports by students and clinical instructors on a standardized form; more than graduation from an accredited program; more than performance on a licensure examination. Without information about the student's actual clinical competence, it seems that we have only one portion of the picture. Without this information, how do we know that a student actually delivers an effective plan of care and whether the patient responds? Is it possible to have an excellent clinical instructor who is not an excellent clinician, and, if so, what is the reference standard against which to compare student and clinical instructor self-report? It seems that we need a series of tools to examine the success of clinical education. Why isn’t this a research priority?

I believe that knowledge about how effectively students deliver care during a clinical education experience is essential to engage in a meaningful conversation about the best model for clinical education. I also believe that we need to know how students perform so that we can compare their performance with that of novice and master clinicians, identify gaps in knowledge and skills, and improve the quality of care delivery. These types of studies are another source of information that should help us define best practice. Of course, my plea for performance measures associated with quality of care assumes that we have all agreed upon and implemented standardization for best practice. So we need one research agenda—and we need all of the researchers at the same table.

A systematic review by Choudhry et al5 serves as an example of the kind of discussions that we need to begin to have in our profession. The investigators examined the relationship between the amount of physician clinical experience and the quality of care. They identified 59 papers that examined quality of care. Studies were categorized into 4 groups based on outcome assessment, that is, knowledge assessment; adherence to standards of care for diagnosis, screening, or prevention; adherence to standards of care for therapy; or health outcomes.

The conclusion was that physicians who practice longer might be at risk for providing lower-quality care. You can imagine the controversy raised by these findings, but at least the question could be asked! In September, Choudhry joined PTJ for a podcast conversation titled “Clinic-Level Factors Affect Quality of Care of Patients with Low Back Pain: What's the Next Step?” (www.ptjournal.org/misc/podcasts.dtl). In this discussion of a research report by Resnik et al,6 Choudhry offered insights from the larger health care arena. I urge you all to listen to it.

I am not alone in pleading for the development of meaningful clinical performance assessment. But as you read both Ward's and Delitto's addresses, ask yourself how we are going to address the challenges that they introduce. Tony, thank you for giving us a specific model to consider. Let's now determine whether it is, in fact, a better model.


References

1. Ward RS. 2008 APTA Presidential Address: Our great opportunity. Phys Ther. 2008;88:1228–1230.
2. Delitto A. 39th Mary McMillan Lecture: We are what we do. Phys Ther. 2008;88:1219–1227.
3. Embracing Standards in Physical Therapist Clinical Education: A consensus conference on standards in clinical education. December 13–15, 2007; Alexandria, VA. Available at: http://www.apta.org/clinedconsensusdrftstds/.
4. PT CPI Web: Physical Therapist Clinical Performance Instrument. Available at: http://www.ptcpiweb.org.
5. Choudhry NK, Fletcher RH, Soumerai SB. Systematic review: the relationship between clinical experience and quality of health care. Ann Intern Med. 2005;142:260–273.
6. Resnik L, Liu D, Mor V, Hart DL. Predictors of physical therapy clinic performance in the treatment of patients with low back pain syndromes. Phys Ther. 2008;88:986–1004.