Methods of assessment used by osteopathic educational institutions

Published: August 27, 2012 · DOI: https://doi.org/10.1016/j.ijosm.2012.07.002

      Abstract

      Background

The methods used to assess students in osteopathic teaching institutions are not widely documented in the literature. A number of commentaries on clinical competency assessment have drawn on the health professional assessment literature, particularly that of medicine.

      Objective

      To ascertain how osteopathic teaching institutions assess their students and to identify issues associated with the assessment process.

      Design

      A series of focus groups and interviews was undertaken with osteopathic teaching institutions.

      Participants

      Twenty-five participants across eleven osteopathic teaching institutions from the United Kingdom, Canada, Italy and Australia.

      Results

Four themes were identified from the focus groups: Assessing; Processes; Examining; Cost Efficiency. Institutions utilised assessment types such as multiple choice questions and written papers in the early years of a program and progressed towards the long case assessment and Objective Structured Clinical Examination in the later stages. Although examiner cost and training were common concerns across the institutions, both were perceived as necessary for developing and conducting assessments.

      Conclusion

Most institutions relied on traditional assessment methods such as the long case assessment; however, there is increasing recognition of newer forms of assessment, such as the portfolio. The assessment methods employed were typically written assessments in the early years of a program, progressing to long case and Objective Structured Clinical Examination format assessments.

