Ready for a challenge? Need a change? Looking for an alternate method to evaluate students' clinical performance? That's exactly what an adventuresome team of professors within the college diploma nursing program did during the past academic year. Niagara College's directive to explore and examine alternate teaching-learning approaches provided the impetus to develop and pilot the Objective Structured Clinical Evaluation (O.S.C.E.) method. In addition, the continued shrinking of clinical resources in the institutional setting and the increased need to use community resources with reduced clinical supervision created the need for an avenue to examine students' clinical progress in a controlled setting.
O.S.C.E. is an objective method using clearly defined learning outcomes and a predetermined rating scale. This form of testing has been widely used in medical education (Harden, Stevenson, Downie and Wilson, 1975; Harden and Gleeson, 1979; Petrusa, Blackwell and Ainsworth, 1986). It has also been used in several Canadian university nursing programs as one of the tools to evaluate student clinical performance (McKnight et al., 1987; Ross et al., 1988). As well, the ambulance program at Niagara has been using this format successfully to test ambulance students at the end of the academic year.
Clinical evaluation of students is always an area of controversy and concern. Students often register complaints about variation in teachers' expectations and about subjectivity in grading; hence the need for a more objective form of evaluation using programmed clients in a controlled environment that closely simulates the clinical setting.
A task force consisting of three professors and a laboratory technician met weekly during the fall semester to plan, develop, test and pilot the O.S.C.E. method. Essential Term III clinical learning outcomes were identified from course content. This was done with the assistance of the classroom theory professor. The list was reviewed to ensure that the outcomes were representative of and clearly reflected the standards of nursing practice (College of Nurses of Ontario, 1991) and covered all aspects of the nursing process. The list of skills to be tested was revised to make the tasks manageable and a time-line for completion of tasks was established.
Essential learning outcomes were separated into those which could either be tested by a written quiz or by skill performance. Objectives for each learning outcome were written, scenarios developed, student and set up instructions outlined, and criteria for rating determined. The efforts of the laboratory technician in preparing equipment, organizing set ups, and directing those assisting in the actual O.S.C.E. process were phenomenal.
Term III professors were kept informed of the progress and their suggestions and comments were welcomed. Once the materials were completed, the Term III professors were asked to share the objectives with students and to participate with the clinical students in a trial run of two scenarios. These were: how to assess the healing of a surgical wound, and how to provide appropriate caloric replacement for a client with diabetes who had not fully consumed all items on a lunch tray. Classmates exchanged the roles of client, rater and participant for each of the two scenarios, thereby giving each student the opportunity to experience an O.S.C.E. prior to deciding to volunteer for the O.S.C.E. pilot. The use of at least two raters for each skill performance station was essential to ensure inter-rater reliability and, hence, objectivity.
Students volunteering to participate in the actual O.S.C.E. pilot were assured their clinical grade for the term would not be influenced by their performance on the O.S.C.E. This relieved some of the anxiety always present when performing a skill in a timed setting while being closely observed.
The day of the O.S.C.E. pilot, for project volunteers, consisted of an orientation session, a rotation around ten timed skill stations, a refreshment break, a written quiz and a debriefing session. During the debriefing session students were shown the criteria for rating a skill performance and allowed to question the rationale and/or verbalize their feelings about the experience. An evaluation of the experience by participants concluded the pilot for students.
Senior student volunteers assisted with the O.S.C.E. experience as programmed clients with preselected conditions requiring nurse intervention, or as the second rater along with a professor at skill stations. This was an exciting learning experience for them. They also helped prepare the set-ups, monitor stations and guide students along the established route. Credit for clinical time was given to those volunteering to participate in the O.S.C.E. as simulated clients or raters and to those being examined for skill performance.
Student responses were mixed. Those assisting in the experience wished they had been allowed to circulate to more stations to observe student performances. Those participating as examinees felt it was a worthwhile experience, albeit anxiety producing. Most wanted immediate feedback and/or some coaching along the way. They suggested the incorporation of the problem situation scenarios into regular laboratory sessions.
Student scores on the skill performance rating scales were helpful in identifying student and program strengths and areas requiring greater emphasis. Areas such as health teaching related to drug administration and the assessment of client health conditions were identified as needing strengthening.
The task force felt the pilot project was an overwhelming success. The materials used for the pilot project as well as the results were assembled for students and faculty as a resource for future use. Areas needing revision were noted, such as the need for expanded orientation of raters, and the need for more clearly defined instructions of where to start a procedure.
In an effort to share the experience and the results with other faculty using clinical resources and needing alternate teaching-learning approaches, a follow-up experiential workshop presentation of the O.S.C.E. was made at Humber College early in June. Betty Ann, a third member of the task force, played our client needing post-operative assessment. The potential to use the O.S.C.E. for prior learning assessment was clearly identified. Incorporation of the O.S.C.E. into regular laboratory experiences in the newly developed nursing program is being instituted.
Further developmental work is needed in determining the O.S.C.E.'s place as a determinant of the students' clinical grade. The task of developing, using and presenting the O.S.C.E. certainly was challenging, exciting and rewarding - a welcome alternative and change in approach for clinical evaluation.
College of Nurses of Ontario. (1991). Standards of nursing practice for registered nurses and registered nursing assistants. (Toronto: Author).
Harden, R., Stevenson, M., Downie, W., and Wilson, G. (1975). Assessment of clinical competence using objective structured examination. British Medical Journal, 1, 447-451.
Harden, R., and Gleeson, F. (1979). Assessment of clinical competence using an objective structured clinical examination (O.S.C.E.). Medical Education, 13, 41-54.
McKnight, J., Rideout, E., Brown, B., Ciliska, D., Patton, D., Rankin, J., and Woodward, C. (1987). The objective structured clinical examination: An alternative approach to assessing student clinical performance. Journal of Nursing Education, 26 (1), 39-41.
Petrusa, E., Blackwell, T., and Ainsworth, M. (1986). Performance of internal medicine house officers on a short-station O.S.C.E. Research paper, University of Texas.
Ross, M., Carroll, G., Knight, J., Chamberlain, M., Fothergill-Bourbonnais, F., and Linton, J. (1988). Using the O.S.C.E. to measure clinical skills performance in nursing. Journal of Advanced Nursing, 13, 45-56.
Linda Willms and Mary Sirotnik are the Co-ordinators of Nursing at Niagara College in Welland, Ontario.