Assessing Clinical Skills of Nursing Students: A Triangulation Study to Explore Faculty Experiences and Feedback in Objective Structured Clinical Examination (OSCE)

Background and aim: Developing clinical skills and assessing them are among the most important components of nursing education, preparing the student for the reality of practice. The objective structured clinical examination (OSCE) is extensively used and widely accepted by nurse educators across the globe to assess the competency skills of nursing students. The present study aimed to identify the attitudes and perceptions of faculty and to explore their feedback and experience in conducting OSCE as an assessment tool. Methods: A triangulation research approach was used with convenience sampling. Data collection was carried out using questionnaires and semi-structured interviews. Participants were ten faculty members who were involved in conducting OSCE for students. Results and conclusion: Most of the faculty felt that OSCE reflected the skills of delivering safe patient care, and that its structure reflected mastery of knowledge and skills related to course objectives. OSCE was regarded by the faculty as a consistent, reliable, valid, and objective measure to assess students' performance and to improve students' confidence in clinical skills. Concerns were raised about a high level of stress in students, the time required for the proper performance of tasks, OSCE scenarios lacking real-life situations, and the need for repeated practice and intensive mock training sessions. The limited applicability of OSCE, given constraints in human and material resources with a large number of students, would necessitate rethinking and developing other assessment strategies to improve the overall process.


Introduction
Developing clinical skills and assessing them are among the most important components of nursing education, preparing the student for the reality of practice [1]. Nursing educators in various parts of the world have recognized the OSCE as an assessment approach and a benchmark for determining participants' level of clinical performance [2][3][4]. Apart from its use for assessment purposes, the literature suggests that OSCEs are appreciated as a means of reflecting students' performance, with identified benefits such as increasing students' confidence upon successful completion and serving as a measure of their clinical learning skills [5].
OSCEs provide a simulated work environment where the students are given short assessment tasks, and are assessed objectively using a pre-determined criteria or checklist [6].
During an OSCE, students are observed and evaluated on their demonstration of clinical skills in a well-planned, structured way; it therefore offers particular strengths in terms of objectivity among examiners as well as reliability and validity as an assessment method [3,7]. Moreover, it is viewed positively by faculty members, with elements of standardization that reduce examination bias [3,8].
OSCEs are intrinsically aligned with various curricular components, such as teaching, learning and assessment [8]. They are intended to promote student engagement in behaviors and tasks and achievement of desired learning outcomes by providing real-life situations prior to clinical practice [6,9,10]. While there is general agreement about the process of OSCE and its objectivity and validity for evaluating skills and identifying gaps and weaknesses in performance, concerns have also been raised that short, skills-based OSCEs limit a 'whole person' assessment [11,12], thus failing to provide an integrated approach [13]. Some studies have suggested paying attention to test content, design and implementation factors, especially when the results are used for decision making and process improvement [14].
While a major strength of OSCEs is their high educational value, they have been regarded as heavily demanding in terms of faculty time and effort, especially when it comes to assessing a large number of students [13]. It is therefore necessary to examine the perceptions and experiences of faculty in using OSCE as a learning/assessment tool. The purpose of this study was to identify the attitudes and perceptions of faculty and to explore their feedback and experience in conducting OSCE as an assessment tool, using a triangulation approach for process improvement.

Design
The study used a triangulation research approach with concurrent data collection using questionnaires and semi-structured interviews. Triangulation is defined as the use of multiple approaches, mainly qualitative and quantitative methods, in studying the same phenomenon, for the purpose of increasing study credibility [12,15,16].

Setting
The study was conducted in the nursing department of a large government university in Bahrain, which offers various baccalaureate programs including nursing degree programs. The OSCE forms a mandatory summative assessment in the second year, during the foundations of nursing course.

Participants
The participants of this study were ten nursing faculty who taught the foundations of nursing course. They were included if they had been involved in teaching the course within the past two years and had conducted an OSCE.

Sampling and Data Collection
A convenience sampling method was used in this study. Two research team members explained the study procedure to the faculty, and their voluntary consent was obtained. The study procedure was approved by the University Scientific Research Ethical Committee (BADA/175/2017) dated 21 May 2017. The data were collected during June 2017.

Data Collection
Quantitative data collection was carried out using a structured questionnaire consisting of 15 items. The demographic questions included gender and three items related to teaching: years of experience in teaching nursing, experience in conducting OSCE, and the courses in which the respondent had previously conducted OSCE. The remaining 12 items pertained to faculty perceptions of and experience with OSCE and their feedback, rated on a three-point Likert scale (disagree: 1; neutral: 2; agree: 3). Cronbach's alpha for the questionnaire was 0.797. The questionnaire was distributed once the OSCE had begun and collected after one week.
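The internal-consistency coefficient reported above (Cronbach's alpha = 0.797) was computed in SPSS. For readers who wish to reproduce such a coefficient from raw item scores, a minimal Python sketch follows; the data here are illustrative only, not the study's actual responses.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of respondents' totals
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Illustrative data: 10 respondents x 12 items on a 3-point Likert scale,
# correlated by construction so that the items "hang together".
rng = np.random.default_rng(0)
base = rng.integers(1, 4, size=(10, 1))
scores = np.clip(base + rng.integers(-1, 2, size=(10, 12)), 1, 3)
alpha = cronbach_alpha(scores)
```

The formula is the standard one, alpha = k/(k-1) × (1 − Σ item variances / total-score variance); items that all measure the same construct push alpha toward 1.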
For the qualitative data, face-to-face interviews were conducted with the 10 faculty members by two of the researchers at a fixed time, three to four days after OSCE completion. A semi-structured interview guide with a total of 20 questions was used. The questions pertained to the faculty's overall experience of conducting OSCE; their feedback and comments, including its advantages and disadvantages; the OSCE's impact on students' learning and performance of patient assessment skills; and the applicability of OSCE to other nursing courses with clinical components. The guide was developed from a review of the best available literature in this area and in consultation with experts in qualitative research. The interviews took place in a private room over the course of about a week; each lasted approximately 90-120 minutes and was audio-recorded.

Data Analysis
Statistical analysis of quantitative variables was performed using SPSS software version 20. Descriptive analyses were undertaken using frequencies, percentages, means and standard deviations, along with Spearman's correlation. Qualitative data analysis was performed using Colaizzi's seven-step model of condensing the data [17]. An initial analysis was performed by all four researchers, who independently reviewed the data. Written notes were made, and themes were identified from the verbatim transcripts. The themes were clustered after careful exploration of the data: meanings were identified, meaning units condensed, and categories formulated. The data were interpreted using a triangulation approach and categorized by blending the quantitative and qualitative responses. The main themes that emerged from the faculty's interview responses were identified alongside literature evidence to illustrate key issues, and the themes were categorized and organized.
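The descriptive statistics quoted throughout the Results (mean ± SD on the 3-point scale, agreement percentages) and Spearman's correlation were computed in SPSS; an equivalent sketch in Python is shown below, using hypothetical Likert responses rather than the study's data.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical responses from 10 faculty to two of the 12 Likert items
# (1 = disagree, 2 = neutral, 3 = agree); NOT the study's actual data.
item_a = np.array([3, 3, 2, 3, 3, 2, 3, 3, 3, 2])
item_b = np.array([3, 2, 2, 3, 3, 2, 3, 2, 3, 2])

mean_a = item_a.mean()                    # the reported "Mean"
sd_a = item_a.std(ddof=1)                 # the reported "±" (sample SD)
agree_pct = 100 * (item_a == 3).mean()    # percentage who agreed
rho, p = spearmanr(item_a, item_b)        # Spearman's rank correlation
```

Spearman's correlation is the appropriate choice here because Likert responses are ordinal; it correlates ranks rather than raw values.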

Results
Ten faculty members responded to the questionnaire. Among them, one was male and the remaining were female. A majority had over 16 years of teaching experience in nursing (Mean = 2.60 ± 0.69 on the coded response scale), and 50 percent had ≥ 5 years of experience in conducting OSCE (Mean = 1.5 ± 0.53) (Table 1).

Theme One: Faculty Perceptions and Experience of OSCE
Most of the faculty had a positive perception of OSCE for assessing students' competencies. They described their experiences of conducting OSCE, its various components, and its advantages and disadvantages. The entire faculty reported that OSCE reflected the skills of delivering safe patient care (n = 10, 100 percent, Mean = 3.0 ± 0.00). Ninety percent reported that the practices were likely to be encountered in clinical practice (Mean = 2.9 ± 0.32).
"Conducting OSCE was an interesting and rich experience, assessing students' competencies in a well-structured manner….it helps in direct observation of students and makes us more vigilant during procedures".
"The students can practice in a stress-free environment…without the pressure of a real patient and the fear of facing the consequences of a poorly-performed care".
"Undergoing an OSCE is a very different experience…the practices are more or less like dealing with real-life patients' encounters in clinical situations".
Faculty also responded that, for beginners in the clinical area, OSCE provides a controlled environment for students to perform their clinical skills and to increase their confidence levels.
"OSCE helps to prepare student before entering the real clinical situation… students could focus on learning things before their actual clinical exposure".
"OSCE fine tunes students' skills and confidence levels in carrying out the procedures…while assessing the clinical competencies, it identifies weaker students and the need for improvement".
The entire faculty reported that the OSCE's structure reflected mastery of knowledge and skills (n = 10, 100 percent, Mean = 3.0 ± 0.00). Positive statements from the faculty noted that the OSCE reflected many skills related to course objectives, such as communication and critical thinking.
"OSCE helps to apply their communication and problem-solving techniques which they have acquired during the course of practice".
"OSCE enables students to incorporate their critical thinking, ethical and professional decision-making skills learned from the clinical setting".
Nearly 70 percent of the faculty agreed that the OSCE scenarios were based on commonly encountered situations (Mean = 2.5 ± 0.85). However, a few of them felt that the scenario-based learning in OSCE lacked the reality of practice.
"OSCE reflects realistic situation-based skill assessment…the scenarios are relevant to the clinical situations that they come across in everyday practice".
"OSCE is like a pseudo assessment…. the scenarios are fictitious…it is not a real-life situation for the students to perform".

Theme Two: Impact of OSCE on the Assessment of Students' Performance and Development
A majority of the faculty reported that OSCE was consistent and reliable for assessing students' performance (90 percent, Mean = 2.9 ± 0.32). Many appreciated the OSCE's impact on enhancing students' competence and on increasing their confidence for future professional practice.
"It was my first experience in conducting OSCE, and it ensured a valid, objective and reliable evaluation of student's performance…".
"OSCE helps the students in learning complex skills significant to the practice…it improves their ability and confidence in their performance and competence, in dealing with real-life patients".
The faculty explained how the OSCE measures knowledge and skills accurately. Most articulated that the specific guidelines and checklist used for the examination helped them assess the students objectively.
"OSCE assesses student's ability to handle unpredictable patient behavior using clear enumerated steps and guidelines provided in the checklists…the viva voce objectively assesses students' knowledge about the scientific principles involved with patient care procedures".
"OSCE checklist reflects all the components such as interpersonal, technical, critical thinking, based on a specific situation…. it's very comprehensive and assesses all the three domains learned…cognitive, affective and psychomotor".
Eighty percent of the faculty felt that OSCE helps students execute tasks in an integrated way (Mean = 2.8 ± 0.42). All the faculty agreed that OSCE feedback helped in students' development (n = 10, 100 percent, Mean = 3.0 ± 0.00). Two of the faculty expressed concern about the high level of stress in students and felt that undue exam anxiety prevents accurate assessment of students' knowledge.
"OSCE helps students to apply their knowledge and skills in an integrated way…..it prepares the students for a 'whole person' assessment".
"The feedback increases students' confidence, especially when it is positive….it helps them to reflect on their strengths and to identify their weaknesses".
"The students go through a very high level of stress during the examination which can prevent proper measurement of students' knowledge and skills and lead to inaccurate evaluation".

Theme Three: Process of Conducting OSCE
A summary of the findings is given in Table 2. The sub-themes that emerged included: preparation for OSCE and its management; structure and administration of OSCE; and adequacy of physical resources and infrastructure.

Theme Four: The Need for Change in OSCE and Suggestions for Improvement
A summary of the findings from theme four is given in Table 3. The sub-themes that emerged included suggestions for improvement in the management, delivery, and infrastructure and resources of OSCE.

Theme Five: Challenges Involved in OSCE Appraisal
While the majority of the faculty agreed that the OSCE offers objectivity in assessment, some felt that it merely aided the students in recalling the procedure. Nearly sixty percent of the faculty felt that OSCE was a waste of time and resources and burdensome to students if teachers are not adequately prepared (n = 6, Mean = 2.3 ± 0.94). The entire faculty reported that OSCE requires considerable passion and self-motivation from students, as well as a great deal of preparation and mock training sessions (n = 10, 100 percent, Mean = 3.0 ± 0.00 each).
"Students memorize things and they learn it only for the sake of exam. They are not passionate in achieving the main intended outcomes of the course or the skills…".
"Conducting OSCE is a challenge for students and staff….it requires training at the organizational level to ensure its effectiveness and it needs many resources".
"OSCE involves many skills…. the students need to have a lot of preparation and take enormous effort for skill development…. they should have one or two mock sessions".
One of the faculty felt that OSCE is used only for assessing basic nursing skills, which has to change. Two of the faculty stated that conducting OSCE is a very tedious experience, especially when assessing a large number of students, and suggested involving additional trained examiners.
"I feel there is a need to apply OSCE in assessing other nursing clinical competencies".
"It was my first involvement with OSCE, and I feel shattered and exhausted about the whole thing… it is very strenuous and demanding for the faculty".
"OSCE is time consuming; especially when it comes to appraising large number of students … there should be more examiners to conduct the exam so that there is no overload for the faculty".
The overall drawbacks perceived by faculty in the OSCE assessment are shown in Figure 1. The entire faculty commented that detailed instructions about the examination process, both verbal and written, were given prior to the OSCE. Explanations were given using a PowerPoint presentation prepared in agreement with all the faculty involved in teaching the course. In addition, a mock training session was carried out before the OSCE.
"The students were given the checklist for the procedure and were instructed about the time frame of the exam, along with a discussion session to clarify their doubts. Moreover, a two-hour orientation was carried out with clear guidelines about the exam".
"A mock rehearsal session for the exam was carried out prior to conducting OSCE, to acquaint the students to the method of examination".
While many of the faculty felt that the OSCE was well-structured and well-administered for assessing competencies, some commented that it missed the 'analyzing component' of the given case scenario.
"Yes, I think the exam is well-structured and uses properly designed evaluation tools to assess the students' competencies at the beginner level".
"The OSCE is well-conducted, except for the analyzing part of the case by the students".
Some of the faculty felt that an inter-rater variability in the OSCE would affect students' evaluation, whereas a majority of them reflected that it would not affect the evaluation.
"To some extent the inter-rater variability can affect the evaluation, especially if they are not much experienced in conducting OSCE".
"If the students are rated through a well-structured checklist and a conscious rating by the faculty, it will not influence the evaluation".
A majority of the faculty (60 percent, Mean = 2.2 ± 1.03) stated that the physical resources, including the labs, were adequate, while others suggested improving the facilities by providing enough space and better lab organization for a better working environment.
Most of them stated that the students were able to engage in practice sessions in the skill lab (90 percent, Mean = 2.8 ± 0.63).
"The labs are well-equipped for the students to practice…. the students are free to engage in self-practice sessions at their own pace…"
"The lab resources are insufficient…it needs to be advanced with effective simulators…..require having more space, equipment and skill lab facilities so that the complete examination improves…".
One of the faculty felt that OSCE should be incorporated as a formative assessment rather than being integrated into the summative assessment of the course.

"OSCE needs to be used as a formative assessment method instead of a summative one".
Some faculty suggested improving the assessment criteria and reviewing the process of conducting OSCE.
"I suggest in improving the assessment criteria and reviewing the process to help students in patient management....".
Faculty suggested that conducting mock drill sessions and holding the OSCE assessment prior to clinical practice, instead of as an end-of-training assessment, could help the students and improve the whole OSCE process.
"Improving the scenarios to make it more realistic and providing intensive mock training sessions would better prepare the students…"
"I suggest in providing a psychological preparation to help the students to better perform their skills….".
"Implementing OSCE prior to the clinical experience gives students more confidence in their real encounters with patients …"
A few of the faculty recommended providing a relaxed environment for students and considering the time allotted for performing skills while conducting the exam.
"The time in performing the skills need to be considered…also a relaxed environment provided may help the students to collect their thoughts and perform in a more desirable way".
"The infrastructure in terms of environment, space, OSCE stations, equipments and its arrangement should be done to create a feel of real-life situation…the number of stations in the same room can be increased".
One faculty member suggested that a clinical-based examination with the involvement of an external examiner would help improve the assessment process. Another suggested providing training for the examiners and giving students immediate feedback after the exam.

"Clinical examination with a trained external examiner is the best…. OSCE can be applied if a state of art lab is provided with a lab technician… also I suggest to have a well-developed OSCE checklist which is piloted, before it is applied on a uniform basis"
"Providing training for the examiners in conducting the exam will help the examiners…. providing immediate feedback to the students after the exam would improve fairness and give an insight into the whole process and the reality of practice".
Some of the faculty felt that the time factor was a major barrier to proper practice sessions for students. Many suggested improving the lab hours.
"The students need to have more practice time while demonstrating complex skills…. the time has to be extended for each procedure in the lab while learning…"
"The lab hours need to be rearranged with enough practice, so that we could examine the students well. It also helps the students to do more self-practice sessions".
"Students can be provided with videos for repeated practice sessions which can be tailor-made as per the necessity of the student".

Theme Six: Applicability of OSCE in Other Courses of the BSN Curriculum
Faculty opinion varied when they were asked about the applicability of OSCE in other courses of the BSN curriculum. Some suggested that it could be applicable in only a few courses.
"Since OSCE is an objective method of assessment, it would be useful if it is applied to majority of courses with a clinical component… but it has to be studied well before its implementation".
"It might be good to apply as a comprehensive exam to other courses, provided it is organized well and equipped with adequate resources….".
However, two faculty members felt that it could not be applied to other courses.
"I am foreseeing the applicability of OSCE in only adult health nursing course, but not in any other courses".

Discussion
The findings of the present study indicate that faculty members generally perceived OSCE positively and underlined its value, despite it being taxing and time consuming.
Our results were similar to those of Mitchell, Henderson [18], who recommended OSCE as a safe method of assessing students' performance and the application of knowledge to practice. Clinical skill laboratories can provide a realistic and safe environment for students, allowing them the freedom to practice and to make errors, which would not be trivial in the real clinical environment [1,19]. However, it has also been recommended that OSCE be used in conjunction with other relevant student evaluation methods [20].
With regard to the likelihood of OSCE practices being encountered in clinical practice, our results contrast with the findings of Houghton, Casey [1], where the use of mannequins was rated by the participants and clinical staff as unrealistic, owing to limitations in developing communication skills and in performing certain procedures, compared to real patients. While hands-on learning in the clinical area is preferable to the lab setting [12], skill laboratories can serve as an alternative clinical learning exposure to develop students' confidence without compromising patient safety [20], especially when actual clinical exposure is limited.
Most of the faculty agreed that the OSCE scenarios were based on commonly encountered situations, whereas in another study, students reported the need for OSCE performance to match clinical situations and to resolve inconsistencies [21]. A high level of similarity and relevance to actual clinical practice has been reported by Mitchell, Henderson [12], which was also found to increase students' confidence. Studies by Salinitri, O'Connell [22] and by Hatamleh and Sabeeb [23] reported similar results, with OSCE perceived by students as highly realistic and relevant to actual clinical practice for assessing problem-based learning. The use of standardized patients for student assessment is practiced in some places [24]. However, replicating the complexities of clinical practice in a simulated environment is challenging and difficult [6,24].
Providing feedback to students about their OSCE performance can be motivating, offering an opportunity for self-development by identifying individual areas of deficit [4,21]. Solà, Pulpón [25] recommended OSCE as a multi-method evaluation strategy for student assessment that can provide high-impact training when accompanied by adequate feedback. Feedback also helped students recognize that achieving the overall competencies is obligatory for clinical practice [21] and was found to increase their satisfaction [13]. Student preparation is another means of reducing the overwhelming stress of the examination process [26]; the entire faculty in our study reported that OSCE requires considerable passion and self-motivation from students, similar to the findings of Arab and Boker [27]. Preparation can include introducing the assessment format and the clinical skills expected in the OSCE, offering advice and answering questions about its administration, providing a virtual learning environment, and scheduling practice sessions to refine specific skills [26].
Similar to our results, OSCE has been reported in previous research as highly demanding, expensive and time consuming, thereby requiring greater investment in preparation and organization [20]. Even though it has been suggested that more mock training sessions can minimize the negative perceptions of this type of assessment, this can be unfeasible owing to lack of time in the curriculum grid, shortage of physical resources, busy faculty schedules and a large number of students [13].
In the present study, variability among clinical examiners during assessment of students' competencies was recognized as a crucial element and an adverse factor affecting reliability [28]. A critical review of OSCE test content and design, including scientific standard-setting procedures [26], involvement in teaching a skill prior to evaluating it [2], training raters, having a panel of assessors evaluate examinees, and providing extra time for skill assessment have been suggested in some studies to increase reliability and validity [14,26,27]. Examiner fatigue has also been reported as an adverse factor affecting reliability [4,29].
The main limitation of this study was its small sample size, as participation was limited to faculty who had taught the course and conducted the OSCE assessment. Its major strength, however, is that it offers a broad perspective on faculty members' experiences of OSCE assessment and the challenges they faced while performing it.

Conclusion
The results of the present study support the view that OSCE is a valid and reliable method of assessing student competencies. However, the study highlights the need to rethink the practicality and applicability of this assessment strategy, especially where human and material resources are limited, in order to improve the overall process.