World Journal of Nursing Research
Research Article | Open Access | DOI: 10.31586/wjnr.2021.105

Assessing Clinical Skills of Nursing Students: A Triangulation Study to Explore Faculty Experiences and Feedback in Objective Structured Clinical Examination (OSCE)

Bindu John1,*, Gayathripriya Narayanan2, Munira Al-Sawad2 and Naseem Saeed Ali2
1 Department of Community Health Nursing, Lisie College of Nursing, Lisie Medical and Educational Institutions, Kaloor, Kochi, India - 682018
2 Nursing Department, College of Health Sciences, University of Bahrain, Kingdom of Bahrain

Abstract

Background and aim: Developing clinical skills and assessing them are among the most important components of nursing education, preparing students for the reality of practice. The objective structured clinical examination (OSCE) is extensively used and widely accepted by nurse educators across the globe to assess the clinical competencies of nursing students. The present study aimed to identify the attitudes and perceptions of faculty and to explore their feedback and experience in conducting OSCE as an assessment tool. Methods: A triangulation research approach was used with convenience sampling. Data were collected using questionnaires and semi-structured interviews. Participants were ten faculty members who were involved in conducting OSCE for students. Results and conclusion: Most of the faculty felt that OSCE reflected the skills needed to deliver safe patient care and that its structure reflected mastery of knowledge and skills related to course objectives. Faculty regarded OSCE as a consistent, reliable, valid, and objective measure for assessing students' performance and for improving students' confidence in clinical skills. Concerns were raised about the high level of stress in students, the time required for proper performance of tasks, OSCE scenarios lacking real-life situations, and the need for repeated practice and intensive mock training sessions. Applying OSCE to large numbers of students under limited human and material resources calls for rethinking and for developing complementary assessment strategies to improve the overall process.

1. Introduction

Developing clinical skills and assessing them are among the most important components of nursing education, preparing students for the reality of practice [1]. Nursing educators in various parts of the world have recognized the OSCE as an assessment approach and a benchmark for determining participants' level of clinical performance [2, 3, 4]. Beyond its use for assessment, the literature suggests that OSCEs are valued as a means of reflecting on students' performance, with identified benefits such as increased confidence upon successful completion and a measure of clinical learning skills [5].

OSCEs provide a simulated work environment in which students are given short assessment tasks and are assessed objectively using predetermined criteria or a checklist [6].

During an OSCE, students are observed and evaluated as they demonstrate clinical skills in a well-planned, structured way; the method therefore offers particular strengths in terms of objectivity among examiners as well as reliability and validity of the assessment [3, 7]. Moreover, it is viewed positively by faculty members for its elements of standardization, which reduce examination bias [3, 8].

OSCEs are intrinsically aligned with various curricular components, such as teaching, learning, and assessment [8]. They are intended to promote student engagement in behaviors and tasks and in the achievement of desired learning outcomes by providing real-life situations prior to clinical practice [6, 9, 10]. While there is general agreement about the process of OSCE and its objectivity and validity for evaluating skills and identifying gaps and weaknesses in performance, concerns have also been raised that short, skill-based OSCEs limit a 'whole person' assessment [11, 12], thus failing to provide an integrated approach [13]. Some studies have suggested paying attention to test content, design, and implementation factors, especially when the results are used for decision-making and process improvement [14].

While OSCEs offer high educational value, they have been regarded as heavily demanding of faculty time and effort, especially when assessing a large number of students [13]. It is therefore necessary to examine the perceptions and experiences of faculty in using OSCE as a learning and assessment tool. The purpose of this study was to identify the attitudes and perceptions of faculty and to explore their feedback and experience in conducting OSCE as an assessment tool, using a triangulation approach for process improvement.

2. Materials and Methods

2.1. Design

The study used a triangulation research approach with concurrent data collection through questionnaires and semi-structured interviews. Triangulation is the use of multiple approaches, chiefly qualitative and quantitative methods, to study the same phenomenon, with the aim of increasing study credibility [12, 15, 16].

2.2. Setting

The study was conducted in the nursing department of a large government university in Bahrain, which offers various baccalaureate programs, including nursing degree programs. The OSCE is a mandatory summative assessment in the second year, during the foundations of nursing course.

2.3. Participants

The participants of this study were ten nursing faculty who taught the foundations of nursing course. They were included if they had been involved in teaching the foundations of nursing course within the past two years and had conducted OSCE.

2.4. Sampling and Ethical Considerations

A convenience sampling method was used in this study. Two research team members explained the study procedure to the faculty, and their voluntary consent was obtained. The study procedure was approved by the University Scientific Research Ethics Committee (BADA/175/2017), dated 21 May 2017. The data were collected during June 2017.

2.5. Data Collection

Quantitative data were collected using a structured questionnaire consisting of 15 items. The demographic questions covered gender and three items related to teaching: years of experience in teaching nursing, experience in conducting OSCE, and the courses in which the respondent had previously conducted OSCE. The remaining 12 items pertained to faculty perceptions of and experience with OSCE and feedback, rated on a three-point Likert scale (disagree: 1; neutral: 2; agree: 3). The Cronbach's alpha of the questionnaire was 0.797. The questionnaire was distributed once the OSCE had begun and collected after one week.
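For readers who wish to check internal consistency on a similar instrument, the following is a minimal sketch of how Cronbach's alpha for a 12-item, three-point Likert questionnaire could be computed; the random responses and variable names are illustrative assumptions, not data from this study.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_var = scores.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the total scores
    return (k / (k - 1)) * (1 - item_var / total_var)

# Illustrative only: 10 respondents x 12 items rated 1-3 (disagree/neutral/agree).
rng = np.random.default_rng(42)
responses = rng.integers(1, 4, size=(10, 12))
print(f"Cronbach's alpha = {cronbach_alpha(responses):.3f}")
```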

For the qualitative data, face-to-face interviews were conducted with the 10 faculty members by two of the researchers, at a fixed time three to four days after OSCE completion. A semi-structured interview guide with a total of 20 questions was used. The questions pertained to the faculty's overall experience in conducting OSCE; their feedback and comments, including its advantages and disadvantages; OSCE's impact on students' learning and performance of patient assessment skills; and the applicability of OSCE to other nursing courses with clinical components. The guide was developed from a review of the best available literature in this area and in consultation with experts in qualitative research. The interviews took place in a private room over the course of about a week, each lasting 90 to 120 minutes, and were audio-recorded.

2.6. Data Analysis

Statistical analysis of the quantitative variables was performed using SPSS software, version 20. Descriptive analyses used frequencies, percentages, means, standard deviations, and Spearman's correlation. Qualitative data were analyzed using Colaizzi's seven-step model of condensing the data [17]. An initial analysis was performed by all four researchers, who independently reviewed the data. Written notes were made, and themes were identified from the verbatim transcripts. Themes were clustered after careful exploration of the data: meanings were identified, meaning units were condensed, and categories were formulated. The data were interpreted using a triangulation approach and categorized by blending the quantitative and qualitative responses. The main themes that emerged from the faculty interviews were identified alongside literature evidence to illustrate key issues, and the themes were categorized and organized.
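As a companion to the SPSS workflow described above, here is a minimal sketch of the same descriptive statistics (frequencies, percentages, mean, standard deviation) and Spearman's correlation using pandas and SciPy; the data frame and its column names are hypothetical illustrations, not the study's dataset.

```python
import pandas as pd
from scipy import stats

# Hypothetical responses from 10 faculty members; column names are invented
# for illustration and do not come from the study instrument.
df = pd.DataFrame({
    "teaching_experience": [3, 2, 3, 1, 2, 3, 2, 3, 3, 2],  # coded categories
    "osce_experience":     [1, 2, 1, 1, 2, 2, 1, 2, 2, 1],  # coded categories
    "item_safe_care":      [3, 3, 3, 3, 3, 3, 3, 3, 3, 3],  # 3-point Likert item
})

# Frequencies and percentages for one Likert item.
counts = df["item_safe_care"].value_counts().sort_index()
print(counts)
print((counts / len(df) * 100).round(1))

# Mean and standard deviation, as reported for each item in the Results.
print(df["item_safe_care"].agg(["mean", "std"]))

# Spearman's rank correlation between two ordinal variables.
rho, p = stats.spearmanr(df["teaching_experience"], df["osce_experience"])
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```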

3. Results

Ten faculty members responded to the questionnaire. One was male and the remaining nine were female. A majority had more than 16 years of teaching experience in nursing (Mean = 2.60 ± 0.69), and 50 percent had ≥ 5 years of experience in conducting OSCE (Mean = 1.5 ± 0.53) (Table 1).

3.1. Theme One: Faculty Perceptions and Experience of OSCE

Most of the faculty had a positive perception of OSCE for assessing students' competencies. They described their experiences in conducting OSCE, its various components, and its advantages and disadvantages. All the faculty reported that OSCE reflected the skills of delivering safe patient care (n = 10, 100 percent, Mean = 3.0 ± 0.00). Ninety percent reported that the practices were likely to be encountered in clinical practice (Mean = 2.9 ± 0.32).

“Conducting OSCE was an interesting and rich experience, assessing students’ competencies in a well-structured manner….it helps in direct observation of students and makes us more vigilant during procedures”.

“The students can practice in a stress-free environment…without the pressure of a real patient and the fear of facing the consequences of a poorly-performed care”.

“Undergoing an OSCE is a very different experience…the practices are more or less like dealing with real-life patients’ encounters in clinical situations”.

Faculty also responded that, for beginners in the clinical area, OSCE provides a controlled environment in which students can perform their clinical skills and increase their confidence levels.

“OSCE helps to prepare student before entering the real clinical situation… students could focus on learning things before their actual clinical exposure”.

“OSCE fine tunes students’ skills and confidence levels in carrying out the procedures…while assessing the clinical competencies, it identifies weaker students and the need for improvement”.

All the faculty reported that OSCE's structure reflected mastery of knowledge and skills (n = 10, 100 percent, Mean = 3.0 ± 0.00). Positive statements from the faculty included OSCE's coverage of many skills related to course objectives, such as communication and critical thinking skills.

“OSCE helps to apply their communication and problem-solving techniques which they have acquired during the course of practice”.

“OSCE enables students to incorporate their critical thinking, ethical and professional decision-making skills learned from the clinical setting”.

Nearly 70 percent of the faculty agreed that OSCE scenarios were based on commonly encountered situations (Mean = 2.5 ± 0.85). However, a few felt that the scenario-based learning in OSCE lacked the reality of practice.

“OSCE reflects realistic situation-based skill assessment…the scenarios are relevant to the clinical situations that they come across in everyday practice”.

“OSCE is like a pseudo assessment…. the scenarios are fictitious…it is not a real-life situation for the students to perform”.

3.2. Theme Two: Impact of OSCE on the Assessment of Students' Performance and Development

A majority of the faculty reported that OSCE was consistent and reliable for assessing students' performance (90 percent, Mean = 2.9 ± 0.32). Many appreciated OSCE's impact on enhancing students' competence as well as on increasing their confidence for future professional practice.

“It was my first experience in conducting OSCE, and it ensured a valid, objective and reliable evaluation of student’s performance…”.

“OSCE helps the students in learning complex skills significant to the practice…it improves their ability and confidence in their performance and competence, in dealing with real-life patients”.

The faculty explained how the OSCE measures knowledge and skills accurately. Most articulated that the specific guidelines and the checklist used for the examination helped them assess students objectively.

“OSCE assesses student’s ability to handle unpredictable patient behavior using clear enumerated steps and guidelines provided in the checklists…the viva voce objectively assesses students’ knowledge about the scientific principles involved with patient care procedures”.

“OSCE checklist reflects all the components such as interpersonal, technical, critical thinking, based on a specific situation…. it’s very comprehensive and assesses all the three domains learned…cognitive, affective and psychomotor”.

Eighty percent of the faculty felt that OSCE helps students execute tasks in an integrated way (Mean = 2.8 ± 0.42). All the faculty agreed that OSCE feedback helped in students' development (n = 10, 100 percent, Mean = 3.0 ± 0.00). Two of the faculty expressed concern about the high level of stress in students and felt that undue exam anxiety prevents an accurate assessment of students' knowledge.

“OSCE helps students to apply their knowledge and skills in an integrated way…..it prepares the students for a ‘whole person’ assessment”.

“The feedback increases students’ confidence, especially when it is positive….it helps them to reflect on their strengths and to identify their weaknesses”.

“The students go through a very high level of stress during the examination which can prevent proper measurement of students’ knowledge and skills and lead to inaccurate evaluation”.

3.3. Theme Three: Process of Conducting OSCE

The summary of findings is given in Table 2. The sub-themes that emerged included: preparation for OSCE and its management; structure and administration of OSCE; and adequacy of physical resources and infrastructure.

3.4. Theme Four: The Need for Change in OSCE and Suggestions for Improvement

The summary of findings from theme four is given in Table 3. The sub-themes that emerged included suggestions for improvement in the management, delivery, infrastructure, and resources of OSCE.

3.5. Theme Five: Challenges Involved in OSCE Appraisal

While the majority of the faculty agreed that OSCE provides objectivity in assessment, some felt that it merely aided students in recalling the procedure. Nearly sixty percent of the faculty felt that OSCE was a waste of time and resources, and burdensome to students, if teachers are not adequately prepared (n = 6, Mean = 2.3 ± 0.94). All the faculty reported that OSCE requires a lot of passion and self-motivation from students, and that it requires a lot of preparation and mock training sessions (n = 10, 100 percent, Mean = 3.0 ± 0.00 each).

“Students memorize things and they learn it only for the sake of exam. They are not passionate in achieving the main intended outcomes of the course or the skills…”.

“Conducting OSCE is a challenge for students and staff….it requires training at the organizational level to ensure its effectiveness and it needs many resources”

“OSCE involves many skills…. the students need to have a lot of preparation and take enormous effort for skill development…. they should have one or two mock sessions”.

One of the faculty felt that OSCE is used only for assessing basic nursing skills, which they felt should change. Two of the faculty stated that conducting OSCE is a very tedious experience, especially when assessing a large number of students, and suggested involving additional trained examiners.

“I feel there is a need to apply OSCE in assessing other nursing clinical competencies”.

“It was my first involvement with OSCE, and I feel shattered and exhausted about the whole thing… it is very strenuous and demanding for the faculty”.

“OSCE is time consuming; especially when it comes to appraising large number of students … there should be more examiners to conduct the exam so that there is no overload for the faculty”.

The overall drawbacks perceived by faculty in the OSCE assessment are shown in Figure 1.

3.6. Theme Six: Applicability of OSCE in Other Courses of the BSN Curriculum

Faculty opinion varied when they were asked about the applicability of OSCE to other courses in the BSN curriculum. Some suggested that it could be applicable in only a few courses.

“Since OSCE is an objective method of assessment, it would be useful if it is applied to the majority of courses with a clinical component… but it has to be studied well before its implementation”.

“It might be good to apply it as a comprehensive exam to other courses, provided it is organized well and equipped with adequate resources….”.

However, two faculty members felt that it could not be applied to other courses.

“I am foreseeing the applicability of OSCE in only adult health nursing course, but not in any other courses”.

4. Discussion

The findings of the present study indicate that faculty members generally perceived OSCE positively and underlined its value, despite it being taxing and time consuming.

Our results are similar to those of the study conducted by Mitchell, Henderson [18], which recommended OSCE as a safe method of assessing students' performance and the applicability of their knowledge in practice. Clinical skill laboratories can provide a realistic and safe environment for students by allowing enough freedom to practice and to make errors, unlike the real clinical environment, where errors can have real consequences [1, 19]. However, it has also been recommended that OSCE be used in conjunction with other relevant student evaluation methods [20].

With regard to the likelihood of OSCE practices being encountered in clinical practice, our results contrast with the findings of Houghton, Casey [1], where the use of mannequins was rated by participants and clinical staff as unrealistic, owing to limitations in developing communication skills and performing certain procedures compared with real patients. While hands-on learning in the clinical area is better than the lab setting [12], skill laboratories can serve as an alternative clinical learning exposure that develops students' confidence without compromising patient safety [20], particularly when actual clinical exposure is limited.

Most of our faculty agreed that OSCE scenarios were based on commonly encountered situations, whereas in another study, students reported the need for OSCE performance to match clinical situations and for inconsistencies to be resolved [21]. A high level of similarity and relevance to actual clinical practice was reported by Mitchell, Henderson [12], and was also found to increase students' confidence. Studies by Salinitri, O'Connell [22] and by Hatamleh and Sabeeb [23] reported similar results, with OSCE perceived by students as highly realistic and relevant to actual clinical practice for assessing problem-based learning. The use of standardized patients for student assessment is practiced in some places [24]. However, replicating the complexities of clinical practice in a simulated environment remains challenging [6, 24].

Providing feedback to students about their OSCE performance can be motivating, offering an opportunity for self-development by identifying individual areas of deficit [4, 21]. Solà, Pulpón [26] recommended OSCE as a multi-method evaluation strategy for student assessment that can provide high-impact training when adequate feedback is given. Feedback also helped students recognize that achieving overall competencies is obligatory for clinical practice [21] and was found to increase their satisfaction [13]. Student preparation is another means of reducing the overwhelming stress experienced during the examination process [26], and all our faculty reported that OSCE requires a lot of passion and self-motivation from students, similar to the study conducted by Arab and Boker [27]. Preparation can include introducing the assessment format and the expected clinical skills of OSCE, offering advice and answering questions about its administration, providing a virtual learning environment, and scheduling practice sessions to refine specific skills [26].

Similar to our results, OSCE has been reported in previous research as highly demanding, expensive, and time consuming, and thereby requiring more investment in preparation and organization [20]. Even though it has been suggested that more mock training sessions can minimize negative perceptions of this type of assessment, this can be unfeasible owing to lack of time in the curriculum grid, shortage of physical resources, very busy faculty schedules, and a large number of students [13].

In the present study, variability among clinical examiners during the assessment of students' competencies was recognized as a crucial element and an adverse factor affecting reliability [28]. A critical review of OSCE test content and design, including scientific standard-setting procedures [26]; involvement in teaching a skill prior to its evaluation [2]; training raters; using a panel of assessors to evaluate examinees; and providing extra time for skill assessment have been suggested in some studies to increase reliability and validity [14, 26, 27]. Examiner fatigue has also been reported as an adverse factor affecting reliability [4, 29].

The main limitation of this study was its small sample size, since participation was restricted to faculty who had taught the course and conducted the OSCE assessment. Its major strength, however, is that it gives a broad perspective on faculty members' experiences of OSCE assessment and the challenges they faced while performing it.

5. Conclusion

The results of the present study support the view that OSCE is a valid and reliable method of assessing student competencies. However, the study highlights the need to rethink the practicality and applicability of this assessment strategy, especially where human and material resources are limited, in order to improve the overall process.

Acknowledgements

We would like to thank Ms. Muyassar Sabri, Coordinator, BSN Program in our college and all the faculty who participated in this study. We would like to extend our appreciation towards Dr. Elizabeth Samson, Reader and Professor, Rajiv Gandhi Institute of Technology, Kerala, India, for her tremendous effort in editing the paper.

References

  1. Houghton CE, Casey D, Shaw D, Murphy K. Staff and students' perceptions and experiences of teaching and assessment in Clinical Skills Laboratories: interview findings from a multiple case study. Nurse Education Today. 2012;32(6):e29-34. doi: 10.1016/j.nedt.2011.10.005.
  2. Byrne E, Smyth S. Lecturers' experiences and perspectives of using an objective structured clinical examination. Nurse Education in Practice. 2008;8(4):283-9. doi: 10.1016/j.nepr.2007.10.001.
  3. Rushforth HE. Objective structured clinical examination (OSCE): review of literature and implications for nursing education. Nurse Education Today. 2007;27(5):481-90. doi: 10.1016/j.nedt.2006.08.009.
  4. Selim AA, Ramadan FH, El-Gueneidy MM, Gaafer MM. Using Objective Structured Clinical Examination (OSCE) in undergraduate psychiatric nursing education: is it reliable and valid? Nurse Education Today. 2012;32(3):283-8. doi: 10.1016/j.nedt.2011.04.006.
  5. Yoo MS, Yoo IY, Lee H. Nursing students' self-evaluation using a video recording of Foley catheterization: effects on students' competence, communication skills, and learning motivation. Journal of Nursing Education. 2010;49(7):402-5. doi: 10.3928/01484834-20100331-03.
  6. Major DA. OSCEs – seven years on the bandwagon: the progress of an objective structured clinical evaluation programme. Nurse Education Today. 2005;25(6):442-54. doi: 10.1016/j.nedt.2005.03.010.
  7. Brosnan M, Evans W, Brosnan E, Brown G. Implementing objective structured clinical skills evaluation (OSCE) in nurse registration programmes in a centre in Ireland: a utilisation focused evaluation. Nurse Education Today. 2006;26(2):115-22. doi: 10.1016/j.nedt.2005.08.003.
  8. Nulty DD, Mitchell ML, Jeffrey CA, Henderson A, Groves M. Best practice guidelines for use of OSCEs: maximising value for student learning. Nurse Education Today. 2011;31(2):145-51. doi: 10.1016/j.nedt.2010.05.006.
  9. El-Nemer A, Kandeel N. Using OSCE as an assessment tool for clinical skills: nursing students' feedback. Australian Journal of Basic and Applied Sciences. 2009;3(3):2465-72.
  10. Meyers NM, Nulty DD. How to use (five) curriculum design principles to align authentic learning environments, assessment, students' approaches to thinking and learning outcomes. Assessment & Evaluation in Higher Education. 2009;34(5):565-77. doi: 10.1080/02602930802226502.
  11. Crossley J, Jolly B. Making sense of work-based assessment: ask the right questions, in the right way, about the right things, of the right people. Medical Education. 2012;46(1):28-37. doi: 10.1111/j.1365-2923.2011.04166.x.
  12. Mitchell ML, Henderson A, Jeffrey C, Nulty D, Groves M, Kelly M, Knight S, Glover P. Application of best practice guidelines for OSCEs: an Australian evaluation of their feasibility and value. Nurse Education Today. 2015;35(5):700-5. doi: 10.1016/j.nedt.2015.01.007.
  13. Troncon LE. Clinical skills assessment: limitations to the introduction of an "OSCE" (Objective Structured Clinical Examination) in a traditional Brazilian medical school. Sao Paulo Medical Journal. 2004;122:12-7. doi: 10.1590/s1516-31802004000100004.
  14. Turner JL, Dankoski ME. Objective structured clinical exams: a critical review. Family Medicine. 2008;40(8):574-8.
  15. Hussein A. The use of triangulation in social sciences research. Journal of Comparative Social Work. 2009;4(1):106-17. doi: 10.31265/jcsw.v4i1.48.
  16. Creswell JW. Steps in conducting a scholarly mixed methods study. DBER Speaker Series, Paper 48; 2013. https://digitalcommons.unl.edu/dberspeakers/48
  17. Morrow R, Rodriguez A, King N. Colaizzi's descriptive phenomenological method. The Psychologist. 2015;28(8):643. http://eprints.hud.ac.uk/id/eprint/26984/1/Morrow_et_al.pdf
  18. Mitchell ML, Henderson A, Groves M, Dalton M, Nulty D. The objective structured clinical examination (OSCE): optimising its value in the undergraduate nursing curriculum. Nurse Education Today. 2009;29(4):398-404. doi: 10.1016/j.nedt.2008.10.007.
  19. Yap K, Bearman M, Thomas N, Hay M. Clinical psychology students' experiences of a pilot objective structured clinical examination. Australian Psychologist. 2012;47(3):165-73. doi: 10.1111/j.1742-9544.2012.00078.x.
  20. Smrekar M, Ledinski Fičko S, Hošnjak AM, Ilić B. Use of the objective structured clinical examination in undergraduate nursing education. Croatian Nursing Journal. 2017;1(1):91-102. doi: 10.24141/2/1/1/8.
  21. John S, Deshkar AM. Evaluation of Objective Structured Clinical Examination (OSCE): physiotherapy students' perception. National Journal of Integrated Research in Medicine. 2014;5(3).
  22. Salinitri FD, O'Connell MB, Garwood CL, Lehr VT, Abdallah K. An objective structured clinical examination to assess problem-based learning. American Journal of Pharmaceutical Education. 2012;76(3). doi: 10.5688/ajpe76344.
  23. Hatamleh W, Abu Sabeeb Z. Perception of nursing faculty members on the use of Objective Structured Clinical Examinations (OSCE) to evaluate competencies. IOSR Journal of Nursing and Health Science (IOSR-JNHS). 3(6):21-6. http://www.iosrjournals.org/iosr-jnhs/papers/vol3-issue6/Version-4/E03642126.pdf
  24. Mavis BE, Ogle KS, Lovell KL, Madden LM. Medical students as standardized patients to assess interviewing skills for pain evaluation. Medical Education. 2002;36(2):135-40. doi: 10.1046/j.1365-2923.2002.01070.x.
  25. Traynor M, Galanouli D. Have OSCEs come of age in nursing education? British Journal of Nursing. 2015;24(7):388-91. doi: 10.12968/bjon.2015.24.7.388.
  26. Solà M, Pulpón AM, Morin V, Sancho R, Clèries X, Fabrellas N. Towards the implementation of OSCE in undergraduate nursing curriculum: a qualitative study. Nurse Education Today. 2017;49:163-7. doi: 10.1016/j.nedt.2016.11.028.
  27. Arab AA, Boker AM. Experience with the objective structured clinical examination in the Saudi board exam of anesthesia: residents' and examiners' perspectives. Medical Education Journal of Anesthesia. 2018;25(1):77-86.
  28. Alsaid AH, Al-Sheikh M. Student and faculty perception of objective structured clinical examination: a teaching hospital experience. Saudi Journal of Medicine & Medical Sciences. 2017;5(1):49. doi: 10.4103/1658-631X.194250.
  29. McLaughlin K, Ainslie M, Coderre S, Wright B, Violato C. The effect of differential rater function over time (DRIFT) on objective structured clinical examination ratings. Medical Education. 2009;43(10):989-92. doi: 10.1111/j.1365-2923.2009.03438.x.


Copyright

Copyright © 2023 by authors and Science Publications. This is an open access article and the related PDF distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
