The Use of a Blended Simulation Model to Increase the Confidence of Non-Clinical Personnel in Performing Clinical Tasks

Clinical Researcher—September 2018 (Volume 32, Issue 8)

PEER REVIEWED

Erin Prettiman, MSN, RN, ACNS-BC; Donamarie N-Wilfong, DNP, RN; Therese Justus McAtee, DNP, RN, CEN, TNCC; Laura Daniel, PhD

 

Vital signs, electrocardiograms (ECGs), and phlebotomy tasks are critically important as the primary means of detecting changes in a patient's condition and the effectiveness of therapies. With such high-stakes decisions and assessments being made from these measurements and tasks, it is expected that they are completed with skilled proficiency. Typically, a practitioner with clinical training and experience, such as a nurse, fulfills these responsibilities. However, due to healthcare practitioner shortages and/or scheduling conflicts, many facilities rely on non-clinical research staff to perform these tasks, and many of these staff members lack adequate knowledge of, or clinical practice with, the procedures.

This article describes how a customized, clinical simulation–based course was designed and developed specifically for a non-clinical audience. In this case study, a large, multi-hospital healthcare network in a metropolitan area lacked adequate nursing and/or phlebotomy staff to take vital signs, perform ECGs, and draw blood samples for various clinical trials and research projects. Therefore, leadership instead required research coordinators to fulfill these clinical responsibilities.

Most of the available coordinators lacked both healthcare education and previous clinical experience, having instead been trained in the world of business and/or research. Nonetheless, the physician investigators of the studies quickly trained the research coordinators using the classic “see one, do one, teach one” method, and handed them a needle as they walked out to greet their first trial subject.

The informal training that the research coordinators received across the healthcare network lacked standardization and varied greatly over time and by instructor. This off-the-cuff training was quickly deemed insufficient, as managers reported that many research coordinators felt under-prepared and/or anxious about their newfound clinical responsibilities even after the physicians’ training. Therefore, managers from the research centers and the educational leaders from the network’s simulation lab forged a new partnership to create dynamic healthcare training for this unique population of non-clinical personnel as part of their onboarding program. This training was not sanctioned by any hospital or the institutional review board.

Methods

The newfound partnership between the simulation and research managers allowed for the creation of an innovative educational approach to teach vital sign assessment, ECG tracing, and phlebotomy to this unique population through a blended learning model. This model consisted of didactic teaching followed by hands-on simulations and skill proficiency evaluations using standardized competency checklists.

Learners were guided through theory during the didactic portion of the course with an extensive PowerPoint lecture and class discussion. Instructors began this session by teaching learners the proper methods for measuring temperature, pulse, blood pressure, pulse oximetry, and pain. The instructors then taught the blood collection system, highlighting the significance of laboratory tests, the specific collection tubes and their colors, and the proper procedure for preserving a collected sample. This was followed by a review of proper ECG lead placement and how to recognize artifact in a tracing.

Course instructors also wove clinical documentation topics, including the legal implications of inappropriate documentation, throughout the discussion. Instructors took great care throughout the lecture to avoid medical jargon and acronyms, assuming learners had no previous healthcare knowledge.

After the lecture, the learners practiced all the skills hands-on, using high-fidelity manikins and state-of-the-art task trainers as many times as they liked. Once they were satisfied with their own practice, they were then evaluated by the course instructors to ensure competency in each of the three domains.

The participants were asked to demonstrate the proper procedure, utilizing the requisite equipment, to accurately measure and assess predetermined vital signs on the manikins; to demonstrate accurate lead placement and tracing on an ECG task trainer; and to draw blood samples. Course instructors used standardized competency skills checklists to assess learner competency. Each domain had its own checklist, and the number of items varied: vital signs (48 items), ECG (20 items), and phlebotomy (28 items). These checklists required the course instructor to initial each item, verifying that he or she deemed the learner competent.

Furthermore, because phlebotomy is invasive by nature, participants were also given the opportunity to draw blood from live patients on clinical floors, under the supervision of an experienced phlebotomist. Learners were only permitted to partake in this experience after the course instructors deemed them competent on the simulators. These patients were research participants enrolled with the research institution.

The preceptor in the clinical setting provided learners with practical, timely feedback on their strengths and areas in need of improvement. If the preceptor found a learner lacking proficiency based on the competency checklist, that learner would be required to return to the simulation lab for remediation based on the preceptor’s feedback. None of the participants required this retraining, and all participants were permitted to practice independently.

As a final synthesizing exercise, the learners participated in a simulated case study to practice and refine the critical thinking skills needed in clinical research projects. In this simulation, learners role-played an investigational drug study. The learners were asked to attend to a research subject and use their reasoning skills to determine how to accurately document the subject’s visit. This simulation gave the learners invaluable practice with the nuances and intricacies of a non-textbook documentation case.

Results

All course participants were deemed competent by the course instructors. None of the participants had to return to the simulation center for additional practice.

The course’s pre/posttest asked learners (n = 29) to quantify how confident or unconfident they felt in fulfilling 14 job-related tasks on 7-point Likert scales, where higher scores represented more positive responses. These tasks covered the clinical aspects of their job responsibilities: the phlebotomy process (5 items), taking vital signs (6 items), and interpreting ECGs (3 items). The items were created and vetted by an interprofessional panel comprising a nurse educator, a nurse manager, a simulation expert, and a psychometrician, who evaluated them against current job responsibilities to ensure content validity. Scores were summed across all items to obtain an index score of confidence in fulfilling clinical job responsibilities, with a possible range of 14 to 98. Students’ individual pretest and posttest scores are shown in Figure 1.
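To make the scoring arithmetic concrete, the following minimal sketch (written in Python for illustration; it is not part of the study’s methods) sums 14 hypothetical Likert responses into a single confidence index. The response values are placeholders, not data from the study.

# Minimal sketch of the confidence-index scoring described above.
# The responses are hypothetical placeholders, not study data.
ITEM_COUNT = 14               # 5 phlebotomy + 6 vital signs + 3 ECG items
SCALE_MIN, SCALE_MAX = 1, 7   # each item rated on a 7-point Likert scale

responses = [4, 5, 3, 6, 7, 4, 5, 5, 6, 3, 4, 7, 6, 5]  # one learner's 14 ratings
assert len(responses) == ITEM_COUNT
assert all(SCALE_MIN <= r <= SCALE_MAX for r in responses)

confidence_index = sum(responses)
print(confidence_index)  # possible range: 14 (all items rated 1) to 98 (all items rated 7)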

 

Figure 1: Confidence Scores as a Function of Student and Administration Time

All but two students showed growth in clinical confidence after the course, and all measures of central tendency increased, as shown in Table 1. The variation in self-reported confidence scores also decreased after the training.

 

Table 1: Descriptive Statistics of Confidence Index Before and After Course

          Mean     Median   Mode     SD       Min      Max
Before    54.79    56.00    70.00    23.36    14.00    98.00
After     86.93    89.50    98.00    15.39    19.00    98.00

A Wilcoxon signed-rank test was used to determine whether students’ (n = 28) self-reported confidence levels in successfully completing their clinical responsibilities changed after taking the course. Indeed, the test showed a significant difference in the students’ overall confidence levels before and after the course, Z = 4.38, p < .01. In fact, all 14 of the individual items also showed significant differences in confidence ratings. SPSS 22.0 was used to conduct the descriptive and inferential statistics.{1}
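For readers who wish to reproduce this style of analysis without SPSS, the sketch below shows an equivalent computation in Python using SciPy. The study itself used SPSS 22.0; the pre/post arrays here are hypothetical placeholders rather than the study’s data, and SciPy reports the Wilcoxon W statistic (with an optional normal approximation) rather than the Z value that SPSS prints.

# Equivalent open-source sketch of the pre/post analysis (the study used SPSS 22.0).
# The score arrays are hypothetical placeholders, not the study's data.
import numpy as np
from scipy import stats

pre  = np.array([42, 55, 61, 38, 70, 49, 58, 66, 35, 52])   # pretest confidence indices
post = np.array([80, 88, 92, 74, 95, 85, 90, 96, 70, 83])   # posttest confidence indices

# Descriptive statistics analogous to Table 1
for label, scores in (("Before", pre), ("After", post)):
    print(label, scores.mean(), np.median(scores),
          scores.std(ddof=1), scores.min(), scores.max())

# Wilcoxon signed-rank test on the paired pre/post confidence indices
statistic, p_value = stats.wilcoxon(pre, post)
print(f"Wilcoxon W = {statistic}, p = {p_value:.4f}")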

Conclusion

This formal, competency-based simulation onboarding program for non-clinical personnel assigned clinical responsibilities empowered the employees with the competence and confidence needed to perform clinical tasks with proficiency. The educational investment in the employees yielded benefits beyond themselves, extending to the research subjects and to the larger research project. Research subjects experienced greater safety as more competent staff members drew their blood and assessed their vital signs. The research project experienced an increase in the reliability and validity of the data as staff members performed their tasks proficiently and uniformly.

This program demonstrates the applicability of simulation-based education to non-clinical populations. The blended learning model provided learners with time and education to grasp the theory behind the skills, and with hands-on simulation practice prior to any true research subject encounter. The simulation was self-directed, had immediate relevance to the learners’ jobs, and was problem-centered, thus satisfying preferences of adult learners as stated in Knowles’ 1984 theory of adult learning.{2}

Other hospitals and healthcare networks that are relying on non-clinical personnel to fulfill clinical responsibilities could model this onboarding program in their own institutions. Future research needs to extend this program beyond a single institution—gathering more participants and teasing out relationships between confidence levels and various independent factors, such as experience levels and education.

References

  1. IBM Corp. 2013. IBM SPSS Statistics for Windows, Version 22.0. Armonk, N.Y.
  2. Knowles MS. 1984. The adult learner: A neglected species (3rd ed.). Houston: Gulf.

Erin Prettiman, MSN, RN, ACNS-BC, (eward305@yahoo.com) is an Education and Development Specialist with the Simulation, Teaching, and Academic Research (STAR) Center at the Allegheny Health Network in Pittsburgh, Pa.

Donamarie N-Wilfong, DNP, RN, (DonaMarie.Wilfong@ahn.org) is Vice President of Simulation Education with STAR at the Allegheny Health Network.

Therese Justus McAtee, DNP, RN, CEN, TNCC, (Therese.JUSTUS@ahn.org) is Director of Interprofessional Education with STAR at the Allegheny Health Network.

Laura Daniel, PhD, (lhd613@gmail.com) is a Psychometrician with STAR at the Allegheny Health Network.