Across the Foundation’s priority areas, our grantees are working to improve the health of the public through innovative research and programs. The Foundation awards up to 40 grants on a rotating schedule each year.
Innovative Tools for Evaluating Interprofessional Competencies
To more rigorously assess the interprofessional competencies of students who have completed this course, UCLA will develop a set of assessment tools that will evaluate interprofessional competencies in both classroom and clinical practice settings. Tools will include an iPad app designed to allow instructional leaders, such as coaches and mentors, to assess actual collaborative practices through observations and walk-throughs in clinical IPE settings.
Researchers will take advantage of the “natural experiment” that half of the medical students take this course interprofessionally and half do not, to measure changes in students’ knowledge, skills, attitudes and behaviors.
The researchers will test and refine the assessment tools, exploring the usefulness, feasibility and reliability of each. With the help of a national advisory group, they will then broadly disseminate the tools. By sharing these tools with the wider community, UCLA hopes to increase the ability of interprofessional education programs to assess learner outcomes and evaluate program effectiveness, and to expand the evidence that IPE enhances learners’ perceptions, knowledge and skills for collaborative practice.
Interprofessional education is taking root across the country, but we need to more rigorously evaluate the effectiveness of these programs and assess learner outcomes so that we can understand and replicate what works best. That’s what UCLA is working on.
“At UCLA, we’ve been bringing nursing and medical students together for five years. But one of the biggest frustrations we’ve faced is that the only tools for evaluation that are out there are based on self-reporting,” said LuAnn Wilkerson, EdD, senior associate dean for medical education and professor of medicine at UCLA. “While helpful in program planning, self-reporting tools alone didn’t take us as far as we wanted to go. We want to know what has changed in terms of students’ knowledge, skills, behaviors, experiences, and judgment,” she said.
With a grant from the Macy Foundation, an interprofessional team of investigators from medicine, nursing and evaluation sciences is developing six innovative new tools to evaluate interprofessional competencies.
“Doctors and nurses have worked together for hundreds of years, but because we have historically worked and been educated in silos, we’ve created barriers to effective collaboration and teamwork,” said Courtney Lyder, RN, dean of the UCLA School of Nursing, and Co-PI on the study. “With the take-up of interprofessional education, we’re beginning to break down those silos. The tools we are creating at UCLA will help us understand how to do it better,” he said.
“We wanted to focus on being able to contribute new tools for assessing student collaboration and not a new way of bringing them together,” said Wilkerson.
The set of tools includes the following:
1. Knowledge Test. Based on the IPEC Core Competencies for Interprofessional Collaborative Practice, this set of 49 questions and answers can be customized for use in a variety of settings. The test consists of 10 questions on collaborative practice, 18 on teamwork, and 21 on roles and responsibilities, including systems-based practice questions in areas such as quality improvement and patient safety. The tool can be used as a pre-test before an interprofessional course or experience to assess what students know, and again at the end of the program to see how their knowledge has changed.
2. Zaption Video Assessment. This user-friendly tool takes learners through three different interprofessional scenarios that they may encounter in practice—for example, a care team discussing the next steps for a patient who has been participating in a clinical trial. Every so often the action pauses, and viewers must answer a question to assess their judgment, thinking, and interprofessional communication skills. Answers are archived in the tool’s database, which allows faculty to evaluate individuals or cohorts of students over time.
3. E-Walk—a direct observation tool. Using an iPad, this “walk-through” tool can be used by faculty in a classroom, clinical or simulation setting to assess students’ collaborative practice. It assesses how an individual performs in a team on five aspects: communication, collaboration, roles, client- and patient-centered care, and conflict. The tool also has a team observation option. Faculty can even print out their reports and share them with the students being observed. The observation rubric uses the TeamSTEPPS model.
4. Objective Structured Clinical Exam Case. In the case, which can be used in an OSCE, a senior medical student and nurse practitioner working in a congestive heart failure outpatient clinic have a patient who has lost his job and can no longer afford the medications that have been prescribed to him. Working together, and using the checklist that is provided, the students must decide what to tell the patient and how best to proceed. This case tests students’ communication and behaviors.
5. Implicit Association Test. This test, based on a tool developed by Harvard’s Project Implicit (Implicit.harvard.edu), helps nurses and physicians in training identify the stereotypes, beliefs, values, and intrinsic biases they hold about one another’s roles. Not only does the tool help students recognize their own biases, it also helps them see how those biases may affect their ability to engage in collaborative practice.
6. 360-Degree Evaluation. This tool, based on aspects of the Interprofessional Collaborator Assessment Rubric (Curran, et al.), will facilitate evaluation of an individual learner by the multiple people working on a team with that person.
The UCLA team is currently piloting these tools. In a baseline OSCE case for senior medical students, the team found that the medical students had difficulty connecting with the nurse. Only 24 percent of the medical students asked the nurse for patient-specific information, even though he or she had just been with the patient overnight, while 40 percent of the students asked the nurse to complete a task, such as giving the patient oxygen or medication.
“What this shows is that when you get closer to real practice, the classroom isn’t translating as much as it should be,” said Wilkerson. “We are seeing wonderful results with communication – students are being clear, direct and respectful, but we are not seeing the two professions work on tasks together or make decisions together. These tools are helping us look at real behavior, and understand how to refine the courses and the exercises we are teaching,” she added.
Wilkerson and Lyder’s work is being guided by a national advisory group that includes faculty members from around the country and the National Center for IPE at the University of Minnesota. They plan to share the assessment tools with the wider community once they have been fully tested.