Leveraging Institutional Tools and Resources to Track Clinical and Translational Research Services

Clinical Researcher—February 2018 (Volume 32, Issue 2)

PEER REVIEWED

Beatrice A. Boateng, PhD; Amy Jenkins, MS, CCRP, CCRC, CCRA; Anthony McGuire; Rhonda Jorden, MBA; Laura P. James, MD

[DOI: 10.14524/CR-17-0027]

Among the many challenges they face when conducting clinical and translational research, investigators must write detailed protocols, ensure compliance with a wide range of regulatory issues, and seek input and assistance from specialists such as biostatisticians, community engagement experts, and others. Less experienced or early-stage investigators may not know where to go within an institution to receive help in planning and conducting studies.

With support from a Clinical and Translational Science Award (CTSA) received in 2009, the University of Arkansas for Medical Sciences (UAMS) established the Translational Research Institute (TRI) and consolidated research services across the campus. In doing so, it became apparent that the institution did not have a robust process for tracking usage of research services, which is important for efficiently providing research support to the clinical and translational research enterprise.

The goals were therefore 1) to make it easier for researchers to find the help they need, and 2) to develop a system that allows continuous monitoring of requests so that services can be provided efficiently.

Background

In the third quarter of 2014, TRI project leaders developed a process for tracking the types of services needed by researchers. This was initially set up as an electronic form that included contact information; project title; institutional review board (IRB) identification (ID) number; a menu of available services (plus a general option for services not on the menu); a brief description of the request; and the ability to upload related documentation. The form was embedded into the TRI website, and investigators who called or e-mailed a request were redirected to request services through the form.
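To make the data concrete, the record behind such a form can be sketched as a simple data structure; the field names below are illustrative assumptions, not the actual TRI form schema.

```python
# Minimal sketch of a service-request record as described above.
# Field names are hypothetical, not the actual TRI form schema.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ServiceRequest:
    requester_name: str
    requester_email: str
    project_title: str
    irb_id: Optional[str]                  # IRB identification (ID) number, if assigned
    services: list = field(default_factory=list)      # selections from the service menu
    other_service: Optional[str] = None    # general option for services not on the menu
    description: str = ""                  # brief description of the request
    attachments: list = field(default_factory=list)   # uploaded supporting documents
```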

While the researchers simply completed a form, the research services coordinator had to manually enter the information into a Microsoft Access® database. Although the process was time-consuming and labor-intensive, it provided baseline information on the types of research services needed by UAMS researchers. It also allowed the institution to build capacity in areas that were lacking.

In 2015, TRI conducted an internal evaluation of the process and determined that it did not adequately capture the full scope of a single request or all the research services utilized through it. Problematic areas included:

  • One form completed by a researcher could lead to the utilization of multiple research services, and the process in place did not track this.
  • It was difficult to determine which services were over-utilized or under-utilized, information that would allow resources to be shifted to over-utilized but under-resourced service areas.
  • The institution rarely captured timely feedback from researchers on their experiences and satisfaction when interacting with service providers.

The TRI project leaders subsequently worked with key stakeholders to identify, categorize, and create workflows for the 25 research services available through TRI. Recognizing the need for continuous engagement of key stakeholders, monthly meetings were held to update stakeholders on the development of the system. TRI then explored tools within the institution that could be modified for its needs and be integrated into existing databases.

The goal was to standardize processes and provide a single gateway for requesting research services. TRI also wanted to better track the utilization and impact of the services offered. The solution, therefore, had to have two parts: the researcher view that enabled researchers to easily request services, and the administrative view that allowed TRI to track requests in real time.

Identifying the Right Tool for the Job

From the researcher perspective, TRI wanted to continue offering a simple form that allowed researchers to easily request services without entering redundant information, such as their contact details. The aim was for researchers to provide enough information to minimize miscommunication about the type of services needed and to allow the service area to process the request.

From the administrative point of view, TRI envisioned a system that could capture real-time data from multiple sources into a single dashboard that would allow leaders to view key metrics, such as the number and type of requests and the quality of, and satisfaction with, the research services provided.

The search led TRI to a service management tool used by the institutional information technology (IT) department to address campus-wide, computer-related issues. The tool was already integrated with institutional human resources databases, allowing for easy capture of the contact information of researchers requesting services. Another benefit of using an existing product was widespread familiarity with the tool throughout the institution, given its prior use by the IT department.
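That HR integration meant a requester's contact details could be prefilled rather than re-entered. The following is a minimal sketch of the idea, with a stand-in dictionary in place of the institutional directory; none of these names reflect the vendor tool's actual API.

```python
# Stand-in for the institutional HR directory; the real system queried
# existing institutional databases rather than a hard-coded dict.
HR_DIRECTORY = {
    "jdoe": {"name": "Jane Doe", "email": "jdoe@example.edu",
             "department": "Pediatrics"},
}

def prefill_contact(user_id: str) -> dict:
    """Return directory contact fields so the requester never re-enters them."""
    record = HR_DIRECTORY.get(user_id)
    if record is None:
        raise KeyError(f"no directory entry for {user_id!r}")
    return {"requester_name": record["name"],
            "requester_email": record["email"],
            "department": record["department"]}

print(prefill_contact("jdoe"))
# {'requester_name': 'Jane Doe', 'requester_email': 'jdoe@example.edu',
#  'department': 'Pediatrics'}
```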

The tool enabled TRI to build two interfaces: a simple form that allows researchers to easily select the services they need, and an administrative portal with dashboards that provide real-time visual representation of metrics and key performance indicators over time. The administrative portal also allows administrators to process incoming requests; view in-depth details about the requests (e.g., the IRB numbers and investigators’ demographics); track the fulfillment of those requests; track communication between TRI staff and the investigators; run metrics reports; and request feedback from researchers requesting services (see Figure 1). The data obtained are then shared with key leadership for data-driven decision-making.

Figure 1: Graphic Representation of the User and Administrative Back-End
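As a rough sketch of what the administrative side tracks per request (status, communication trail, closure), the following uses hypothetical class and field names rather than the vendor tool's actual data model.

```python
# Illustrative sketch of per-request tracking in the administrative
# portal; names and structure are assumptions, not the vendor tool's API.
from dataclasses import dataclass, field
from datetime import datetime
from enum import Enum

class Status(Enum):
    OPEN = "open"
    IN_PROGRESS = "in progress"
    COMPLETED = "completed"
    CLOSED = "closed"

@dataclass
class TrackedRequest:
    request_id: str
    irb_id: str                                     # links the request to its study
    services: list = field(default_factory=list)    # component services requested
    status: Status = Status.OPEN
    messages: list = field(default_factory=list)    # (timestamp, text) pairs

    def log_message(self, text: str) -> None:
        """Record a communication between TRI staff and the investigator."""
        self.messages.append((datetime.now(), text))

    def close(self) -> None:
        self.status = Status.CLOSED

def open_request_count(requests: list) -> int:
    """Dashboard-style metric: requests not yet closed."""
    return sum(1 for r in requests if r.status is not Status.CLOSED)
```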

Tracking Metrics for Research Services

TRI’s attempts to track research service utilization have undergone multiple iterations since 2014. One goal was to determine whether the new process implemented in 2016 made any difference in key metrics such as the number and type of services requested and the quality of, and satisfaction with, the services provided; both are expanded upon in the following sections.

Number and Type of Services Requested

The service management tool has allowed for better capture of the number and type of services requested. After the launch of the new system in March 2016, TRI staff noticed an increase in service requests. Monitoring the data for six months (see Figure 2) revealed that the unit was still not capturing all the services utilized; only the number of individual requests was known, not the number of services within each request.

A modification to the workflow implemented in November 2016 resolved the deficiency. For instance, TRI received 64 individual requests in January 2017 and was able to translate the specifics into a total of 244 service requests (an average of three to four research services per individual request). In addition to the number of requests, staff were able to capture the services most utilized (see Figure 3) and to view, in real time, the number of requests that are completed and closed and the number still open.

Figure 2: Research Services Requested Over Time (Through March 2017)

Figure 3: The 10 Most Requested Research Services (Through March 2017)
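The counting change can be illustrated with a short tally that expands each submitted request into its component services before counting; the sample data below are invented, and only the January 2017 totals cited above come from the article.

```python
from collections import Counter

# Each submitted request lists the individual services it draws on.
requests = [
    ["protocol development", "biostatistics"],
    ["biostatistics", "regulatory support", "community engagement"],
    # ... one entry per request received in the month
]

service_tallies = Counter(s for services in requests for s in services)

n_requests = len(requests)
n_services = sum(service_tallies.values())
print(f"{n_requests} requests -> {n_services} service requests "
      f"(avg {n_services / n_requests:.1f} services per request)")
print(service_tallies.most_common(10))   # basis for a "top 10 services" view

# January 2017 per the article: 64 requests -> 244 service requests,
# i.e., 244 / 64 ≈ 3.8 services per individual request.
```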

The most utilized service is a protocol development resource for investigators writing protocols for investigator-initiated research. Researchers are first provided with a variety of tools and customizable templates to write their draft protocol. Prior to IRB submission and upon request, skilled staff review the document for coherence of objectives, study design, and methodology. They further work with researchers to make recommended changes based upon local and federal regulations, IRB policies, Good Clinical Practice guidelines, and other considerations.

The general requests category is used when researchers have a question or request that does not fit into any of the other described service categories. Examples of requests falling into this category include questions about local research processes or policies, assay availability, and getting connected with someone in a particular area of expertise. Should a general request fit into one of the established services after further inquiry, TRI staff redirect the request to that service area.

The use of the service management tool has led to increased awareness of the types of services offered and, as noted in Figure 2, a significant increase in the number of services requested. TRI staff are also beginning to track the potential impact of these services. Anecdotal feedback indicates that researchers who utilize the protocol development services are less likely to have issues with their protocols when submitted to the IRB. In addition, the biostatistics services offered through TRI provide research design and analysis assistance to researchers.

Researchers are encouraged to seek consultations early in the development of their research or grant ideas. With more than 50 faculty and staff, the biostatistics team contributed to securing more than $91 million in national, peer-reviewed grant funding for researchers from 2014 to 2016. This amount includes $41 million received for the institution to act as a data coordinating center for a pediatric clinical trials network.

Quality and Satisfaction With Services Provided

Assessing satisfaction with services provided is an integral part of the business community, but is rarely adopted in the research services community. TRI wanted a simple, validated instrument to capture various aspects of research services, so staff modified SERVQUAL, a validated instrument that assesses service quality on five dimensions.1 SERVQUAL has been widely used in service-oriented industries1 and in hospital settings.2–5

Our final instrument included 10 items assessing three dimensions of quality applicable to research services, each rated on a Likert scale ranging from strongly disagree (1) to strongly agree (7). The dimensions were: 1) Responsiveness, the willingness to help and respond to investigator needs (four items); 2) Reliability, the ability to perform research services dependably and accurately (three items); and 3) Empathy, the extent to which caring, individualized service is provided (three items). The instrument also included two items on a 10-point Likert scale assessing expectations and overall satisfaction with the research services provided, and two spaces for open-ended feedback (e.g., on what we could have done differently and any particular person or issue that stood out).
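For illustration, scoring such an instrument by dimension might look like the following sketch; the item keys are hypothetical placeholders, since the article does not publish the item wording.

```python
from statistics import mean

# Hypothetical item keys grouped by the three dimensions described above.
DIMENSIONS = {
    "responsiveness": ["resp1", "resp2", "resp3", "resp4"],  # four items
    "reliability": ["rel1", "rel2", "rel3"],                 # three items
    "empathy": ["emp1", "emp2", "emp3"],                     # three items
}

def dimension_scores(responses: dict) -> dict:
    """Average the 1-7 Likert ratings within each quality dimension."""
    return {dim: mean(responses[item] for item in items)
            for dim, items in DIMENSIONS.items()}

answers = {"resp1": 7, "resp2": 6, "resp3": 7, "resp4": 6,
           "rel1": 7, "rel2": 7, "rel3": 6,
           "emp1": 7, "emp2": 6, "emp3": 7,
           # the two 10-point items are reported separately
           "expectations": 9, "overall_satisfaction": 10}

print(dimension_scores(answers))   # e.g., {'responsiveness': 6.5, ...}
```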

A 2014 pilot of the instrument, conducted prior to the implementation of the service management tool, yielded an 85.5% satisfaction rate. Subsequently, we observed improvements in perceptions of the quality of services provided, with satisfaction rates greater than 95% for most of the research service areas. We now provide monthly service utilization reports and annual satisfaction reports to each of the service areas.

Challenges and Limitations

TRI staff faced several challenges in striving to accurately capture the metrics needed to evaluate research services. Working with the IT department to understand the workflow was the most challenging aspect, in that it required (and continues to require) frequent communication. The main issue encountered was the inability to capture all the services requested (i.e., one researcher request was counted as one service requested rather than as the potentially multiple services per request).

Access to research services is through the TRI website. Although the website link points directly to the research services form, customization of the service management tool’s menu is not possible, and the menu displays other options for services used by the IT department. For instance, IT has menu links titled “Create a New Incident” and “Create a New Problem,” which can be confusing terminology for researchers. While a separate instance of the tool could address the menu issue, maintaining a separate account is not currently cost effective.

Finally, institutional updates to the service management tool have periodically disrupted TRI’s established research workflows, for example by re-sending completed requests as new ones or losing requests submitted during an update period. Despite these challenges, the processes now in place are providing data and information that could not be captured before. TRI will continue to use the service management tool while monitoring and addressing issues as they arise.

Conclusion

Through this process, TRI staff have observed significant improvements in communication between researchers and service areas. This can be credited, in part, to including key stakeholders in the design and implementation of the new processes.

TRI has eliminated underutilized services, better categorized services, and expanded the services offered from 25 to 31. Staff have minimized investigator and provider issues by providing more clarity and specificity to the service request form. They can also track metrics such as number and type of services requested and cost of services provided (estimated cost savings and monetary value of using TRI services). The dashboard ensures that TRI can streamline and standardize research processes, and identify opportunities for improvement.

Overall, data-driven decision-making is increasingly important for forecasting and predictive measures. The service management tool has provided the TRI unit at UAMS with a way to track and assess utilization of research services in real time.

References

  1. Parasuraman A, Zeithaml VA, Berry LL. 1988. SERVQUAL: a multiple-item scale for measuring consumer perceptions of service quality. Journal of Retailing 64(1):12–40.
  2. Babakus E, Mangold WG. 1992. Adapting the SERVQUAL scale to hospital services: an empirical investigation. Health Services Research 26(6):767–86.
  3. Martin S. 2003. Using SERVQUAL in health libraries across Somerset, Devon and Cornwall. Health Information and Libraries Journal 20:15–21.
  4. Dean AM. 1999. The applicability of SERVQUAL in different health care environments. Health Marketing Quarterly 16:1–21.
  5. Anderson EA. 1995. Measuring service quality at a university health clinic. International Journal of Health Care Quality Assurance 8:32–7.

All authors listed below are affiliated with the University of Arkansas for Medical Sciences.

Beatrice A. Boateng, PhD, (bboateng@uams.edu) is Director of Evaluation for the Translational Research Institute.

Amy Jenkins, MS, CCRP, CCRC, CCRA, is Executive Director of the Translational Research Institute.

Anthony McGuire is a Senior Healthcare Informatics Analyst.

Rhonda Jorden, MBA, is Vice Chancellor for Information Technology and Chief Information Officer.

Laura P. James, MD, is Director of the Translational Research Institute and Associate Vice Chancellor for Clinical and Translational Research.