Global Self-Assessment of Competencies, Role Relevance, and Training Needs Among Clinical Research Professionals

Competency-based education and training has been defined and applied by several groups in the clinical research enterprise,1-4 mostly through approaches focused on specific roles (e.g., investigator, pharmaceutical physician, or clinical research nurse). However, the Joint Task Force for Clinical Trial Competency (JTF) aligned and harmonized the many role-centered statements into a single framework of eight domains and 51 associated core competencies defining professional competence across clinical research roles.5 The resulting JTF Core Competency Framework (CCF) has been widely published, presented at scientific meetings, and applied by numerous organizations worldwide.

The Association of Clinical Research Professionals bases its Career Development Pathway, its professional certification programs, and the structure of its annual Meeting & Exposition upon the CCF. Further, the Consortium of Academic Programs in Clinical Research and its member institutions have adopted the CCF to guide curriculum development and inform accreditation criteria for academic programs in clinical research.

The CCF is also being used to redefine job descriptions and support workforce development initiatives. For example, the Clinical and Translational Science Award (CTSA) Consortium has embraced the CCF as a structure for investigator and coordinator training.6

The JTF conducted a multinational survey of clinical research professionals, asking participants to self-assess their competence levels, rate the significance of the specific core competencies to their current professional activities, and report their perceived need for further training to enhance performance in their roles. The survey was a first attempt to validate clinical research professionals’ perceptions of the competence and relevance of the competencies, and it further assessed self-reported learning needs for each competency.


Survey Tool and Participant Recruitment

An electronic survey tool was developed (through the online SurveyMonkey platform) for ease of digital distribution and response. The questionnaire included a demographic component and an assessment of perceived competence, relevance, and educational need across each of the CCF’s 51 competencies.

Individuals working in clinical research, inclusive of the roles of principal/co-principal investigator (PI/CoPI), clinical research associate (CRA), clinical research coordinator/nurse (CRC/CRN), data management (DM) professional, educator/trainer, pharmaceutical physician/medical director, regulatory affairs (RA) professional, and research administrator (including clinical research/project manager [RM/PM]) were targeted as survey participants.

The researchers used a snowball sampling approach to survey dissemination that included outreach through personal/professional contacts, e-mail listservs, presentations, and social media. The active collaboration of professional associations was also sought.

The survey was launched on December 12, 2014, and was formally closed on July 1, 2015. Participation in the survey was anonymous, with the SSL (Secure Sockets Layer) feature of SurveyMonkey protecting participant confidentiality.

The survey tool was pilot tested at the University of Michigan7 and granted expedited approval by the Eastern Michigan University Human Subjects Review Committee. Further, the University of Michigan (U-M) Institutional Review Board issued a “not regulated” determination for U-M’s role in analysis of de-identified data.

Demographic parameters collected in the initial segment of the survey are described in the survey tool. Because this survey was devised as a snowball sample, population denominators could not be estimated.

In the survey’s invitation and introduction, competencies were defined as the “knowledge, skills, attitudes, and behaviors necessary for a particular set of tasks or objectives in a specific function.” Competence was defined as “the array of abilities across multiple domains or aspects of professional performance in a certain multidimensional and dynamic context.” A competent professional was defined as “one possessing the required abilities in all domains in a certain context at a defined stage of education or practice.”8

Respondents were asked to rate their own level of competence for each of the 51 core competencies, and the significance of each core competency to their current role, using a five-point scale of 0–4 (see Figure 1*).

Statistical Analysis Methods

As part of the analysis plan, the researchers collapsed results for “perception of competency”: responses of 0, 1, or 2 on the competency key were translated into a composite score of “0” (i.e., “less than competent”), and responses of 3 or 4 into a composite score of “1” (i.e., “competent”). The same scale was used for “perception of relevance to role.”

Moreover, for presenting “competence” or “relevance” scores by domain across roles, education, or experience, the researchers defined a mean value of 0.6 or more as implying “more competent” or “more relevant,” and a mean value of less than 0.6 as implying “less competent” or “less relevant.”

Similarly, for measures of competence and relevance across roles and specific core competencies within a domain, a score of 60% or more implied “more competent” or “more relevant,” and a score of less than 60% implied “less competent” or “less relevant.”

The authors set this cutoff somewhat arbitrarily, which may be viewed as a limitation of the study, but it provided a means of discussing potential educational need. For the questions on “need for additional education/training” per domain or core competency, “1” indicated “yes” and “0” indicated “no.”
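To make the scoring rules concrete, the dichotomization and 0.6 cutoff described above can be sketched in a few lines of code (a minimal illustration only, not the authors’ analysis code; the competency names and response values below are invented for the example):

```python
# Hypothetical raw responses for two core competencies, one value per
# respondent on the survey's 0-4 scale (data invented for illustration).
responses = {
    "competency_A": [0, 2, 3, 4, 1],
    "competency_B": [3, 4, 4, 2, 3],
}

def composite_label(ratings, threshold=0.6):
    """Collapse 0-4 ratings to binary (0-2 -> 0, 3-4 -> 1) and apply
    the 0.6 mean cutoff used in the analysis plan."""
    binary = [1 if r >= 3 else 0 for r in ratings]
    mean = sum(binary) / len(binary)
    label = "more competent" if mean >= threshold else "less competent"
    return mean, label

for name, ratings in responses.items():
    mean, label = composite_label(ratings)
    print(f"{name}: mean={mean:.2f} -> {label}")
```

Here competency_A averages 0.40 after dichotomization (falling below the cutoff), while competency_B averages 0.80 and crosses the 0.6 threshold; the same logic applies to the relevance ratings.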

Current levels of competence, significance to role, and need for training/education were analyzed across whole domains using one-way analysis of variance (ANOVA) or the Kruskal-Wallis test. A chi-square (χ²) test was used to evaluate the current level of competence or significance to role for each of the 51 individual core competencies. Statistical analyses were performed using SAS software, Version 9.4 (SAS Institute Inc., Cary, N.C.).
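The analyses were run in SAS, but as a rough sketch of how the named tests apply to this design, the following illustration uses scipy (all data are fabricated, and the role groupings and counts are assumptions for the example):

```python
from scipy import stats

# Hypothetical binary competence scores for one domain, grouped by role.
crc = [1, 1, 0, 1, 0, 1]
cra = [1, 1, 1, 1, 0, 1]
dm  = [0, 0, 1, 0, 0, 1]

# Kruskal-Wallis test: compares domain-level scores across the roles.
h_stat, kw_p = stats.kruskal(crc, cra, dm)

# Chi-square test: role-by-rating contingency table for one individual
# core competency (counts of "competent" vs. "less than competent").
table = [[4, 2],   # CRC: competent, less than competent
         [5, 1],   # CRA
         [2, 4]]   # DM
chi2, chi_p, dof, expected = stats.chi2_contingency(table)
print(f"Kruskal-Wallis p={kw_p:.3f}, chi-square p={chi_p:.3f}")
```

The Kruskal-Wallis test suits the whole-domain comparisons because the composite scores are not normally distributed, while the chi-square test handles the per-competency binary outcomes.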


Results

Survey responses were received from 2,194 professionals from across the globe. A total of 1,738 respondents completed the demographic component of the survey and provided at least one response to the competency/relevance/training need component. Of those respondents, 1,584 were designated as DM, RA, CRC/CRN, CRA, RM/PM, or PI/CoPI; regional responses from this total are shown in Figure 2*.

Perceptions of Competence and Relevance

The self-perceived level of competence for survey participants by domain and role is shown in Table 1*. The roles of PI/CoPI and CRA had the highest self-perception of competence (in seven and six domains, respectively). Most members of the clinical research team indicated they believed they were competent (i.e., a mean value of 0.6 or above) in the domains of “Ethical and Participant Safety Considerations” and “Clinical Trials Operations.”

The perceptions of competence in the domains of “Leadership and Professionalism” as well as “Communication and Teamwork” were high (> 60%) for most of the roles with the exception of DM and RA. Only the PI/CoPI role showed mean values of “competent” (≥ 0.6) in the domain of “Scientific Concepts and Research Design.” Furthermore, the mean value for all roles showed perceived lack of competence (< 0.6) in the domain of “Medicines Development and Regulation.”

The perceived relevance of each domain by role is shown in Table 2*. All roles but PI/CoPI perceived a low level of relevance to role for the “Scientific Concepts and Research Design” domain. A low level of relevance (< 0.6) of the “Medicines Development and Regulation” domain was observed for all roles, including CRA and PI/CoPI.

Self-perceived “competence” and “relevance” response levels by role for each of the specific core competencies in the “Scientific Concepts and Research Design” domain are shown in Tables 3 and 4*, respectively. The PI/CoPI role had mean competence and relevance scores > 60%, compared to all other roles, which scored well below 60% on each competency in this domain.

There was a consistently low level of perceived competence across most roles for the seven competencies in the domain of “Medicines Development and Regulation” (see Table 5*). However, self-perceived competence for the RA role was > 60% for three of the seven competencies in this domain. The core competency related to safety reporting requirements was rated in the “competent” range for the PI/CoPI, PM/RM, and RA roles.

When self-perceived competence and relevance were analyzed by domain and academic degree level, the domain “Scientific Concepts and Research Design” again lagged in perceived competence across all degree levels, with the exception of the doctorate level. There was a consistently low level of perceived competence (< 0.6) across all degree levels in the “Medicines Development and Regulation” domain. Similar findings are shown for perceived relevance in these two domains (see Tables 6 and 7*).

The domain “Communication and Teamwork” was self-perceived at the competent level by those possessing a postbaccalaureate degree or above; however, relevance for this domain was perceived as high (≥ 0.6) at all degree levels, with the exception of those with a baccalaureate degree, who scored 0.5.

Levels of perceived competence and relevance to role by years of experience in the clinical research enterprise were also analyzed (see Tables 8 and 9*). With the exception of two domains, “Scientific Concepts and Research Design” and “Medicines Development and Regulation” (both averaging < 0.6), self-perceived competence increases with years of experience. For all domains, self-assessed competence rises through six to 10 years of experience; thereafter, it levels off.

Self-assessment of relevance to the role does not rise with increasing experience, however. The perceived relevance of the domain to the role is virtually the same in those with less than two years of experience as for those with more than 20 years of experience.

Perceptions of Learning Needs

The perceived need for additional education/training is reported as an average percentage of “yes” responses within each of the competency domains, broken down by role. For the purposes of this paper, we have highlighted percentages > 50% in Table 10*. The lowest perceived need for training was expressed by the PM/RM role. The CRA and PI/CoPI roles expressed a need for additional education/training at rates > 50% for all domains.


Discussion

The increasing complexity, growth projections, and personnel needs of the clinical research enterprise have been widely reported, pointing to a need to expand and better qualify the clinical research workforce. The Institute of Medicine anticipated these trends and called for development of the entire clinical research workforce.9

Today, there are reported shortages of CRA personnel.10 As this growth and complexity have occurred, working groups in nursing, medicine, and clinical research have sought to understand and categorize the requisite knowledge and skills needed to meet the demands. The JTF CCF emerged as a harmonization of those efforts.

Academic programs in clinical research are seeking to prepare an educated workforce by utilizing the JTF CCF to develop curricula that are responsive to the needs of the enterprise using a competency-based education approach.11,12 Support for the education and professionalization of clinical research professionals has been widely promoted, but gaps remain.13,14

Leaders at academic medical center sites are beginning to pattern their curricula to the JTF CCF, and even to explore how the JTF CCF may inform job descriptions and progression pathways; however, consistency in site onboarding training and ongoing training of clinical research staff is lacking.

In presenting preliminary data from the JTF survey, this paper represents a first attempt to measure perceived competence and relevance of the domains and competencies of the JTF CCF across multiple roles. It also serves to assess and present perceived learning needs across roles for the JTF CCF domains and competencies.

The results demonstrate variations in the respondents’ perceived competence or perceived relevance of domains/competencies for their roles.

Competence and relevance gaps are suggested for two key JTF domains. Across all roles, perceived competence and relevance scores were low for the “Medicines Development and Regulation” domain. Likewise, similar gaps were seen for the “Scientific Concepts and Research Design” domain, with the exception of the PI/CoPI role.

With the exception of “Ethical and Participant Safety Considerations” and “Clinical Trials Operations,” perceived competence and relevance were low across all domains for the RA role. The DM role perceived competence and relevance in data management, yet scored lower across all other domains in both areas.

Perceived competence increased with years of experience and with postsecondary education. Moreover, the domains “Medicines Development and Regulation” and “Scientific Concepts and Research Design” showed increases at the master’s degree level. Results also suggest that most clinical research professionals, including those in the PI/CoPI role, perceive a need for additional education/training.

The limitations inherent in this survey include its broad dissemination using a snowball method. Therefore, conclusions cannot be generalized to larger populations, although they are suggestive based on participants’ responses.

Moreover, there was significant survey fatigue among respondents due to the length and design of the survey tool. Many respondents did not complete the entire survey, which is a recognized limitation of long surveys.15

Finally, measuring perceptions of competence and relevance can be fraught with bias, as those who are less experienced or educated often inflate their self-perceptions. At the same time, those with more education and experience may realize the breadth of knowledge yet to be gained, and rate themselves as requiring more education to meet competency demands.16,17

Considering the rising complexity of the clinical research enterprise—and the need for an interdisciplinary team approach to managing studies across medical disciplines and across clinical research personnel roles—more focused approaches to job descriptions, role responsibilities, and educational pathways are warranted. Despite a low perceived relevance of some domains by role of some respondents, the levels of decision-making and requisite needs of today’s research enterprise suggest that a minimum entry level of education should be defined and required, and that intentional onboarding and staged education and continuing professional development in each domain should occur—even at the lowest role level.

Clinical research professionals, including PIs/CoPIs, should be educated and trained across all domains at levels in keeping with their responsibilities. The current International Conference on Harmonization E6 Good Clinical Practice training of both new and experienced investigators and staff should be generally perceived as a “floor,” not a “ceiling,” for the knowledge necessary to conduct a safe and accurate clinical trial.18 While all domains should be included in curricula, increased content that focuses on “Scientific Concepts and Research Design” and “Medicines Development and Regulation” is indicated.

It would appear that, in today’s clinical research enterprise, the time-honored “learning on the job” is no longer sufficient to produce a qualified clinical research professional and ensure proper conduct of research and protection of human participants.


Conclusion

The results of this survey illustrate gaps in perceived competence and relevance in the domains associated with drug, device, and biologics development and in the domain of “Scientific Concepts and Research Design,” the basis of clinical research studies. However, the survey also provides an opportunity for further exploration of core competence for clinical research professionals.

The workforce needs are ever expanding; the model for hiring in the field is still based upon experience, not necessarily competence, and there are no entry-level educational requirements. Professional certifications exist for those who have achieved a defined professional experience level in a clinical research area; however, validated, evidence-based competency measures for the workforce have been lacking.

The JTF CCF has gained acceptance as an important response to the necessity for better definitions of the basic competencies for clinical research professionals. This work is not done; new stakeholders are joining the JTF. Therefore, additional core competencies are likely to emerge.

As the clinical research enterprise embraces the professionalization of roles, this survey not only identifies potential needs, but also stimulates conversations about minimal education requirements; definition of roles; standardization of job titles at ascending levels of competence; policies for staff training; and potential new research on the application of these core competencies.

This paper presents only one portion of the data gleaned from the JTF survey. Results that assess regional differences of respondents may identify learning needs in specific geographic areas.



  1. Silva H, Stonier P, Buhler F, Deslypere J-P, Criscuolo D, Nell G, Massud J, Kesselring G, van Olden R, Dubois D. 2013. Core competencies for pharmaceutical physicians and drug development scientists. Frontiers in Pharmacology 4(105):1–7.
  2. Koren M, Koski G, Reed DP, Rheinstein PH, Silva H, Stonier P, Seltzer J. 2011. APPI physician investigator competencies statement. The Monitor 25(4):79–82.
  3. Clinical and Translational Science Awards. 2011. Core competencies for clinical and translational research.
  4. Jones CT, Parmentier J, Sonstein S, Silva H, Lubejko B, Pidd H, Gladson B, Browning S. 2012. Defining competencies in clinical research: issues in clinical research education. Res Pract 13(3):99–107.
  5. Sonstein S, Seltzer J, Li R, Jones CT, Silva H, Daemen E. 2014. Moving from compliance to competency: a harmonized core competency framework for the clinical research professional. Clin Res 28(3):17–23.
  6. Calvin-Naylor N, Wartak M, Jones C, et al. 2016. Education and training of clinical and translational study investigators and research coordinators: a competency-based approach. J Clin Trans Sci (submitted May 2016).
  7. Bebee P. 2015. A pilot study to assess the validity of the Joint Task Force’s questionnaire for collection of data to be used in defining job descriptions, educational requirements, boundaries of practice, and promotion criteria for the clinical research enterprise. MS Thesis, Eastern Michigan University.
  8. Frank JR, Snell L, Ten Cate O, Holmboe ES, Carraccio C, Swing SR, et al. 2010. Competency-based medical education: theory to practice. Med Teach 32(8):638–45. [DOI: 10.3109/0142159X.2010.501190]
  9. Institute of Medicine. 2012. Envisioning a transformed clinical trials enterprise in the United States: establishing an agenda for 2020: workshop summary. Washington, D.C.: The National Academies Press.
  10. Association of Clinical Research Professionals. 2015. A new approach to developing the CRA
  11. Consortium of Academic Programs in Clinical Research.
  12. Jones CJ, Gladson B, Butler J. 2015. Academic programs that produce clinical research professionals. DIA Global Forum. 7(5):16–9.
  13. Stevens EJ, Daemen E. The professionalization of research coordinators. Clin Res 29(6):26–31.
  14. Association of Clinical Research Professionals. 2016. 5 action items to accelerate the professionalization of clinical research. ACRP Blog.
  15. Galesic M, Bosnjak M. 2009. Effects of questionnaire length on participation and indicators of response quality in a web survey. Pub Op Quart 73:349–60.
  16. Kruger J, Dunning D. 1999. Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. J Pers Soc Psych 77(6):1121–34.
  17. Ehrlinger J, Johnson K, Banner M, Dunning D, Kruger J. 2008. Why the unskilled are unaware: further explorations of (absent) self-insight among the incompetent. Org Beh Hum Dec Proc 105(1):98–121.
  18. Arango J, Chuck T, Ellenberg SS, Fotz B, Gorman C, Hinrich H, McHale S, Merchant K, Seltzer J, Shapley S, Wild G. 2016. Good clinical practice training: identifying key elements and strategies for increasing training efficiency. Ther Inn Reg Sci 50(4):480–6.

Stephen Sonstein, PhD, is director of clinical research administration at Eastern Michigan University.

Honorio Silva, MD, is president of the International Federation of Associations of Pharmaceutical Physicians and Pharmaceutical Medicine, and vice president for systems integration and professionalism with the Alliance for Clinical Research Excellence and Safety.

Carolynn Thomas Jones, DNP, MSPH, RN, is an assistant professor of clinical nursing at The Ohio State University’s College of Nursing and lead faculty of the Master of Applied Clinical and Preclinical Research program. She is also past-president of the Consortium of Academic Programs in Clinical Research and a member of the Joint Task Force for Clinical Trial Competency.

Nancy Calvin-Naylor, PhD, is managing director of the Institute for Research on Science and Innovation at the University of Michigan.

Laurie Halloran, BSN, MS, is president of Halloran Consulting Group, Inc., and a faculty instructor in clinical research with the Boston University School of Medicine.

Juan Luis Yrivarren, MD, is Andean and Mexico regional director with CCBR Clinical Research.

[DOI: 10.14524/CR-16-0016]

*To see all figures and/or tables published originally in this article, please visit the full-issue PDF of the December 2016 Clinical Researcher.