
A framework to evaluate research capacity building in health care

Jo Cooke 1

BMC Family Practice volume 6, Article number: 44 (2005)


Building research capacity in health services has been recognised internationally as important in order to produce a sound evidence base for decision-making in policy and practice. Activities to increase research capacity for, within, and by practice include initiatives to support individuals and teams, organisations and networks. Little has been discussed or concluded about how to measure the effectiveness of research capacity building (RCB).

This article attempts to develop the debate on measuring RCB. It highlights that traditional outcomes, such as publications in peer-reviewed journals and successful grant applications, may be important outcomes to measure, but they may not capture all relevant progress, especially amongst novice researchers. Nor do they capture factors that contribute to developing an environment that supports capacity development, the usefulness or 'social impact' of research, or professional outcomes.

The paper suggests a framework for planning change and measuring progress, based on six principles of RCB, which have been generated through analysis of the literature, policy documents, empirical studies, and the experience of one Research and Development Support Unit in the UK. These principles are that RCB should: develop skills and confidence, support linkages and partnerships, ensure the research is 'close to practice', develop appropriate dissemination, invest in infrastructure, and build elements of sustainability and continuity. It is suggested that each principle operates at individual, team, organisational and supra-organisational levels. Some criteria for measuring progress are also given.

This paper highlights the need to identify ways of measuring RCB. It points out the limitations of the measurements currently found in the literature, and proposes a framework for measuring progress, which may form the basis for comparing RCB activities. In this way it could contribute to establishing the effectiveness of these interventions, and to building a knowledge base to inform the science of RCB.


The need to develop a sound scientific research base to inform service planning and decision-making in health services is strongly supported in the literature [ 1 ], and policy [ 2 ]. However, the level of research activity and the ability to carry out research is limited in some areas of practice, resulting in a low evidence base in these areas. Primary Care, for example, has been identified as having a poor capacity for undertaking research [ 3 – 5 ], and certain professional groups, for example nursing and allied health professionals, lack research experience and skills [ 5 – 7 ]. Much of the literature and the limited research on research capacity building (RCB) has therefore focused on this area of practice, and these professional groups. Policy initiatives to build research capacity include support in developing research for practice, where research is conducted by academics to inform practice decision making, research within or through practice, which encompasses research being conducted in collaboration with academics and practice, and research by practice, where ideas are initiated and research is conducted by practitioners [ 3 , 8 ].

Interventions to increase research capacity for, within, and by practice incorporate initiatives to support individuals and teams, organisations and networks. Examples include fellowships, training schemes and bursaries, and the development of support infrastructures, for example research practice networks [9-13]. In the UK, the National Coordinating Centre for Research Capacity Development has supported links between universities and practice through funding a number of Research and Development Support Units (RDSUs) [14], which are based within universities but whose purpose is to support new and established researchers based in the National Health Service (NHS). However, both policy advisers and researchers have highlighted a lack of evaluative frameworks to measure progress and build an understanding of what works [15, 16].

This paper argues for the need to establish a framework for planning and measuring progress, and to initiate a debate about identifying appropriate outcomes for RCB, rather than simply relying on things that are easy to measure. The suggested framework has been generated through analysis of the literature, using policy documents, position statements, a limited number of empirical studies on evaluating RCB, and the experience of one large RDSU based in the UK.

The Department of Health in the UK has adopted the definition of RCB as "a process of individual and institutional development which leads to higher levels of skills and greater ability to perform useful research" (p. 1321) [17].

Albert and Mickan, citing the National Information Services in Australia [18], define it as:

"an approach to the development of sustainable skills, organizational structures, resources and commitment to health improvement in health and other sectors to multiply health gains many times over."

RCB can therefore be seen either as a means to an end, the end being 'useful' research that informs practice and leads to health gain, or as an end in itself, emphasising developments in skills and structures that enable research to take place.

A framework for measuring capacity building should therefore be inclusive of both process and outcome measures [19], to capture changes in both the 'ends' and the 'means'; it should measure the ultimate goals, but also the steps and mechanisms to achieve them. The notion of measuring RCB by both process and outcome measures is supported within the research networks literature [12, 20], and in capacity building in health more generally [19, 21]. Some argue we should acknowledge 'process as outcome', particularly if capacity building is seen as an end in itself [21]. In this context process measures are 'surrogate' [12], or 'proxy', outcome measures [16]. Carter et al [16] urge caution in using 'proxy' measures in the context of RCB, as there is currently little evidence to link process with outcome. They do not argue against the notion of collecting process data, but stress that evaluation work should examine the relationship of process to outcome. The framework discussed in this paper suggests areas to consider for both process and outcome measurement.

The most commonly accepted outcomes for RCB cited in the literature include traditional measures of high-quality research: publications, conference presentations, successful grant applications, and qualifications obtained. Many evaluations of RCB have used these as outcomes [9, 10, 22, 23]. Some argue that publications in peer-reviewed journals are a tall order given the low research skills base in some areas of health care practice [5], and call for an appropriate time frame in which to evaluate progress. Process measures in this context could measure progress more sensitively and quickly.

However, using traditional outcomes may not tell the whole story in terms of measuring impact. Position statements suggest that the ultimate goal of research capacity building is health improvement [17, 18, 24]. In order for capacity building initiatives to address this goal, outcomes should also explore the direct impact on services and clients: what Smith [25] defines as the social impact of research.

There is a strong emphasis within the primary care literature that capacity building should enhance the ability of practitioners to build their research skills, supporting the development of research 'by' and 'with' practice [3, 26], and that such close links to practice carry 'added value'. A framework to measure RCB should explore and try to unpack this 'added value', both in terms of professional outcomes [10], which include increasing professional enthusiasm, and in terms of supporting the application of critical thinking and the use of evidence in practice. Whilst doing research alongside practice is not the only way these skills and attitudes can be developed, it does seem to be an important impact of RCB that should be examined.

The notion of developing RCB close to practice does not necessarily mean that it is small scale just because it is close to the coal face. Obviously, in order for individuals and teams to build up a track record of experience, their initial projects may justifiably be small scale; but as individuals progress, they may gain the experience needed to conduct large-scale studies, still based on practice problems, working in partnership with others. Similarly, networks can support large-scale studies as their capacity and infrastructure are developed to accommodate them.

The framework

The framework is represented by Figure 1. It has two dimensions:

Figure 1: Research Capacity Building: A Framework for Evaluation.

• Four structural levels of development activity. These include the individual, team, organisational, and network or supra-organisational support level (networks and support units). These are represented by the concentric circles within the diagram.

• Six principles of capacity building. These are discussed in more detail below but include: building skills and confidence, developing linkages and partnerships, ensuring the research is 'close to practice', developing appropriate dissemination, investing in infrastructure, and building elements of sustainability and continuity. Each principle is represented by an arrow within the diagram, which indicates activities and processes that contribute towards capacity building. The arrows cut across the structural levels, suggesting that activities and interventions may occur within and across structural levels. The arrow heads point in both directions, suggesting that principles applied at each structural level could have an impact on other levels.

The framework acknowledges that capacity building is conducted within a policy context. Whilst this paper focuses on measurement at different structural levels, it should be acknowledged that progress and impact on RCB can be greatly nurtured or restricted by the prevailing policy. Policy decisions will influence opportunities for developing researchers, can facilitate collaborations in research, support research careers, fund research directed by practice priorities, and can influence the sustainability and the very existence of supportive infrastructures such as research networks.

The paper will explain the rationale for the dimensions of the framework, and will then suggest some examples of measurement criteria for each principle at different structural levels to evaluate RCB. It is hoped that as the framework is applied, further criteria will be developed and then used, taking into account time constraints, resources, and the purpose of such evaluations.

Structural levels at which capacity building takes place

The literature strongly supports that RCB should take place at both an individual and an organisational level [8, 15, 27, 28]. For example, the conceptual model for RCB in primary care put forward by Farmer & Weston [15] focuses particularly on individual General Practitioners (GPs) and primary care practitioners, who may progress from non-participation through participation to become academic leaders in research. Their model also acknowledges the context and organisational infrastructure needed to support RCB by reducing barriers and accommodating diversity: providing mentorship, collaborations and networking, and adopting a whole-systems approach based on local need and existing levels of capacity. Others have acknowledged that capacity development can be focused at a team level [11, 29]. Jowett et al [30] found that GPs were more likely to be research active if they were part of a practice where others were involved with research. Guidance from a number of national bodies highlights the need for multiprofessional and inter-professional involvement in conducting useful research for practice [3, 4, 6, 31], which implies an appropriate mix of skills and practice experience within research teams to enable this [32]. Additionally, the organisational literature has identified the importance of teams in the production of knowledge [18, 33, 34].

Developing structures between and outside health organisations, including the development of research networks, seems important for capacity building [12, 24, 34]. The Department of Health in the UK [14] categorizes this supra-organisational support infrastructure to include centres of academic activity, Research & Development Support Units, and research networks.

As interventions for RCB are targeted at different levels, the framework for measuring its effectiveness mirrors this. However, these levels should not be measured in isolation. One level can have an impact on capacity development at another level, and could potentially have a synergistic or detrimental effect on the other.

The six principles of research capacity building

Evaluation involves assessing the success of an intervention against a set of indicators or criteria [ 35 , 36 ], which Meyrick and Sinkler [ 37 ] suggest should be based on underlying principles in relation to the initiative. For this reason the framework includes six principles of capacity building. The rationale for each principle is given below, along with a description of some suggested criteria for each principle. The criteria presented are not an exhaustive list. As the framework is developed and used in practice, a body of criteria will be developed and built on further.

Principle 1. Research capacity is built by developing appropriate skills and confidence, through training and creating opportunities to apply skills

The need to develop research skills in practitioners is well established [3, 4, 6], and can be supported through training [14, 26], and through mentorship and supervision [15, 24, 28]. There is some empirical evidence that research skill development increases research activity [23, 38] and enhances positive attitudes towards conducting and collaborating in research [39]. Other studies cite lack of training and research skills as a barrier to doing research [30, 31]. The need to apply and use research skills in practice is highlighted in order to build confidence [40] and to consolidate learning.

Some needs assessment studies highlight that research skills development should adopt 'outreach' and flexible learning packages, and acknowledge the skills, backgrounds and epistemologies of the professional groups concerned [7, 15, 39, 41, 42]. These include doctors, nurses, a range of allied health professionals, and social workers. Developing an appropriate mix of professionals to support health services research means that training should be inclusive and appropriate to them, and adopt a range of methodologies and examples to support appropriate learning and experience [15, 31, 41]. How learning and teaching are undertaken, and whether the content of support programmes reflects the backgrounds, tasks and skills of participants, should therefore be measured. For example, the type of research methods teaching offered by networks and support units should reflect the range and balance of skills needed for health services research, including both qualitative and quantitative research methods.

Skills development should also be set in the context of career development, and further opportunities to apply skills to practice should be examined. Policy and position statements [14, 26] support the concept of career progression or a 'careers escalator', which also enables the sustainability of skills. Opportunities to apply research skills through applications for funding are also important [9, 10, 22, 43, 44].

At team and network level, Fenton et al [34] suggest that capacity can be increased through building intellectual capacity (sharing knowledge), which enhances the ability to do research. Whilst there is no formal measure for this, an audit of the transfer of knowledge would appear to be beneficial. For example, teams may share expertise within a project to build skills in novice researchers [45], which can be tracked, and appropriate divisions of workload, such as reading research literature and sharing it with the rest of the team or network, could be noted.

The notion of stepping outside a safety zone may also suggest increased confidence and ability to do research. This may be illustrated at an individual level by the practitioner-researcher taking on more of a management role, supervising others, tackling new methodologies or approaches in research, or working with other groups of health and research professionals on research projects. This approach is supported by the model of RCB suggested by Farmer and Weston [15], which supports progress from participation through to academic leadership.

Some examples of criteria for measuring skills and confidence levels are given in Table 1.

Principle 2. Research capacity building should support research 'close to practice' in order for it to be useful

The underlying philosophy for developing research capacity in health is that it should generate research that is useful for practice. The North American Primary Care Research Group [24] defined the 'ultimate goal' of research capacity development as the generation and application of new knowledge to improve the health of individuals and families (p679). There is strong support that 'useful' research is that which is conducted 'close' to practice, for two reasons. Firstly, it generates research knowledge that is relevant to service user and practice concerns. Many argue that the most relevant and useful research questions are those generated by, or in consultation with, practitioners and services [3, 11, 24], policy makers [46] and service users [47, 48]. The level of 'immediate' usefulness [49] may also mean that messages are more likely to be taken up in practice [50]. Empirical evidence suggests that practitioners and policy makers are more likely to engage in research if they see its relevance to their own decision making [31, 39, 46]. The notion of building research that is 'close to practice' does not necessarily mean that projects are small scale, but that the research is highly relevant to practice or policy concerns. A large network of practitioners could facilitate large-scale experimental projects, for example. However, certain methodologies are favoured by practice because of their potential immediate impact [47], and this framework acknowledges such approaches and their relevance. These include action research projects and participatory inquiry [31, 42]. An example where this more participatory approach has been developed in capacity building is the WeLReN (West London Research Network) cycle [51]. Here research projects are developed in cycles of action, reflection, and dissemination, and use of findings is integral to the process. This network reports high levels of practitioner involvement.

Secondly, building research capacity 'close to practice' is useful because of the skills of critical thinking it engenders, which can also be applied to practice decision making [28], and which support quality improvement approaches in organisations [8]. Practitioners in a local bursary scheme, for example, said they were more able to take an evidence-based approach into their everyday practice [9].

Developing a 'research culture' within organisations suggests a closeness to practice that impacts on the ability of teams and individuals to do research. Lester et al [23] touched on measuring this idea through a questionnaire exploring aspects of a supportive culture within primary care academic departments, including opportunities to discuss career progression, supervision, formal appraisal, mentorship, and junior support groups. This may be a fruitful idea to expand further, to develop a tool in relation to a health care environment.

Some examples of criteria for measuring the 'close to practice' principle are given in Table 2.

Principle 3. Linkages, partnerships and collaborations enhance research capacity building

The notion of building partnerships and collaborations is integral to capacity building [19, 24]. It is the mechanism by which research skills and practice knowledge are exchanged, developed and enhanced [12], and by which research activity is conducted to address complex health problems [4]. The linkages between the practice world and that of academia may also enhance research use and impact [46].

The linkages that enhance RCB can exist between:

• Universities and practice [4, 14, 43]

• Novice and experienced researchers [22, 24, 51]

• Different professional groups [2, 4, 20, 34]

• Different health and care provider sectors [4, 31, 47, 52]

• Service users, practitioners and researchers [47, 48]

• Researchers and policy makers [46]

• Different countries [28, 52]

• Health and industry [53, 54]

It is suggested that it is through networking and building partnerships that intellectual capital (knowledge) and social capital (relationships) can be built, which enhances the ability to do research [12, 31, 34]. In particular, there is the notion that the build-up of trust between different groups and individuals can enhance information and knowledge exchange [12]. This may not only have benefits for the development of appropriate research ideas, but may also have benefits for the whole of the research process, including the impact of research findings.

The notion of building links with industry is becoming increasingly evident within policy in the UK [54], which may have an impact on economic outcomes for health organisations and society as a whole [55, 56].

Some examples of criteria for measuring linkages and collaborations are given in Table 3.

Principle 4. Research capacity building should ensure appropriate dissemination to maximize impact

A widely accepted measure to illustrate the impact of RCB is the dissemination of research in peer-reviewed publications, and through conference presentations to academic and practice communities [5, 12, 26, 57]. However, this principle extends beyond these more traditional methods of dissemination. The litmus test that ultimately determines the success of capacity building is that it should impact on practice, and on the health of patients and communities [24]; that is, the social impact of research [25]. Smith [25] argues that strategies of dissemination should include a range of methods that are 'fit for purpose'. This includes traditional dissemination, but also other methods, for example instruments and programmes of care implementation, protocols, lay publications, and publicity through factsheets, the media and the Internet.

Dissemination and tracking the use of products and technologies arising from RCB should also be considered, as these relate to the economic outcomes of capacity building [55]. In the UK, the notion of building health trusts as innovative organisations which can benefit economically through building intellectual property highlights this as an area for potential measurement [56].

Some examples of criteria for measuring appropriate dissemination are given in Table 4.

Principle 5. Research capacity building should include elements of continuity and sustainability

Definitions of capacity building suggest that it should contain elements of sustainability, alluding to the maintenance and continuity of newly acquired skills and structures to undertake research [18, 19]. However, the literature does not explore this concept well [19]. This may be partly due to problems around measuring capacity building: it is difficult to know how well an initiative is progressing, and how well progress is consolidated, if there are no benchmarks or outcomes against which to demonstrate this.

Crisp et al [19] suggest that capacity can be sustained by applying skills to practice. This gives us some insight into where we might look for measures of sustainability. It could include enabling opportunities to extend skills and experience, and may link into the concept of a career escalator. It also involves utilizing the capacity that has already been built: for example, engaging those who gained skills in earlier RCB initiatives, once they have become 'experts', to help more novice researchers, and finding an appropriate position within the organisation for the person with expertise. It could also be measured by the number of opportunities for funding for the continued application of skills to research practice.

Some examples of criteria for measuring sustainability and continuity are given in Table 5.

Principle 6. Appropriate infrastructures enhance research capacity building

Infrastructure includes structures and processes that are set up to enable the smooth and effective running of research projects. For example, project management skills are essential to move projects forward, and as such should be measured in relation to capacity building. Similarly, projects should be suitably supervised, with academic and management support. To make research work 'legitimate', it may be beneficial to include research in job descriptions for certain positions, not only to reinforce research as a core skill and activity but also so that it can be reviewed in annual appraisals, which can themselves be a tool for research capacity evaluation. Information flow about calls for funding, fellowships and conferences is also important. Hurst [42] found that information flow varied between trusts, and that managers were more aware of research information than practitioners.

Protected time and backfill arrangements, as well as funding to support them, are important enablers of capacity building [9, 15, 24, 58]. Such arrangements may reduce barriers to participation and enable skills and enthusiasm to be developed [15]. Infrastructure to help direct new practitioners to research support has also been highlighted [14]. This is particularly true in the light of the new research governance and research ethics framework in the UK [59]. The reality of implementing systems to deal with the complexities of the research governance regulations has proved problematic, particularly in primary care, where the relative lack of research management expertise and infrastructure has resulted in what are perceived as disproportionately bureaucratic systems. Recent discussion in the literature has focused on the detrimental impact of both ethical review and NHS approval systems, and there is evidence of serious delays in getting research projects started [60]. Administrative and support staff to help researchers through this process are important to enable research to take place [61].

Some examples of criteria for measuring infrastructure are given in Table 6.

This paper suggests a framework which sets out a tentative structure by which to start measuring the impact of capacity building interventions, and invites debate around the application of this framework to plan and measure progress. It highlights that interventions can focus on individuals, teams, organisations, and on support infrastructures like RDSUs and research networks. However, capacity building may only take place once change has occurred at more than one level: for example, the culture of an organisation in which teams and individuals work may have an influence on their abilities and opportunities to do research work. It is also possible that the interplay between different levels may have an effect on the outcomes at other levels. In measuring progress, it should be possible to determine a greater understanding of the relationship between different levels. The framework proposed in this paper may be the first step to doing this.

The notion of building capacity at any structural level is dependent on funding and support opportunities, which are influenced by policy and funding bodies. The ability to build capacity across the principles developed in the framework will also be dependent on R&D strategy and policy decisions. For example, if policy fluctuates in its emphasis on building capacity 'by', 'for' or 'with' practice, the ability to build capacity close to practice will be affected.

In terms of developing a science of RCB, there is a need to capture further information on issues of measuring process and outcome data to understand what helps develop 'useful' and 'useable' research. The paper suggests principles whereby a number of indicators could be developed. The list is not exhaustive, and it is hoped that through debate and application of the framework further indicators will be developed.

An important first step to building the science of RCB should be debate about identifying appropriate outcomes. This paper supports the use of traditional outcomes of measurement, including publications in peer reviewed journals and conference presentations. This assures quality, and engages critical review and debate. However, the paper also suggests that we might move on from these outcomes in order to capture the social impact of research, and supports the notion of developing outcomes which measure how research has had an impact on the quality of services, and on the lives of patients and communities. This includes adopting and shaping the type of methodologies that capacity building interventions support, which includes incorporating patient centred outcomes in research designs, highlighting issues such as cost effectiveness of interventions, exploring economic impact of research both in terms of product outputs and health gain, and in developing action oriented, and user involvement methodologies that describe and demonstrate impact. It also may mean that we have to track the types of linkages and collaborations that are built through RCB, as linkages that are close to practice, including those with policy makers and practitioners, may enhance research use and therefore 'usefulness'. If we are to measure progress through impact and change in practice, an appropriate time frame would have to be established alongside these measures.

This paper argues that 'professional outcomes' should also be measured, to recognize how critical thinking developed during research impacts on clinical practice more generally.

Finally, the proposed framework provides the basis by which we can build a body of evidence to link process to the outcomes of capacity building. By gathering process data and linking it to appropriate outcomes, we can more clearly unpack the 'black box' of process, and investigate which processes link to desired outcomes. It is through adopting such a framework, and testing out these measurements, that we can systematically build a body of knowledge that will inform the science and the art of capacity building in health care.

• There is currently little evidence on how to plan and measure progress in research capacity building (RCB), or agreement on its ultimate outcomes.

• Traditional outcomes of publications in peer-reviewed journals and successful grant applications may be easy and important outcomes to measure, but they do not necessarily address the usefulness of research, professional outcomes, the impact of research activity on practice, or health gain.

• The paper suggests a framework providing a tentative structure for measuring the impact of RCB, shaped around six principles of research capacity building, each of which can be applied at four structural levels.

• The framework could be the basis on which RCB interventions are planned and progress measured. It could act as a basis of comparison across interventions, and could contribute to establishing a knowledge base on what is effective in RCB in health care.

Muir Gray JA: Evidence-based Healthcare. How to make health policy and management decisions. 1997, Edinburgh, Churchill Livingstone


Department of Health: Research and Development for a First Class Service. 2000, Leeds, DoH

Mant D: National working party on R&D in primary care. Final Report. 1997, London, NHSE South and West.

Department of Health: Strategic review of the NHS R&D Levy (The Clarke Report). 1999, Central Research Department, Department of Health, 11-

Campbell SM, Roland M, Bentley E, Dowell J, Hassall K, Pooley J, Price H: Research capacity in UK primary care. British Journal of General Practice. 1999, 49: 967-970.


Department of Health: Towards a strategy for nursing research and development. 2000, London, Department of Health

Ross F, Vernon S, Smith E: Mapping research in primary care nursing: Current activity and future priorities. NT Research. 2002, 7: 46-59.


Marks L, Godfrey M: Developing Research Capacity within the NHS: A summary of the evidence. 2000, Leeds, Nuffield Portfolio Programme Report.

Lee M, Saunders K: Oak trees from acorns? An evaluation of local bursaries in primary care. Primary Health Care Research and Development. 2004, 5: 93-95. 10.1191/1463423604pc197xx.

Bateman H, Walter F, Elliott J: What happens next? Evaluation of a scheme to support primary care practitioners with a fledgling interest in research. Family Practice. 2004, 21: 83-86. 10.1093/fampra/cmh118.


Smith LFP: Research general practices: what, who and why?. British Journal of General Practice. 1997, 47: 83-86.

Griffiths F, Wild A, Harvey J, Fenton E: The productivity of primary care research networks. British Journal of General Practice. 2000, 50: 913-915.

Fenton F, Harvey J, Griffiths F, Wild A, Sturt J: Reflections from organization science of primary health care networks. Family Practice. 2001, 18: 540-544. 10.1093/fampra/18.5.540.


Department of Health: Research Capacity Development Strategy. 2004, London, Department of Health

Farmer E, Weston K: A conceptual model for capacity building in Australian primary health care research. Australian Family Physician. 2002, 31: 1139-1142.


Carter YH, Shaw S, Sibbald B: Primary care research networks: an evolving model meriting national evaluation. British Journal of General Practice. 2000, 50: 859-860.

Trostle J: Research Capacity building and international health: Definitions, evaluations and strategies for success. Social Science and Medicine. 1992, 35: 1321-1324. 10.1016/0277-9536(92)90035-O.

Albert E, Mickan S: Closing the gap and widening the scope. New directions for research capacity building in primary health care. Australian Family Physician. 2002, 31: 1038-1041.

Crisp BR, Swerissen H, Duckett SJ: Four approaches to capacity building in health: consequences for measurement and accountability. Health Promotion International. 2000, 15: 99-107. 10.1093/heapro/15.2.99.

Ryan, Wyke S: The evaluation of primary care research networks in Scotland. British Journal of General Practice. 2001, 154-155.

Gillies P: Effectiveness of alliances and partnerships for health promotion. Health Promotion International. 1998, 13: 99-120. 10.1093/heapro/13.2.99.

Pitkethly M, Sullivan F: Four years of TayRen, a primary care research and development network. Primary Care Research and Development. 2003, 4: 279-283. 10.1191/1463423603pc167oa.

Lester H, Carter YH, Dassu D, Hobbs F: Survey of research activity, training needs, departmental support, and career intentions of junior academic general practitioners. British Journal of General Practice. 1998, 48: 1322-1326.

North American Primary Care Research Group: What does it mean to build research capacity?. Family Medicine. 2002, 34: 678-684.

Smith R: Measuring the social impact of research. BMJ. 2001, 323: 528. 10.1136/bmj.323.7312.528.


Sarre G: Capacity and activity in research project (CARP): supporting R&D in primary care trusts. 2002

Del Mar C, Askew D: Building family/general practice research capacity. Annals of Family Medicine. 2004, 2: 535-540.

Carter YH, Shaw S, Macfarlane F: Primary Care research team assessment (PCRTA): development and evaluation. Occasional paper (Royal College of General Practitioners). 2002, 81: 1-72.

Jowett S, Macleod J, Wilson S, Hobbs F: Research in Primary Care: extent of involvement and perceived determinants among practitioners for one English region. British Journal of General Practice. 2000, 50: 387-389.

Cooke J, Owen J, Wilson A: Research and development at the health and social care interface in primary care: a scoping exercise in one National Health Service region. Health and Social Care in the Community. 2002, 10: 435-444. 10.1046/j.1365-2524.2002.00395.x.

Raghunath AS, Innes A: The case of multidisciplinary research in primary care. Primary Care Research and Development. 2004, 5: 265-273.

Reagans R, Zuckerman EW: Networks, Diversity and Productivity: The Social Capital of Corporate R&D Teams. Organization Science. 2001, 12: 502-517. 10.1287/orsc.12.4.502.10637.

Ovretveit J: Evaluating Health Interventions. 1998, Buckingham, Open University

Meyrick J, Sinkler P: An evaluation Resource for Healthy Living Centres. 1999, London, Health Education Authority

Hakansson A, Henriksson K, Isacsson A: Research methods courses for GPs: ten years' experience in southern Sweden. British Journal of General Practice. 2000, 50: 811-812.

Bacigalupo B, Cooke J, Hawley M: Research activity, interest and skills in a health and social care setting: a snapshot of a primary care trust in Northern England. Health and Social Care in the Community.

Kernick D: Evaluating primary care research networks - exposing a wider agenda. British Journal of General Practice. 2001, 51: 63.

Owen J, Cooke J: Developing research capacity and collaboration in primary care and social care: is there enough common ground?. Qualitative Social Work. 2004, 3: 398-410. 10.1177/1473325004048022.

Hurst: Building a research conscious workforce. Journal of Health Organization and Management. 2003, 17: 373-384.

Gillibrand WP, Burton C, Watkins GG: Clinical networks for nursing research. International Nursing Review. 2002, 49: 188-193. 10.1046/j.1466-7657.2002.00124.x.

Campbell J, Longo D: Building research capacity in family medicine: Evaluation of the Grant Generating Project. Journal of Family Practice. 2002, 51: 593.

Cooke J, Nancarrow S, Hammersley V, Farndon L, Vernon W: The "Designated Research Team" approach to building research capacity in primary care. Primary Health Care Research and Development.

Innvaer S, Vist G, Trommald M, Oxman A: Health policy- makers' perceptions of their use of evidence: a systematic review. Journal of Health Services Research and Policy. 2002, 7: 239-244. 10.1258/135581902320432778.

NHS Service Delivery and Organisation National R&D Programme: National listening exercise. 2000, London, NHS SDO.

Hanley J, Bradburn S, Gorin M, Barnes M, Evans C, Goodare HB: Involving consumers in research and development in the NHS: briefing notes for researchers. 2000, Winchester, Consumers in NHS Research Support Unit.

Frenk J: Balancing relevance and excellence: organisational responses to link research with decision making. Social Science and Medicine. 1992, 35: 1397-1404. 10.1016/0277-9536(92)90043-P.

National Audit Office: An international review on Governments' research procurement strategies. A paper in support of Getting the evidence: Using research in policy making. 2003, London, The Stationery Office.

Thomas P, While A: Increasing research capacity and changing the culture of primary care towards reflective inquiring practice: the experience of West London Research Network (WeLReN). Journal of Interprofessional Care. 2001, 15: 133-139. 10.1080/13561820120039865.

Rowlands G, Crilly T, Ashworth M, Mager J, Johns C, Hilton S: Linking research and development in primary care: primary care trusts, primary care research networks and primary care academics. Primary Care Research and Development. 2004, 5: 255-263. 10.1191/1463423604pc201oa.

Davies S: R&D for the NHS - Delivering the research agenda. 2005, London, National Coordinating Centre for Research Capacity Development.

Department of Health: Best Research for Best Health: A New National Health Research Strategy. The NHS contribution to health research in England: A consultation. 2005, London, Department of Health.

Buxton M, Hanney S, Jones T: Estimating the economic value to societies of the impact of health research: a critical review. Bulletin of the World Health Organisation. 2004, 82: 733-739.

Department of Health: The NHS as an innovative organisation: A framework and guidance on the management of intellectual property in the NHS. 2002, London, Department of Health.

Sarre G: Trent Focus: Supporting research and development in primary care organisations: report of the capacity and activity in research project (CARP). 2003.

Department of Health: Research Governance Framework for Health and Social Care. 2001, London, Department of Health.

Hill J, Foster N, Hughes R, Hay E: Meeting the challenges of research governance. Rheumatology. 2005, 44: 571-572. 10.1093/rheumatology/keh579.

Shaw S: Developing research management and governance capacity in primary care organizations: transferable learning from a qualitative evaluation of UK pilot sites. Family Practice. 2004, 21: 92-98. 10.1093/fampra/cmh120.

Pre-publication history

The pre-publication history for this paper can be accessed here: http://www.biomedcentral.com/1471-2296/6/44/prepub


Acknowledgements

My warm thanks go to my colleagues in the primary care group of the Trent RDSU for reading and commenting on earlier drafts of this paper, and for their continued support in practice.

Author information

Authors and Affiliations

Primary Care and Social Care Lead, Trent Research and Development Unit, formerly, Trent Focus Group, ICOSS Building, The University of Sheffield, 219 Portobello, Sheffield, S1 4DP, UK


Corresponding author

Correspondence to Jo Cooke.

Additional information

Competing interests

The author(s) declare that they have no competing interests.


Rights and permissions

Open Access This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License ( https://creativecommons.org/licenses/by/2.0 ), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


About this article

Cite this article

Cooke, J. A framework to evaluate research capacity building in health care. BMC Fam Pract 6, 44 (2005). https://doi.org/10.1186/1471-2296-6-44


Received: 12 June 2005

Accepted: 27 October 2005

Published: 27 October 2005

DOI: https://doi.org/10.1186/1471-2296-6-44


  • Capacity Building
  • Research Capacity
  • Research Skill
  • Traditional Outcome
  • Research Capacity Building



What does ‘strengthen research capacity’ actually mean, and how can we do it?

Leaders of research consortia face a difficult task in carrying out research and improving research capacity, but embedding a specialist team to make recommendations provides great gains

Imelda Bates, Justin Pulford, Lorelei Silvester


The global crises of humanitarian disasters, climate change and global health demand a concerted international effort. But poorer countries, while facing the most acute crises, have the least capacity to play their part in the global research effort. In these countries, organisations that produce high-quality research are authorities’ go-to centres for solving local and national problems. Consequently, research funders with a development agenda are increasingly requiring their large research programmes – often operating through multi-partner international research consortia – to have dual goals: to carry out research and contribute to strengthening organisations’ research capacity.

Research consortia bring together several organisations, often across different countries, to tackle a specific complex research problem over, say, three to five years. The partner organisations within a consortium are strategically selected by the consortium leaders to have diverse but complementary expertise that can be applied to the research problem. The partner organisations selected for an international research consortium are predominantly university departments or research institutes, but may also include, for example, non-governmental organisations (NGOs), government agencies or commercial companies.


The size and breadth of consortia mean that, in addition to doing the research, they can use their pooled resources to strengthen the infrastructure and workforce of their less-well-resourced partner organisations. This makes research consortia an attractive model for funders that invest in enhancing research capacity. For example, senior researchers in a consortium can provide training and mentoring in many disciplines and research methods, while equipment and technical skills can be shared among the partner organisations, and gaps in organisations’ professional research services (such as finance and IT) can be filled by consortia-funded staff.

Excellent researchers often become consortium leaders because they are experts in their research topics; they are also likely to be good at supervising PhD students and early career researchers. They may even be great communicators or managers. But they are unlikely to also have the qualitative skills and in-depth knowledge of concepts and tools needed to transform organisations’ research systems. We – rightly – expect research outputs to be excellent; surely we should expect the systems and processes producing that research to be excellent too? Excellent in that they are conducive to the well-being of the whole research workforce and that they favour collegiality and impactful research over “publish or perish” drivers.

Yet, practical guidance to help consortium leaders assess and enhance research capacity is scanty and rarely based on evidence. The evidence that is available is generally of poor quality and peppered with anecdotal self-assessments. It is scattered across disciplines and heavily biased towards enhancing individuals’ research skills rather than organisational change.

Researchers’ difficulties in chasing dual goals are compounded by funders’ reticence to be explicit about their expectations for “strengthening research capacity” and by opaqueness about how they will evaluate any improvements. For the most part, this is all left to the research leaders’ discretion.

Funders may also be struggling with how to be more decisive. There is no agreement on or consensus understanding of “research capacity”, even within the research community. There are almost no practical and validated indicators they can use to judge success – perhaps because minimal research has been commissioned to find out. 

This ambiguity on the part of funders means that, instead of their programmes being used as an opportunity to learn how to do better, there is ongoing reinvention of many wheels and only snail’s pace forward motion. Not optimising learning opportunities wastes time and money. Worse still, it risks raising, and then not fulfilling, the expectations of partner organisations. Disillusionment and distrust may follow.

It is not difficult – or even expensive – to break the cycle of over-ambitious expectations being placed on research consortia to transform organisations’ research systems, which leads to the current underwhelming outcomes. But it will require a seismic shift in how consortia-based programmes are conceived and operate.

First, consortia leaders – who in our experience are absolutely committed to facilitating better research systems – should be left to focus on their research and their consortia.

Second, a specialist team should be embedded within programmes to sensitively listen to everyone’s voice, to learn what works and find out what doesn’t and why. This team should have the tools and skills to systematically assess research capacity and, in partnership with organisations, develop bespoke action plans to fill the gaps. Team members should have expertise in research systems and social science and be up to date with the latest evidence for strengthening organisations’ research capacity. They should be respectful of different cultures and contexts and be skilled in conducting interviews and analysing qualitative data.

The role of this embedded team is to make real-time, impartial and research-informed recommendations about how to improve approaches to strengthening research systems within and beyond a programme’s lifetime. Programme managers, funders and consortium leaders can then work together to make adjustments based on the team’s recommendations. This clear demonstration of responsiveness to feedback by consortium and programme leaders builds trust and also helps make consortium members feel valued. They become empowered to drive more changes.

Having such “learning teams” embedded in multiple programmes will produce a step change in the speed and breadth of new global knowledge. They add substantial value for little additional cost. Their outputs will quickly help answer key questions such as how best to transform organisations’ research systems, what works in what situations and how to sustain the benefits. They will also highlight the breadth of research strengths in the organisations – information that public policy units can use to raise their organisations’ profile with policymakers and demonstrate their role as a key player in their local economy.

Embedded learning teams do work in practice. But there are some underlying principles to be aware of. They should sit at the interface of the consortia members, programme managers and research funders – but operate independently of them all. They should have unfettered access to consortia members. They need to take a systems approach, be trustworthy and respectful and be good at gathering information from interviews. Their recommendations must promote sustainability of capacity gains and guide decision-making – their methods must therefore stand up to scrutiny so that their findings are robust and publishable. Rapid mechanisms must be in place across the programme for acting on the team’s recommendations. Most importantly, the learning and subsequent actions should be focused on the needs of the beneficiary organisations. The actions should be led, owned and sustained by them.

Research consortia could be a powerful mechanism for sustainably strengthening organisations’ research capacity and for creating supportive environments for their research workforce. To realise this potential, funders should be more proactive in requiring consortia to carry out systematic, sustainable research capacity strengthening and in evaluating the outcomes. Consortium leaders need more practical support, including from an embedded specialist team, to fulfil these requirements. By working together, those who fund and lead research can accelerate learning about carrying out better research capacity strengthening, which will ultimately empower organisations in poorer countries to make a more equitable contribution to the national and global research effort.

Imelda Bates is the head of the Centre for Capacity Research, Liverpool School of Tropical Medicine. She focuses on enhancing research capacity in institutions and on highlighting and valuing the role that research support professionals and laboratories play in the research effort.

Justin Pulford is a senior lecturer and deputy head of the Centre for Capacity Research, Liverpool School of Tropical Medicine. His primary interest is the production and uptake of evidence in support of research systems strengthening.

Lorelei Silvester is programmes manager at the Centre for Capacity Research, Liverpool School of Tropical Medicine. She is experienced in collaborating with partner institutions to identify research capacity gaps and helps to develop and monitor robust, practical action plans that deliver tangible benefits.

They are all members of the Universities Policy Engagement Network (UPEN).




Measuring the outcome and impact of research capacity strengthening initiatives: A review of indicators used or described in the published and grey literature

Justin Pulford 1, Natasha Price, Jessica Amegee Quach, Imelda Bates

1 Department of International Public Health, Liverpool School of Tropical Medicine, Liverpool, L3 5QA, UK

Associated data

  • Pulford J, Price N, Amegee J, et al.: List of RCS Outcome Indicators.xlsx. Measuring the outcome and impact of research capacity strengthening initiatives: A review of indicators used or described in the published and grey literature - Full listing of retrieved RCS indicators. V1 ed: Harvard Dataverse;2020.

Underlying data

Harvard Dataverse: Measuring the outcome and impact of research capacity strengthening initiatives: A review of indicators used or described in the published and grey literature - Full listing of retrieved RCS indicators. https://doi.org/10.7910/DVN/K6GIGX 18 .

This project contains the following underlying data:

  • List of RCS Impact Indicators
  • List of RCS Outcome Indicators
  • List of RCS Output Indicators
  • List of Source Documents

Data are available under the terms of the Creative Commons Zero "No rights reserved" data waiver (CC0 1.0 Public domain dedication).

Abstract

Background: Development partners and research councils are increasingly investing in research capacity strengthening initiatives in low- and middle-income countries to support sustainable research systems. However, there are few reported evaluations of research capacity strengthening initiatives and no agreed evaluation metrics.

Methods: To advance progress towards a standardised set of outcome and impact indicators, this paper presents a structured review of research capacity strengthening indicators described in the published and grey literature.

Results: We identified a total of 668 indicators of which 40% measured output, 59.5% outcome and 0.5% impact. Only 1% of outcome and impact indicators met all four quality criteria applied. A majority (63%) of reported outcome indicators clustered in four focal areas, including: research management and support (97/400), the attainment and application of new research skills and knowledge (62/400), research collaboration (53/400), and knowledge transfer (39/400).

Conclusions: Whilst this review identified few examples of quality research capacity strengthening indicators, it has identified priority focal areas in which outcome and impact indicators could be developed as well as a small set of ‘candidate’ indicators that could form the basis of development efforts.

Introduction

Research capacity strengthening (RCS) has been defined as the “process of individual and institutional development which leads to higher levels of skills and greater ability to perform useful research” 1 . National capacity to generate robust, innovative and locally appropriate research is considered essential to population health 2 , 3 and socioeconomic development 4 , 5 . However, wide global disparities in research capacity and productivity currently exist: South Asian countries account for 23% of the world’s population yet produced less than 5% of the global output of scientific publications in 2013 6 ; and sub-Saharan Africa (accounting for 13% of the global population) contributes 1% of global investment in research and development and holds 0.1% of global patents 6 . Accordingly, international development partners and research funding bodies are increasingly investing in RCS initiatives in low- and middle-income countries (LMICs). The UK Collaborative on Development Research predicts the United Kingdom’s total aid spend on research will rise to £1.2 billion by 2021 7 , a large proportion of which will be direct or indirect investment in RCS in LMICs. The total global spend on RCS in LMICs, while not yet calculated, is likely to be many times this figure.

Despite this substantial investment, few robust evaluations of RCS initiatives in LMIC contexts have been presented in the published or grey literatures, with the available evidence base characterised by reflective, largely qualitative individual case studies or commentaries 8 . RCS evaluation frameworks have been described 9 – 11 , but a comprehensive set of standard outcome or impact indicators has not been agreed and common indicators are used inconsistently. For example, publication count has been used as both an output 12 and an outcome indicator 13 , sometimes with 14 and sometimes without 10 accounting for publication quality.

The dearth of robust RCS programme evaluation and, more fundamentally, of robust evaluation metrics available for consistent application across RCS programmes has contributed to a paradoxical situation in which investments designed to strengthen the quantity, quality and impact of locally produced research in LMIC settings are themselves hindered by a lack of supporting evidence. As a substantial proportion of RCS investment is derived from publicly funded development assistance 15 – 17 , ensuring the means to reliably evaluate the impact and value for money of research and health system investments assumes even greater importance.

This paper aims to advance progress towards the establishment of a standardised set of outcome and impact indicators for use across RCS initiatives in LMIC contexts. As a first step towards this goal, a systematic review of RCS outcome and impact indicators previously described in the published and grey literatures is presented. The review findings highlight the range, type and quality of RCS indicators currently available and allow inconsistencies, duplications, overlaps and gaps to be identified. These results may then be used to inform planning and decision making regarding the selection and/or development of standard RCS evaluation metrics. In the interim, the resulting list of indicators may also serve as a useful resource for RCS programme funders, managers and evaluators as they design their respective monitoring and evaluation frameworks.

Search strategy and study selection

Peer reviewed publications were sought via the following databases: PubMed, Global Health, CINAHL Complete and International Bibliography of the Social Sciences (IBSS). The search was limited to English language publications and was conducted using the keywords: (research capacity) AND (develop* OR build* OR strengthen*) AND (indicator) AND (monitoring OR evaluation). The search was conducted without date limitations up until March 2018. Following removal of duplicates, all retrieved publications were subject to an initial review of the title, abstract and listed keywords. Publications that met, or were suggestive of meeting, the inclusion criteria were then subjected to full text review. Publications subjected to full text review met the inclusion criteria if they: were peer-reviewed; pertained to ‘research capacity’ (as either a primary or secondary focus); and included at least one output, outcome or impact indicator that has been used to measure research capacity or was proposed as a possible measure of research capacity.
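For illustration, the reported database query can be reproduced programmatically. The sketch below uses Biopython's Entrez wrapper around the NCBI E-utilities, which is our assumption rather than the authors' stated tooling; the query string and the March 2018 cut-off mirror the description above, while the contact email and retmax value are placeholders.

```python
# A minimal sketch of the reported PubMed search using Biopython's Entrez
# module (our choice of client; the authors do not state their tooling).
# Requires: pip install biopython
from Bio import Entrez

Entrez.email = "someone@example.org"  # placeholder; NCBI requires a contact email

# Boolean query mirroring the keywords reported above
query = ('("research capacity") AND (develop* OR build* OR strengthen*) '
         'AND (indicator) AND (monitoring OR evaluation)')

# E-utilities requires mindate and maxdate together; the early mindate
# approximates "no date limitation" and maxdate applies the March 2018 cut-off
handle = Entrez.esearch(db="pubmed", term=query, retmax=500,
                        datetype="pdat", mindate="1800/01/01",
                        maxdate="2018/03/31")
record = Entrez.read(handle)
handle.close()

print(f"{record['Count']} records found")
print("First PMIDs:", record["IdList"][:5])
```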

The search was supplemented by a manual review of the references listed in each paper that met the final inclusion criteria and by a citation search using first author names for all papers which met the final inclusion criteria from both the initial electronic and supplementary manual searches. A further 19 papers which met the inclusion criteria were identified in this way and included in the review.

Relevant grey literature was then sought via the following databases: Google Advanced, BASE, Grey Literature and OpenGrey. The same search terms and inclusion criteria as described above were used. This search was supplemented by a request circulated across the authors’ personal networks for relevant research reports pertaining to RCS evaluation that might fit the inclusion criteria. Seven reports were identified in this way, resulting in a final sample of 25 publications and seven reports. Figure 1 depicts the overall process and outcomes from this search strategy.

Figure 1. Flow diagram of the search strategy process and outcomes.

Data extraction

Research capacity strengthening indicator descriptions and definitions were extracted from each publication/report and recorded verbatim in an Excel spreadsheet (see Underlying data ) 18 . Other information recorded alongside each indicator included: the type of indicator (output, outcome or impact) ( Box 1 ); the level of measurement (individual research capacity; institutional research capacity; or systemic research capacity); source information (author, year and title of publication/report); and a brief summary of the context in which the indicator was applied. Designation of indicator type (output, outcome or impact) and level of measurement (individual, institutional or systemic) were based on those ascribed by the author/s when reported. Where indicator type and measurement level were not reported, we used our own judgement drawing on the reported context from the respective publication/report.

Some publications/reports used the same indicators across different levels (i.e. as both an individual and an institutional measure) and in these cases we reported the indicator at a single level only, based on apparent best fit. However, if the same publication reported the same indicator as both an output and an outcome measure, then it was reported twice. Where different publications classified the same indicator differently (e.g. as an ‘output’ indicator in one publication and an ‘outcome’ indicator in another), we remained true to the texts and recorded each separately. Indicators that pertained to the evaluation of course materials or content (e.g. how useful were the PowerPoint slides provided?) were excluded from analysis, although indicators that focused on the outcome of course attendance were retained.
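As a concrete illustration of the extraction record described above, the sketch below models one row of the spreadsheet as a typed structure. The field names, enums and example entry are our own inventions for clarity; the authors worked in an Excel spreadsheet, not code.

```python
# A sketch of one extraction record as a typed structure.
from dataclasses import dataclass
from enum import Enum

class IndicatorType(Enum):
    OUTPUT = "output"      # directly controllable programme activity
    OUTCOME = "outcome"    # short- to mid-term change in behaviour/performance
    IMPACT = "impact"      # longer-term change tied to overarching aims

class Level(Enum):
    INDIVIDUAL = "individual"
    INSTITUTIONAL = "institutional"
    SYSTEMIC = "systemic"

@dataclass
class IndicatorRecord:
    description: str            # indicator text, recorded verbatim
    indicator_type: IndicatorType
    level: Level
    source: str                 # author, year and title of publication/report
    context: str                # brief summary of the context of application

# Illustrative entry only (not drawn from the underlying data)
example = IndicatorRecord(
    description="Number of competitive grants won per year",
    indicator_type=IndicatorType.OUTCOME,
    level=Level.INDIVIDUAL,
    source="Hypothetical et al. (2015), 'Example fellowship evaluation'",
    context="Post-fellowship follow-up survey",
)
```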

Defining output, outcome, and impact indicators

Output indicators - defined as measures of programme or project activities that are directly controllable by the RCS initiative (e.g. number of infectious disease experts from country X trained in academic writing).

Outcome indicators - defined as measures of change in behaviour or performance, in the short- to mid-term, that could reasonably be attributed to the RCS initiative in full or large part (e.g. number of manuscripts published by infectious disease experts from country X following an academic writing course).

Impact indicators - defined as measures of longer-term change that may not be directly attributable to the RCS initiative but directly relate to the overarching aims of the RCS initiative (e.g. reduction in infectious disease mortality in country X).

Data analysis

Once all listed indicators from across the 32 publications and reports had been entered into the Excel spreadsheet, the research team coded all outcome and impact indicators according to their respective focus (i.e. the focus of the indicated measure, such as publication count or grant submissions) and quality. Output indicators were excluded from further analysis. Indicators were coded independently by two researchers, who checked consistency and resolved discrepancies through discussion and, if necessary, by bringing in a third reviewer. ‘Focus’ codes were emergent and were based on the stated or implied focal area of each indicator. ‘Quality’ was coded against four pre-determined criteria: 1) a measure for the stated indicator was at least implied in the indicator description; 2) the measure was clearly defined; 3) the defined measure was sensitive to change; and 4) the defined measure was time-bound (thus, criterion 2 was only applied if criterion 1 was met, and criteria 3 and 4 were only applied if criterion 2 was met).
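The gating between criteria is easy to misread in prose, so a minimal sketch follows. The function and the example judgements are ours, not the authors' coding tool; the boolean inputs stand in for the two coders' consensus decisions.

```python
# A sketch of the gated quality coding: each criterion is only assessed if
# the previous one was met, so an indicator's score is cumulative (0-4).
def quality_score(implied_measure: bool,
                  clearly_defined: bool,
                  sensitive_to_change: bool,
                  time_bound: bool) -> int:
    """Return how many of the four gated quality criteria an indicator meets."""
    score = 0
    for met in (implied_measure, clearly_defined,
                sensitive_to_change, time_bound):
        if not met:
            break          # gating: later criteria are not applied
        score += 1
    return score

# e.g. "number of competitive grants won per year" meets all four criteria
assert quality_score(True, True, True, True) == 4
# a clearly defined but static measure scores 2; time-boundedness is not assessed
assert quality_score(True, True, False, True) == 2
```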

Type and level of identified indicators

We identified a total of 668 reported or described indicators of research capacity from across the 32 publications or reports included in the review. Of these, 40% (265/668) were output indicators, 59.5% (400/668) were outcome indicators and 0.5% (3/668) were impact indicators. A total of 34% (225/668) of these indicators were measures of individual research capacity, 38% (265/668) were measures of institutional research capacity and 21% (178/668) were systemic measures of research capacity. Figure 2 illustrates the spread of indicator type across these three categories by level. The full list of 668 indicators, inclusive of source information, is available as Underlying data 18 .
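Assuming the extracted records were loaded into a table with columns like those sketched earlier, the type-by-level breakdown can be tabulated in a few lines of pandas. The file and column names here are hypothetical.

```python
# A pandas sketch of the type-by-level tabulation of extracted indicators.
import pandas as pd

df = pd.read_excel("rcs_indicators.xlsx")  # hypothetical file; requires openpyxl

# Counts of indicators by type and level, with row/column totals
counts = pd.crosstab(df["indicator_type"], df["level"], margins=True)
print(counts)

# Percentage share of each indicator type (cf. the 40% output, 59.5% outcome
# and 0.5% impact figures reported above)
print((df["indicator_type"].value_counts(normalize=True) * 100).round(1))
```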

Figure 2. Spread of indicator type (output, outcome, impact) across the three research capacity levels.

Outcome indicators

The 400 outcome indicators were subsequently coded to nine thematic categories and 36 sub-categories, as described in Box 2 . The categories and the total number of indicators in each (across all three levels) were as follows: research management and support (n=97), skills/knowledge (n=62), collaboration activities (n=53), knowledge translation (n=39), bibliometrics (n=31), research funding (n=25), recognition (n=11), infrastructure (n=5) and other (n=77). Figure 3 depicts the number of outcome indicators by category and level.

Outcome indicator categories and sub-categories

1. Bibliometrics : Indicators relating to the development, publication and use of written outputs such as peer reviewed journal articles.

    Sub-categories: peer reviewed publication; publication (any form of publication other than peer review); reference (e.g. records of citations); quality (e.g. rating by impact factor).

2. Collaboration Activities : Indicators relating to networking, collaborating, mentoring type activities.

    Sub-categories: engagement (evidence of working collaboratively); establishment (creating new networks, collaborations); experience (e.g. perception of equity in a specific partnership).

3. Infrastructure : Indicators relating to research infrastructure including buildings, labs, equipment, libraries and other physical resources.

    Sub-categories: suitability (the provision of adequate facilities for research); procurement (e.g. purchase of laboratory equipment).

4. Knowledge translation : Indicators relating to the dissemination of research and knowledge, including conferences, media and public education/outreach.

    Sub-categories: dissemination (examples of research being communicated to different audiences); influence (using research knowledge to influence policy, the commissioning of new research, etc).

5. Recognition : Indicators relating to professional or institutional esteem.

    Sub-categories: appointment (e.g. appointed to leadership positions); awards (i.e. receiving an award); reputation (e.g. invited keynote address).

6. Research funding : Indicators relating to funding for research.

    Sub-categories: funds received (e.g. competitive grants); allocation (e.g. allocate budget to support local research); expenditure (use of research funds); access (access to research funding/competitive awards).

7. Research Management & Support (RMS) : Indicators relating to the administration of university or research institution systems that make research possible (e.g. finance, ICT and project management).

    Sub-categories: career support (e.g. working conditions, salary and career development); organisation capacity (to manage/support research); research investment; resource access (e.g. to IT, libraries etc); sustainability (of RMS); governance (e.g. formation of ethics review committees); national capacity (to support research); national planning (e.g. developing national research priorities).

8. Skills/training activities : Indicators relating to training and educational activities relating to research or research subject area knowledge.

    Sub-categories: attainment (of new skills); application (of new skills); transfer (of new skills).

9. Other : Indicators relating to any area other than the eight described above.

    Sub-categories: research quality (e.g. quality of work undertaken); research production (e.g. increase in research activity); research process (e.g. inclusion of new methods or techniques); research workforce (e.g. growth in number of researchers); career advancement (e.g. promotion); equity (e.g. gender equity); miscellaneous.

Figure 3. Number of outcome indicators by category and level.

Tables 1 – 3 present the number of outcome indicators in each sub-category, as well as an example indicator for each, by the three respective research capacity levels (individual, institutional and systemic). The category and sub-category designations assigned to all 400 outcome indicators are available as Underlying data 18 .

Table 4 presents the percentage of outcome indicators that met each of the four quality measures, as well as the percentage that met all four quality criteria, by indicator category. As shown, all outcome indicators implied a measurement focus (e.g. received a national grant or time spent on research activities), 21% presented a defined measure (e.g. had at least one publication), 13% presented a defined measure sensitive to change (e.g. number of publications presented in peer reviewed journals) and 5% presented a defined measure that was sensitive to change and time-bound (e.g. number of competitive grants won per year). Only 1% (6/400) of outcome indicators met all four quality criteria: 1) Completed research projects written up and submitted to peer reviewed journals within 4 weeks of the course end; 2) Number of competitive grants won per year (independently or as a part of a team); 3) Number and evidence of projects transitioned to and sustained by institutions, organizations or agencies for at least two years; 4) Proportion of females among grantees/contract recipients (over total number and total funding); 5) Proportion of [Tropical Disease Research] grants/contracts awarded to [Disease Endemic Country] (over total number and total funding); and 6) Proportion of [Tropical Disease Research] grants/contracts awarded to low-income countries (over total number and total funding). Indicators pertaining to research funding and bibliometrics scored highest on the quality measures, whereas indicators pertaining to research management and support and collaboration activities scored the lowest.

Impact indicators

The three impact indicators were all systemic-level indicators and were all coded to a ‘health and wellbeing’ theme; two to a sub-category of ‘people’, one to a sub-category of ‘disease’. The three impact indicators were: 1) Contribution to health of populations served; 2) Impact of project on patients' quality of life, including social capital and health gain; and 3) Estimated impact on disease control and prevention. All three met the ‘implied measure’ quality criteria. No indicators met any of the remaining three quality criteria.

This paper sought to inform the development of standardised RCS evaluation metrics through a systematic review of RCS indicators previously described in the published and grey literatures. The review found a spread between individual- (34%), institutional- (38%) and systemic-level (21%) indicators, implying both a need for and an interest in RCS metrics across all levels of the research system. This is consistent with contemporary RCS frameworks 10 , 19 , although the high proportion of institutional-level indicators is somewhat surprising given the continued predominance of individual-level RCS initiatives and activities such as scholarship provision, individual skills training and research-centred RCS consortia 20 .

Outcome indicators were the most common indicator type identified by the review, accounting for 59.5% (400/668) of the total. However, this large number of outcome indicators was subsequently assigned to a relatively small number of post-coded thematic categories (n=9), suggesting considerable overlap and duplication among the existing indicator stock. Just under two-thirds of the outcome indicators pertained to four thematic domains (research management and support, skills/knowledge attainment or application, collaboration activities and knowledge translation), suggesting an even narrower focus in practice. It is not possible to determine on the basis of this review whether the relatively narrow focus of the reported indicators reflects greater interest in these areas or practical issues pertaining to outcome measurement (e.g. these domains may be inherently easier to measure); however, if standardised indicators in these key focal areas are identified and agreed, then they are likely to hold wide appeal.

The near absence of impact indicators is particularly noteworthy, highlighting both a lack of long-term evaluation of RCS interventions 8 and the inherent complexity of attempting to evaluate a multifaceted, long-term, continuous process subject to a diverse range of influences and assumptions. Theoretical models for evaluating complex interventions have been developed 33 , as have broad guidelines for the applied evaluation of complex interventions 34 ; thus, the notion of evaluating the ‘impact’ of RCS investment is not beyond the reach of contemporary evaluation science, and evaluation frameworks tailored for RCS interventions have been proposed 11 . Attempting to measure RCS impact by classic, linear evaluation methodologies via precise, quantifiable metrics may not be the best path forward. However, the general dearth of any form of RCS impact indicator (as revealed in this review) or robust evaluative investigation 8 , 20 suggests an urgent need for investment in RCS evaluation frameworks and methodologies irrespective of typology.

The quality of retrieved indicators, as assessed by four specified criteria (measure for the stated indicator was implied by indicator description; measure clearly defined; defined measure was sensitive to change; and defined measure was time-bound) was uniformly poor. Only 1% (6/400) of outcome indicators and none of the impact indicators met all four criteria. Quality ratings were highest amongst indicators focused on measuring research funding or bibliometrics and lowest amongst research management and support and collaboration activities. This most likely reflects differences in the relative complexity of attempting to measure capacity gain across these different domain types; however, as ‘research management and support’ and ‘collaboration activity’ indicators were two of the most common outcome indicator types, this finding suggests that the quality of measurement is poorest in the RCS domains of most apparent interest. The quality data further suggest that RCS indicators retrieved by the review were most commonly (by design or otherwise) ‘expressions’ of the types of RCS outcomes that would be worthwhile measuring as opposed to well defined RCS metrics. For example, ‘links between research activities and national priorities’ 19 or ‘ease of access to research undertaken locally’ 22 are areas in which RCS outcome could be assessed, yet precise metrics to do so remain undescribed.

Despite the quality issues, it is possible to draw potential ‘candidate’ outcome indicators for each focal area, and at each research capacity level, from the amalgamated list (see Underlying data ) 18 . These candidate indicators could then be further developed or refined through remote decision-making processes, such as those applied to develop other indicator sets 37 , or through a dedicated conference or workshop as often used to determine health research priorities 38 . The same processes could also be used to identify potential impact indicators and/or additional focal areas and associated indicators for either outcome or impact assessment. Dedicated, inclusive and broad consultation of this type would appear to be an essential next step towards the development of a comprehensive set of standardised, widely applicable RCS outcome and impact indicators given the review findings.

Limitations

RCS is a broad, multi-disciplinary endeavour without a standardised definition, lexicon or discipline-specific journals 8 . As such, relevant literature may have gone undetected by the search methodology. Similarly, it is quite likely that numerous RCS outcome or impact indicators exist solely in project specific log frames or other forms of project-specific documentation not accessible in the public domain or not readily accessible by conventional literature search methodologies. Furthermore, RCS outcome or impact indicators presented in a language other than English were excluded from review. The review findings, therefore, are unlikely to represent the complete collection of RCS indicators used by programme implementers and/or potentially accessible in the public domain. The quality measurement criteria were limited in scope, not accounting for factors such as relevance or feasibility, and were biased towards quantitative indicators. Qualitative indicators would have scored poorly by default. Nevertheless, the review findings represent the most comprehensive listing of currently available RCS indicators compiled to date (to the best of the authors’ knowledge) and the indicators retrieved are highly likely to be reflective of the range, type and quality of indicators in current use, even if not identified by the search methodology.

Numerous RCS outcome indicators are present in the published and grey literature, although across a relatively limited range. This suggests significant overlap and duplication in currently reported outcome indicators, as well as common interest in key focal areas. Very few impact indicators were identified by this review, and the quality of all indicators, both outcome and impact, was uniformly poor. Thus, on the basis of this review, it is possible to identify priority focal areas in which outcome and impact indicators could be developed, namely: research management and support, the attainment and application of new skills and knowledge, research collaboration and knowledge transfer. However, good examples of indicators in each of these areas now need to be developed. Priority next steps would be to identify and refine standardised outcome indicators in the focal areas of common interest, drawing on the best candidate indicators among those currently in use, and to propose potential impact indicators for subsequent testing and application.


Funding Statement

This work was funded by the American Thoracic Society.

The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Reviewer response for version 1

Meriel Flint-O'Kane

1 Faculty of Public Health and Policy, London School of Hygiene and Tropical Medicine, London, UK

Summary: This paper provides a review and analysis of the quality and suitability of M&E metrics for funding that is allocated to strengthening research capacity in LMICs. Published and grey literature was reviewed to identify indicators used to measure the outputs, outcomes and impacts of relevant programmes, and the findings were assessed in terms of content and quality. The authors conclude that the outcome indicators identified were of low quality and that impact indicators are almost always missing from RCS MEL frameworks, and they recommend further work to develop appropriate indicators to measure the outcomes and impacts of research capacity strengthening programmes/activities. Through the review of existing outcome indicators the authors have identified four focal areas against which indicators could be developed.

  • The search strategy and study selection is clearly described and links to source data are available. Data extraction and analysis methods are also clearly described.
  • No major points to address. This work is by a leading team in the field of RCS research and makes a useful contribution to the literature in providing a thorough review of indicators used to monitor and evaluate work funded to strengthen research capacity.
  • Though the article is not focused on health research, health is specifically referred to in a few places throughout the article e.g. line 4 (and corresponding references) of the introduction, Research Funding example indicators in Table 2, RMS example indicators in Table 3, numbers 5 and 6 of the 6 outcome indicators meeting all four quality criteria refer to TDR, and the impact indicators are all acknowledged as being specific to health and wellbeing. It would be interesting to understand which other research disciplines featured in the literature reviewed, and the spread of results across disciplines in order that analysis and findings could indicate if there is variety between and/or within disciplines in approaches to MEL for RCS and what can be learnt from this.
  • No background, references or justification is given for the pre-determined 'quality' criteria.
  • The authors note in 'Limitations' that project documents were not available in the public domain and documents not in English were excluded. Further reflection on the extent to which log frames, ToCs and other MEL documents for programmes that have RCS as a primary or secondary outcome would be in the public domain could be helpful e.g. is it common for delivery partners of RCS programmes to make their MEL documents publicly available? If not, are these indicators representative of those currently/recently being used by actors in the RCS programme delivery space or do they represent a subset that is more likely to have publicly available data?

Is the work clearly and accurately presented and does it cite the current literature?

If applicable, is the statistical analysis and its interpretation appropriate?

Not applicable

Are all the source data underlying the results available to ensure full reproducibility?

Is the study design appropriate and is the work technically sound?

Are the conclusions drawn adequately supported by the results?

Are sufficient details of methods and analysis provided to allow replication by others?

Reviewer Expertise:

Global health, research capacity strengthening, higher education capacity

I confirm that I have read this submission and believe that I have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

Francesco Obino

1 Global Development Network, New Delhi, Delhi, India

Daniel Fussy

The article is a timely contribution to an urgent question: how do we know if research capacity strengthening is working? The analysis of the problem (a. the lack of a shared reference framework for evaluating research capacity strengthening, which in turn implies that b. the scope for systematic and cumulative learning remains limited) is convincing and valid. The methodology is clearly explained and up to existing standards and expectations for this kind of exercise. The conclusions are straightforward, and the limitations well articulated (the focus on English and the bias towards quantitative measures being the most important).

A few overall comments for the authors, keeping in mind the 'agenda' the article is trying to support (i.e. developing good examples of RCS indicators), and its potential uptake:

  • RCS lacks definition too, not just indicators. The article does not differentiate between research capacity strengthening done at the national level and at the international level, or in different fields (health sciences vs social sciences, etc.). While this is key to the aim of the paper to 'survey' existing indicators, the lack of solid evaluation of RCS can also be understood as the result not so much of 'underdevelopment' of the field, but of its overdevelopment in the absence of a shared definition of what RCS is. In this sense, putting all RCS (indicators) in the 'same box' might in fact reinforce the confusion around what is there to be measured, and how. International donor-funded, project-based RCS efforts differ (in scope, objectives and means) from the RCS effort of a science council or a local research training institution - despite overlaps. Often, the difference in objectives might make indicators hard to include in the same box. In this sense, the paper should acknowledge the lack of a shared definition of RCS, and the limitation it poses to an analysis of indicators. For this specific article, it might be useful to define RCS as an international, donor-funded, project-based set of activities. Arguably, the very need for a discussion on RCS evaluation is largely driven by the fact that RCS is part of the evaluation-heavy international donor sector. This might help further define the relevant timeframe for the search, and situate RCS historically.
  • RCS is more than the sum of quality outputs. I wonder about the lack of discussion of 'process indicators', given the nature of RCS as a set of activities. These are notoriously difficult (but not impossible) to use in donor-funded, project-based, time-bound RCS efforts, but might be very relevant to describe change and ultimately impact.
  • RCS impacts research systems, policy, or development? When it comes to discussion of impacts and impact indicators, the lack of definition of RCS becomes an insurmountable limitation. The study could acknowledge the need for unpacking the link between output, outcome and impact measurement/definition (particularly in light of lack of shared definition of RCS) in internationally funded programs, as a complementary exercise to the surveying of indicators. The fact that the very few impact indicators identified reveal an expectation for RCS to deliver impact on population health outcomes is a good example of the limitations imposed by lack of clear definitions.
  • How important is the UK? Given the global audience of the piece, it might be useful to explain why the figures relating to projected RCS funding from the UK are significant to describe larger trends - particularly if figures include both 'direct' and 'indirect' RCS.

Research capacity building methodologies, political theory, international relations

We confirm that we have read this submission and believe that we have an appropriate level of expertise to confirm that it is of an acceptable scientific standard.

Peter Taylor

1 Institute of Development Studies, Brighton, UK

The article addresses an issue that is receiving renewed interest in recent years - research capacity strengthening (RCS), and the particular challenge of evaluating outputs, outcomes and impacts of RCS initiatives.

The study undertook a structured review of RCS indicators in the published and grey literature. Key findings included the identification of rather few examples of quality RCS indicators, with emphasis on four focal areas (research management and support, skill and knowledge development, research collaboration and knowledge transfer). The study concludes that there is significant room for development of indicators, and consequently the potential adoption of these to allow a more systematic approach to RCS approaches and to their subsequent evaluation.

The study is clearly presented and has a solid methodology. The validity of the findings rests on the extent to which the systematic review identified published material that engages with this issue. As the authors note, it is likely that there is a wider body of grey literature in the form of project and program reports that were not located through the search. This suggests that there is need for more published work on this topic (making the paper therefore relevant and useful), and perhaps reinforces a wider view that many RCS efforts are inadequately evaluated (or not evaluated at all). An earlier World Bank Institute report on evaluation of training (Taschereau, 2010 1 ), for example, had highlighted challenges in evaluating the impact of training and institutional development programs. The study refers briefly to RCS interventions, taking training as an example, but this related only to training, which makes up a small percentage of the overall efforts towards RCS.

It would be very interesting to situate this welcome study in the context of broader discussions and debates on RCS, particularly as a contribution to theory and practice at strengthening research capacity at individual, organizational and system levels. The latter of these is the most complex to conceptualise, to implement, and to measure, and is receiving valuable attention from RCS stakeholders such as the Global Development Network (GDN, 2017 2 ) through their Doing Research Program - a growing source of literature for subsequent review.

As the authors of the study note, there is a danger in identifying RCS indicators that are seen as having universal application and attractiveness because they are relatively easy to measure. There is an equal, related danger that, due to relative measurability, a majority of RCS interventions become so streamlined in terms of their approach that they begin to follow recipe or blueprint approaches.

The study is agnostic on different approaches to RCS. Work undertaken by the Think Tank Initiative (TTI) for example (Weyrauch, 2014 3 ) has demonstrated a range of useful RCS approaches, including flexible financial support, accompanied learning supported by trusted advisors/program officers, action learning, training and others. In a final evaluation of the Think Tank Initiative (Christoplos et al. , 2019 4 ), training was viewed as having had the least value amongst several intervention types in terms of RCS outcomes, whilst flexible financial support and accompanied learning processes were viewed as being significantly more effective. It would be interesting to identify indicators of outcomes or even impacts that might relate to different types of RCS interventions which were not included in the publications reviewed by this study.

A key indicator of RCS identified by the TTI evaluation, which interestingly does not appear explicitly in the indicator list of this study, was leadership. As the authors indicate, there are likely to be other valuable indicators not surfaced through this review and this requires more work.

This study offers a very important contribution to a field currently being reinvigorated and is highly welcome. Rather than being valued because it may potentially offer a future blueprint list of indicators (not least since, as the authors observe, the indicator list generated in this study is partial in comparison to a much wider potential range), its value lies particularly in its potential contribution to further debate and dialogue on the theory and practice of RCS interventions and their evaluation; this dialogue can in turn be further informed by access to a more diverse set of grey literature and by engagement with stakeholders who have experience and interest in strengthening this work. Hopefully the authors of this study, and other researchers, will continue this important line of work and promote ongoing discussion and debate.

International development, organizational learning and development, research capacity strengthening

Erica Di Ruggiero

1 Dalla Lana School of Public Health, University of Toronto, Toronto, ON, Canada

  • The article outlines clear research questions and methods and provides a very useful summary of findings from a structured review of research capacity strengthening indicators for use in low- and middle-income countries (LMICs). Terminology is overall quite clearly defined, although greater precision would help in places, e.g. regarding the context for an indicator's application, and the interchangeable use of 'knowledge transfer' and 'knowledge translation'.
  • The definition of knowledge translation provided seems limiting and perhaps the authors meant only transfer. It would have been helpful to have some descriptive data on the sources of the publications/reports from which the indicators were derived (i.e. were they all published by academics vs. any from research funders). For example, it's unclear if any indicators developed by funders such as the International Development Research Centre and others that support LMIC research are included.
  • The limitations section is clear.
  • It would have been helpful to have the authors elaborate a bit more on the dearth of qualitative indicators, appreciating the fact that they would have 'scored poorly by default' because of the methodology used. Could the authors comment in the conclusion on areas for indicator development (like qualitative indicators and equity-related indicators - for example, I note that perception of equity in a specific partnership was part of the definition for collaboration and appears in the 'other' category, but to my knowledge equity didn't really appear elsewhere)?

Public health research; evaluation

  • Research article
  • Open access
  • Published: 15 September 2018

Research capacity building frameworks for allied health professionals – a systematic review

  • Janine Matus   ORCID: orcid.org/0000-0002-3067-8870 1 ,
  • Ashlea Walker 1 &
  • Sharon Mickan 1 , 2  

BMC Health Services Research volume  18 , Article number:  716 ( 2018 ) Cite this article

18k Accesses

73 Citations

45 Altmetric

Metrics details

Building the capacity of allied health professionals to engage in research has been recognised as a priority due to the many benefits it brings for patients, healthcare professionals, healthcare organisations and society more broadly. There is increasing recognition of the need for a coordinated multi-strategy approach to building research capacity. The aim of this systematic review was to identify existing integrated models and frameworks which guide research capacity building for allied health professionals working in publicly funded secondary and tertiary healthcare organisations.

A systematic review was undertaken searching five databases (Medline, CINAHL, Embase, AustHealth and Web of Science) using English language restrictions. Two authors independently screened and reviewed studies, extracted data and performed quality assessments using the Mixed Methods Appraisal Tool. Content and thematic analysis methods were used to code and categorise the data.

A total of 8492 unique records were screened by title and abstract, of which 20 were reviewed in full-text. One quantitative study and five qualitative studies were included, each of which described a research capacity building framework. Three interconnected and interdependent themes were identified as being essential for research capacity building: ‘supporting clinicians in research’, ‘working together’ and ‘valuing research for excellence’.

Conclusions

The findings of this systematic review have been synthesised to develop a succinct and integrated framework for research capacity building which is relevant for allied health professionals working in publicly funded secondary and tertiary healthcare organisations. This framework provides further evidence to suggest that research capacity building strategies are interlinked and interdependent and should be implemented as part of an integrated ‘whole of system’ approach, with commitment and support from all levels of leadership and management. Future directions for research include using behaviour change and knowledge translation theories to guide the implementation and evaluation of this new framework.

Trial registration

The protocol for this systematic review has been registered with PROSPERO. The registration number is CRD42018087476 .

Peer Review reports

There is a burgeoning interest in strategies to enhance research capacity building for healthcare professionals. The recent Strategic Review of Health and Medical Research in Australia (2013) recommended that research should be fundamentally embedded in the health system, and that the healthcare workforce should be involved in research to drive continuous improvement [ 1 ]. Research capacity building has been defined as “a process of developing sustainable abilities and skills enabling individuals and organisations to perform high quality research” [ 2 ], or “a process of individual and institutional development which leads to higher levels of skills and greater ability to perform useful research” [ 3 ].

While there is no single agreed upon definition of “allied health” in the international literature, allied health professions are commonly grouped together by exclusion from medical and nursing/midwifery, and include but are not limited to physiotherapy, occupational therapy, speech pathology, social work, psychology, podiatry and pharmacy [ 4 ]. The benefits of allied health professionals participating in research are manifold. At a clinician level, benefits include enhanced attitudes towards research [ 5 ], an increased uptake of research evidence into practice [ 6 , 7 ], and the development of critical thinking skills and a culture of evidence-based practice [ 8 ]. Clinicians who participate in research are also more likely to experience greater job satisfaction [ 9 , 10 ].

At a service level, having healthcare professionals involved in research may positively influence the infrastructure and processes of client care [ 11 ]. A sound base of high quality research evidence is needed to inform the delivery of evidence-based healthcare and strategic service planning and policy making [ 5 , 8 , 10 , 12 , 13 ]. An additional benefit is being able to evaluate and demonstrate the quality and efficiency of the healthcare services being provided [ 6 ]. This is especially a priority for the allied health workforce due to the relatively low level of evidence for many allied health interventions [ 8 , 10 , 14 ]. Allied health professionals need to produce research evidence demonstrating the efficiency and cost-effectiveness of their interventions and models of service delivery; otherwise they risk having aspects of their work delegated to the traditional medical and nursing professions, and being unable to maintain current roles, diversify into new areas or expand their scope of practice [ 6 , 8 ].

At a broader societal level, benefits of clinicians engaging in research include the potential of more successful translation and impact of research findings into clinical practice, thereby enhancing patient outcomes [ 15 , 16 , 17 ]. Indeed, having healthcare professionals involved in identifying research questions that arise from real-life problems and gaps in clinical practice and assisting with designing research methodologies may increase the likelihood that research projects will generate practical solutions which are readily translated into practice [ 17 ].

Previous research has demonstrated that allied health professionals are motivated to participate in research by intrinsic and extrinsic factors which align to these benefits. The most commonly reported motivators are to address problems in practice, build the evidence base to inform service delivery, provide the best possible care for patients and enhance their job satisfaction and career opportunities [ 6 , 10 , 18 , 19 ].

The aim of research capacity building in a healthcare setting is to strengthen health professionals’ existing clinical expertise with complementary research skills [ 8 ]. This enables them to contribute to the production of high-quality research which advances the knowledge base of their profession, demonstrates the effectiveness of interventions, influences funding bodies, and enables evidence-based practice [ 8 ]. Building research capacity may be targeted across three different levels including foundational skills in using research (e.g. understanding how to search for, appraise and consciously apply research evidence to inform practice), participating in research (e.g. assisting with participant recruitment and data collection) and leading research (e.g. developing research protocols and applying for funding).

Allied health professionals have been reported to have a high level of interest in undertaking research [ 20 , 21 , 22 ]. However, despite their interest and the recognised benefits, allied health research engagement remains limited due to a number of challenges and barriers including a lack of time and funding, other work roles taking priority, a lack of research skills and a lack of support from managers and colleagues [ 10 , 19 ]. As building allied health research capacity has been recognised as a priority [ 10 ], a range of different research capacity building approaches have been recommended and implemented across publicly funded healthcare organisations in Australia [ 9 , 23 ] and internationally [ 8 , 24 ].

Most of the extant literature describes single-strategy research capacity building initiatives, interventions or programs. Some of these strategies have been focussed at the level of individuals and teams, such as identifying those clinicians who express motivation and intention to do research and those who are seeking a challenge, improved job satisfaction or increased professional development opportunities [ 10 , 19 , 22 ] and providing these clinicians with protected time, education and training, resources and mentoring from more experienced researchers [ 10 , 18 , 22 , 25 , 26 , 27 ]. For example, a research internship model for podiatrists resulted in increased research output, as measured by the number of abstracts, publications and further research training [ 28 ].

Dedicated research leadership/facilitator or conjoint positions have been found to be associated with increased organisation and team domain scores on the Research Capacity and Culture tool, as well as increased research skills and outputs [ 7 , 29 , 30 ]. Similarly, academic-practice partnerships have been reported as an important strategy for increasing research capacity, engagement and output [ 10 , 27 , 31 , 32 ]. For example, a large proportion of research outputs by clinical staff within one large publicly funded health service were the result of work led by, or in collaboration with, academic partners [ 27 ].

Strategies which have been implemented at the level of the organisation include embedding research activities in strategic plans, visions, missions and values, developing targets or key performance indicators (KPIs) for research [ 19 ] and role descriptions to attract research interested and active applicants [ 10 ]. Organisation level strategies also include incorporating research into clinical roles, increasing funding for appropriate backfill of clinical positions, supporting staff with joint clinical and academic appointments [ 6 ] and creating opportunities to engage in research through secondment [ 6 , 8 , 12 , 27 ]. It has been suggested that organisations may benefit from strategically prioritising funding for those projects which have the greatest potential to directly impact on patient care [ 8 ].

Some authors have recognised that a single strategy approach is not sufficient, but that a “whole of organisation approach” or “whole of system approach” is required for building research capacity and culture in allied health [ 10 , 12 , 33 , 34 , 35 ]. A recent rapid review of allied health research frameworks has recommended multiple strategies across individual, organisational and policy levels to embed a culture of allied health research into healthcare services [ 36 ]. Authors have suggested that strategies are interlinked and interdependent, such that strategies implemented at one level can have an impact on other levels. Therefore, the use of coordinated and integrated multi-level strategies at individual, team, organisational and system levels has been recommended [ 18 , 25 , 33 , 37 ]. However, there currently is no single framework, model or set of recommendations to guide research capacity building approaches for allied health professionals in publicly funded secondary or tertiary healthcare settings.

The aim of this systematic review was to identify, appraise and synthesise existing models and frameworks which describe integrated and practical approaches to research capacity building for allied health professionals in publicly funded secondary or tertiary healthcare organisations. This review intended to search for both models and frameworks, the most common methods of conceptualising combinations of strategies. A model usually describes and guides the process of implementing an intervention, including a temporal sequence of steps, stages or phases of the process. In contrast, a framework usually identifies the hypothesised factors which may influence an outcome without describing the process for achieving this outcome. A framework may also provide a structure for planning and evaluating interventions. Neither models nor frameworks necessarily address the causal mechanisms of change [ 38 ]. The protocol for this systematic review has been registered with PROSPERO. The registration number is CRD42018087476.

Search methods

In collaboration with authors AW and JM, a senior librarian developed a detailed search strategy in the following five electronic databases: Medline (Ovid), Embase (Elsevier), CINAHL (Ebsco), AustHealth (Informit) and Web of Science (Clarivate Analytics). Terms and synonyms relating to research capacity building, allied health, hospital and healthcare service/organisation, model and framework were used. Database searches were conducted on the 19th and 27th June 2017. An example of the search strategy used in Medline is found in Additional file  1 . The search terms were adapted as required to search the other four databases. Reference lists of included articles were additionally reviewed. Where full-text articles were not available, or clarification was required, one of the authors (JM) contacted the study authors to request the relevant information.

Study inclusion and exclusion criteria

The eligibility criteria for this study are described in Table  1 below. As the purpose of this systematic review was to address an identified need for evidence-informed allied health research capacity building approaches in a publicly funded secondary and tertiary healthcare organisation, the inclusion and exclusion criteria have been tightly scoped to reflect this. Only studies published in the English language and between January 2005 and June 2017 were included. These decisions were made in the interest of resourcing feasibility.

Study selection

Search results and additional references were collated into a reference database (Endnote) and any duplicates deleted. All titles and abstracts were independently screened by two authors to identify studies that potentially met the eligibility criteria. Full text copies of these articles were retrieved and independently assessed for eligibility by two authors. Disagreements were resolved by discussion and consensus agreement, and if required, input from a third author.
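The dual-screening rule described above can be expressed compactly in code. The sketch below is illustrative only: the function name, signature and decision order are our own, not the review team's actual tooling.

```python
# A sketch of the dual independent screening rule: agreement between the two
# reviewers stands; disagreement is resolved by discussion and, failing that,
# by a third author.
from typing import Optional

def screening_decision(reviewer_a: bool, reviewer_b: bool,
                       consensus: Optional[bool] = None,
                       third_reviewer: Optional[bool] = None) -> bool:
    """Return the final include/exclude decision for one record."""
    if reviewer_a == reviewer_b:
        return reviewer_a            # independent agreement stands
    if consensus is not None:
        return consensus             # disagreement resolved by discussion
    if third_reviewer is not None:
        return third_reviewer        # escalated to a third author
    raise ValueError("disagreement not yet resolved")

# A record excluded by both reviewers needs no discussion
assert screening_decision(False, False) is False
# A split decision resolved in discussion as 'include'
assert screening_decision(True, False, consensus=True) is True
```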

Data extraction and quality assessment

Data were independently extracted and analysed by two authors, using a data extraction form developed to include information pertaining to study location, participant demographics, purpose, definition of research capacity building, methodology and study design. Disagreements were resolved through discussion and consensus agreement.

The extent to which each study is likely to be influenced by bias was independently evaluated by two authors using the Mixed Methods Appraisal Tool (MMAT). This tool was designed to concomitantly appraise the methodological quality of studies with diverse designs including qualitative, quantitative and mixed methods research [ 39 ]. Two consistent screening criteria are complemented by four methodological criteria for each study design.

A total of 8492 unique records were assessed for eligibility by screening titles and abstracts. Of these, 20 were reviewed in full-text and six were included in the review [ 9 , 29 , 33 , 37 , 40 , 41 ]. Figure  1 illustrates the number of studies which were screened based on title/abstract and full-text, with reasons for exclusion documented.

figure 1

Flow diagram of process to identify eligible studies

A total of one quantitative and five qualitative studies were included. Studies originated in Australia ( n  = 4) and the UK ( n  = 2). All studies defined research capacity as the ability to engage in, perform or carry out quality research. All six studies met the definition of framework rather than model. The studies varied in terms of the composition of their frameworks and in the way that these had been developed, implemented and evaluated. Each framework describes a number of research capacity building approaches. Refer to Table  2 for a description of the included studies.

Risk of bias within studies

All studies had a clear research question or objective and collected relevant data to address it. Studies varied in their methodology and in how comprehensively this was reported. Based on their MMAT scores, all six studies were judged to be of appropriate and comparable quality to be included in a narrative synthesis. Refer to Table  3 below for a descriptive summary of the methodological quality and risk of bias of each study using the MMAT criteria.

Data analysis

Qualitative analysis was used to synthesise findings. Initial steps of the qualitative analysis involved an attempt to directly compare the overarching research capacity building approaches described in each framework. The total number of approaches was 33, ranging from three to eight per framework. Please refer to Table 2 for details of these approaches. However, due to differences in terminology and classification, it was not possible to compare these approaches directly. Due to variations in their purpose, content and theoretical design, no single framework was able to explain all of the approaches included in the others.

Instead, a content analysis method [ 42 ] was used to code and categorise the individual components of each approach (total number = 162), which were defined for the purpose of this review as the discrete strategies and conditions within each approach that were found to be conducive to research engagement and capacity building. These coded components were then grouped according to their frequency and emerging patterns of similarity and consistency in their content, both within and across the frameworks.
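To make the coding step concrete, the following is a minimal sketch of frequency-based grouping of coded components. The component labels and framework identifiers are invented for illustration; the real analysis of the 162 components relied on interpretive judgement, not simple counting.

```python
from collections import Counter, defaultdict

# Hypothetical (framework, coded component) pairs; the review itself
# coded 162 components across six frameworks.
coded = [
    ("A", "mentoring"), ("A", "protected time"), ("B", "mentoring"),
    ("C", "partnerships"), ("C", "mentoring"), ("D", "protected time"),
]

# Frequency of each coded component across all frameworks.
freq = Counter(component for _, component in coded)

# Which frameworks contribute each component: the pattern of consistency
# across frameworks, not just raw frequency, guides the grouping.
sources = defaultdict(set)
for framework, component in coded:
    sources[component].add(framework)

for component, n in freq.most_common():
    print(f"{component}: {n} occurrences in {len(sources[component])} frameworks")
```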

Next, an inductive thematic analysis was undertaken following the phases described by Braun & Clarke [ 43 ]. Phases included searching for underlying patterns of meaning among the coded components and groups of components, generating preliminary themes, reviewing the themes, and naming the themes [ 43 ]. This process was recursive and made use of thematic mind maps to explore relationships between the codes and themes. Each preliminary theme was reviewed to ensure that its included codes formed a coherent pattern. Some themes were consolidated while others were subdivided or reworked to ensure both internal homogeneity and external heterogeneity.

Ultimately, three interconnected and interdependent themes were identified as being essential for building research capacity. These are ‘supporting clinicians in research’, ‘working together’ and ‘valuing research for excellence’. These themes are supported by 17 subthemes. Two authors contributed independently to the analysis and met regularly to challenge each other’s assumptions and cross-check the validity of the preliminary and final themes to help maintain trustworthiness, credibility and accountability of the findings [ 44 ]. All authors agreed on the final themes. Please refer to Table  4 for an overview of the final themes and subthemes and to Additional file  2 for a detailed list of all coded and categorised components which are presented as lists of strategies linked to each subtheme.

Theme 1: supporting clinicians in research

Research capacity is built by supporting allied health professionals to develop research knowledge, skills and confidence. A range of strategies were documented in the literature and have been summarised into the following sub-themes:

relevant education and training for undertaking aspects of the research process such as writing grant and ethics applications;

opportunities to learn and apply skills in practice including assisting with collecting data for research projects, identifying research questions, leading small research projects and participating in journal clubs;

a research friendly workplace which accommodates and values individual clinicians’ research interests, motivations, abilities, time commitments and career paths;

mentoring and coaching from more experienced researchers;

access to resources including library, software, desk and computer use;

protected time and funding including support to apply for external research funding;

a system of reward and recognition through the provision of greater career opportunities, research career pathways and financial incentives;

support to undertake formal post-graduate study including higher degrees by research (HDR);

mix of clinicians with different levels of research skills within each team.

Theme 2: working together

Research capacity building is supported and enhanced when allied health professionals work with others in order to exchange ideas, knowledge, skills and resources and build a ‘critical mass’ of research-active staff. This may be achieved by developing:

strategic collaborations, partnerships, linkages and networks within and between teams, services and organisations including universities and industry;

shared purpose / drivers for research;

coordinated and team-based projects;

opportunities to share research expertise with others in the team and wider networks.

Theme 3: valuing research for excellence

To build research capacity in a healthcare setting, allied health professionals need to feel that their engagement in research is valued as contributing to excellent service delivery. This may be fostered by:

demonstrating visible support of and endorsement of research at the management level, including developing structured processes and systems for research and restructuring clinical roles to include some time for research;

prioritising research as part of a health service’s core business by including research in the service’s vision, mission, strategic plans, key performance indicators and role descriptions;

prioritising research projects which are close to / relevant to practice and in line with strategic priorities,

reporting, disseminating and applying locally developed research findings to inform practice.

The findings of this systematic review have been synthesised to develop a succinct and integrated framework for research capacity building which is relevant for allied health professionals working in publicly funded secondary and tertiary healthcare organisations. Three themes (‘supporting clinicians in research’, ‘working together’ and ‘valuing research for excellence’) and 17 subthemes have been identified. Each subtheme is linked to a number of strategies which may be implemented at individual, team, organisational and policy levels as part of the ‘whole of system’ approach which has been recommended in the literature [12, 33, 36, 45]. Although attempts were made to categorise strategies according to these structural levels, it was subsequently recognised that many strategies are applicable at more than one level. For example, for research to be considered part of core business, it needs to be valued by individual clinicians and by all levels of management across teams and the organisation, and recognised within policy. This new framework consolidates many single-strategy research capacity building initiatives, interventions and programmes described in the literature, and provides further evidence that they are interlinked and interdependent and therefore benefit from being delivered in an integrated way to ensure maximum impact.

Although this review searched for both models and frameworks, only frameworks were found. Frameworks may in fact be better suited to guiding research capacity building precisely because they do not prescribe a linear process for implementing interventions. A number of factors appear to influence the outcomes of research capacity, culture and engagement, and these are useful for guiding the design and evaluation of interventions. However, the way in which interventions are implemented is highly dependent on context, such as the specific strengths, weaknesses, interests, needs and priorities of each individual, team and organisation.

A fundamental concept which was identified across all three themes is the importance of commitment and multi-faceted support from all levels of leadership and management. A research culture has been described as “an environment within an organisation that enables and supports research to generate new knowledge and opportunities to translate evidence into practice” [ 18 ] and has been reported to be essential for building research capacity [ 19 , 33 ]. Previous studies have found that senior management and leadership support for research appears to have a significant impact on an organisation’s research culture [ 7 , 20 , 35 , 36 , 46 ] and individual health professionals’ engagement in research [ 29 , 31 ]. The findings of this review further emphasise that in order to build and sustain research engagement, leaders and managers should recognise the benefits of having research-active practitioners in the workforce and consider research to be part of their core business alongside clinical practice [ 8 , 19 , 27 ]. Another implication is the importance of investing in collaborations with internal and external partners, mentors and colleagues who can support clinicians to undertake research within their existing roles, which is consistent with previous recommendations in the literature [ 22 , 25 , 32 ].

Limitations

As the purpose of this systematic review was to inform a broader research capacity building project being conducted in a large publicly funded secondary and tertiary healthcare organisation, a decision was made to tightly scope the search strategy and eligibility criteria to maximise relevance to our context. A limitation of this decision is that the results may not be transferable to other contexts.

Overall, there is a paucity of published evidence-informed research capacity building models and frameworks which are suitable for allied health. Moreover, the extant literature about research capacity building is poorly indexed using variable search terms. For example, different terms and definitions are used to describe models and frameworks. As a result, it was challenging to construct a search strategy which captured all relevant articles. There is a need for a better taxonomy of terms relating to research capacity building to assist with indexing, searching and identifying relevant articles.

Another limitation is that the term ‘primary care’ is used inconsistently in the literature. Although this term usually refers to settings where clinicians work independently and have first contact with clients, through hand searching of the literature we found three articles which use the term ‘primary care’ but describe a population that meets this study’s criteria for secondary care. It is therefore possible that other studies were missed because they were not captured by the search strategy.

This systematic review developed a succinct and integrated framework for allied health research capacity building. This framework may be used to inform and guide the design and evaluation of research capacity building strategies targeting individuals, teams, organisations and systems. This framework provides structure in terms of specific strategies which can be monitored using process and outcome measures to determine short- and long-term impacts. Future directions for research include using behaviour change and knowledge translation theories to guide the implementation and evaluation of this framework. Another opportunity is to evaluate the transferability of this framework to other healthcare professions and settings.

Abbreviations

MMAT: Mixed Methods Appraisal Tool

References

1. McKeon S, Alexander E, Brodaty H, Ferris B, Frazer I, Little M. Strategic review of health and medical research in Australia – better health through research. Canberra: Commonwealth of Australia; 2013. p. 1–304.

2. Holden L, Pager S, Golenko X, Ware RS. Validation of the research capacity and culture (RCC) tool: measuring RCC at individual, team and organisation levels. Aust J Prim Health. 2012;18(1):62–7.

3. Trostle J. Research capacity building in international health: definitions, evaluations and strategies for success. Soc Sci Med. 1992;35(11):1321–4.

4. Turnbull C, Grimmer-Somers K, Kumar S, May E, Law D, Ashworth E. Allied, scientific and complementary health professionals: a new model for Australian allied health. Aust Health Rev. 2009;33(1):27–37.

5. Lizarondo L, Grimmer-Somers K, Kumar S. A systematic review of the individual determinants of research evidence use in allied health. J Multidiscip Healthc. 2011;4:261–72.

6. Skinner EH, Williams CM, Haines TP. Embedding research culture and productivity in hospital physiotherapy departments: challenges and opportunities. Aust Health Rev. 2015;39(3):312–4.

7. Williams C, Miyazaki K, Borkowski D, McKinstry C, Cotchet M, Haines T. Research capacity and culture of the Victorian public health allied health workforce is influenced by key research support staff and location. Aust Health Rev. 2015;39(3):303–11.

8. Pickstone C, Nancarrow S, Cooke J, Vernon W, Mountain G, Boyce R. Building research capacity in the allied health professions. Evid Policy. 2008;4(1):53–68. https://doi.org/10.1332/174426408783477864.

9. Hulcombe J, Sturgess J, Souvlis T, Fitzgerald C. An approach to building research capacity for health practitioners in a public health environment: an organisational perspective. Aust Health Rev. 2014;38(3):252–8.

10. Pager S, Holden L, Golenko X. Motivators, enablers, and barriers to building allied health research capacity. J Multidiscip Healthc. 2012;5:53–9.

11. Hanney S, Boaz A, Jones T, Soper B. Engagement in research: an innovative three stage review of the benefits for health-care performance. Health Serv Deliv Res. 2013;1(8). https://doi.org/10.3310/hsdr01080.

12. Cooke J. A framework to evaluate research capacity building in health care. BMC Fam Pract. 2005;6:44.

13. Stewart D, Al Hail M, Abdul Rouf PV, El Kassem W, Diack L, Thomas B, Awaisu A. Building hospital pharmacy practice research capacity in Qatar: a cross-sectional survey of hospital pharmacists. Int J Clin Pharm. 2015;37(3):511–21.

14. Ried K, Farmer EA, Weston KM. Bursaries, writing grants and fellowships: a strategy to develop research capacity in primary health care. BMC Fam Pract. 2007;8(1):19.

15. Blevins D, Farmer MS, Edlund C, Sullivan G, Kirchner JE. Collaborative research between clinicians and researchers: a multiple case study of implementation. Implement Sci. 2010;5(1):76.

16. Bornmann L. What is societal impact of research and how can it be assessed? A literature survey. J Am Soc Inf Sci Technol. 2013;64(2):217–33.

17. Misso ML, Ilic D, Haines TP, Hutchinson AM, East CE, Teede HJ. Development, implementation and evaluation of a clinical research engagement and leadership capacity building program in a large Australian health care service. BMC Med Educ. 2016;16(1):13.

18. Alison JA, Zafiropoulos B, Heard R. Key factors influencing allied health research capacity in a large Australian metropolitan health district. J Multidiscip Healthc. 2017;10:277–91.

19. Borkowski D, McKinstry C, Cotchett M, Williams C, Haines T. Research culture in allied health: a systematic review. Aust J Prim Health. 2016;22(4):294–303.

20. Lazzarini PA, Geraghty J, Kinnear EM, Butterworth M, Ward D. Research capacity and culture in podiatry: early observations within Queensland Health. J Foot Ankle Res. 2013;6(1):1.

21. Pighills AC, Plummer D, Harvey D, Pain T. Positioning occupational therapy as a discipline on the research continuum: results of a cross-sectional survey of research experience. Aust Occup Ther J. 2013;60(4):241–51.

22. Harvey D, Plummer D, Nielsen I, Adams R, Pain T. Becoming a clinician researcher in allied health. Aust Health Rev. 2016;40(5):562–9.

23. Hiscock H, Ledgerwood K, Danchin M, Ekinci E, Johnson E, Wilson A. Clinical research potential in Victorian hospitals: the Victorian clinician researcher needs analysis survey. Intern Med J. 2014;44(5):477–82.

24. Atkin H, Jones D, Smith K, Welch A, Dawson P, Hargreaves G. Research and development capacity building in allied health: rhetoric and reality. Int J Ther Rehabil. 2007;14(4):162–6.

25. Harding KE, Stephens D, Taylor NF, Chu E, Wilby A. Development and evaluation of an allied health research training scheme. J Allied Health. 2010;39(4):142–8.

26. Cotter JJ, Welleford EA, Vesley-Massey K, Thurston MO. Town and gown: collaborative community-based research and innovation. Fam Community Health. 2003;26(4):329–37.

27. Marshall AP, Roberts S, Baker MJ, Keijzers G, Young J, Stapelberg NC, Crilly J. Survey of research activity among multidisciplinary health professionals. Aust Health Rev. 2016;40(6):667–73.

28. Naidoo S, Bowen C, Arden N, Redmond A. Training the next generation of clinical researchers: evaluation of a graduate podiatrist research internship in rheumatology. J Foot Ankle Res. 2013;6(1):15.

29. Cooke J, Nancarrow S, Dyas J, Williams M. An evaluation of the ‘designated research team’ approach to building research capacity in primary care. BMC Fam Pract. 2008;9:37.

30. Wenke RJ, Mickan S, Bisset L. A cross sectional observational study of research activity of allied health teams: is there a link with self-reported success, motivators and barriers to undertaking research? BMC Health Serv Res. 2017;17(1):114.

31. Perry L, Grange A, Heyman B, Noble P. Stakeholders’ perceptions of a research capacity development project for nurses, midwives and allied health professionals. J Nurs Manag. 2008;16(3):315–26.

32. Joubert L, Hocking A. Academic practitioner partnerships: a model for collaborative practice research in social work. Aust Soc Work. 2015;68(3):352–63.

33. Golenko X, Pager S, Holden L. A thematic analysis of the role of the organisation in building allied health research capacity: a senior managers’ perspective. BMC Health Serv Res. 2012;12(1):276.

34. Lavis JN. Research, public policymaking, and knowledge-translation processes: Canadian efforts to build bridges. J Contin Educ Health Prof. 2006;26(1):37–45.

35. Williams CM, Lazzarini PA. The research capacity and culture of Australian podiatrists. J Foot Ankle Res. 2015;8(1):11.

36. Slade SC, Philip K, Morris ME. Frameworks for embedding a research culture in allied health practice: a rapid review. Health Res Policy Syst. 2018;16(1):29.

37. Holden L, Pager S, Golenko X, Ware RS. Evaluating a team-based approach to research capacity building using a matched-pairs study design. BMC Fam Pract. 2012;13(1):16.

38. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10(1):53.

39. Souto RQ, Khanassov V, Hong QN, Bush PL, Vedel I, Pluye P. Systematic mixed studies reviews: updating results on the reliability and efficiency of the mixed methods appraisal tool. Int J Nurs Stud. 2015;52(1):500–1.

40. Bamberg J, Perlesz A, McKenzie P, Read S. Utilising implementation science in building research and evaluation capacity in community health. Aust J Prim Health. 2010;16(4):276–83.

41. Whitworth A, Haining S, Stringer H. Enhancing research capacity across healthcare and higher education sectors: development and evaluation of an integrated model. BMC Health Serv Res. 2012;12(1):287.

42. Elo S, Kyngäs H. The qualitative content analysis process. J Adv Nurs. 2008;62(1):107–15.

43. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101.

44. Finlay L. Negotiating the swamp: the opportunity and challenge of reflexivity in research practice. Qual Res. 2002;2(2):209–30.

45. Farmer E, Weston K. A conceptual model for capacity building in Australian primary health care research. Aust Fam Physician. 2002;31(12):1139.

46. Pain T, Plummer D, Pighills A, Harvey D. Comparison of research experience and support needs of rural versus regional allied health professionals. Aust J Rural Health. 2015;23(5):277–85.

47. Condell S, Begley C. Capacity building: a concept analysis of the term applied to research. Int J Nurs Pract. 2007;13(5):268–75.


Acknowledgements

Sarah Thorning, Senior Librarian, Gold Coast Health.

Funding

This project did not receive any (external competitive) funding. In-kind funding was provided by the Gold Coast Hospital and Health Service to pay for the project officer secondment of a permanently employed clinician.

Availability of data and materials

Search strategy and a detailed list of all coded components mapped against the subthemes are included in the additional files. All other pertinent data is included in the final manuscript. Further data regarding excluded studies is available on request from the authors.

Author information

Authors and Affiliations

Allied Health, Gold Coast Health, Gold Coast, Queensland, Australia

Janine Matus, Ashlea Walker & Sharon Mickan

School of Allied Health Sciences, Griffith University, Gold Coast, Queensland, Australia

Sharon Mickan


Contributions

AW and JM completed the abstract screening. JM, AW and SM completed the full-text screening. JM and SM completed the data extraction. JM and SM led the writing of the introduction, methods, results and discussion in the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Janine Matus .

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

Search strategy. (DOC 25 kb)

Additional file 2:

List of coded components mapped against themes and subthemes. (DOCX 34 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated.


About this article

Cite this article

Matus, J., Walker, A. & Mickan, S. Research capacity building frameworks for allied health professionals – a systematic review. BMC Health Serv Res 18, 716 (2018). https://doi.org/10.1186/s12913-018-3518-7

Download citation

Received : 11 April 2018

Accepted : 30 August 2018

Published : 15 September 2018

DOI : https://doi.org/10.1186/s12913-018-3518-7


Keywords

  • Research capacity building
  • Research culture
  • Research activity
  • Allied health


Development of research capacity of a future social pedagogue in the face of digital technologies

  • Published: 29 January 2022
  • Volume 27, pages 6947–6966 (2022)

Cite this article

  • Rysbek Maussumbayev 1 ,
  • Rymshash Toleubekova 1 ,
  • Karas Kaziyev 2 ,
  • Axaule Baibaktina 3 &
  • Altynshash Bekbauova 3  

2348 Accesses

3 Citations


The aim of this article is to provide theoretical and practical justification for the development of a future social pedagogue’s research capacity in the context of digital technologies. Practical implementation of a model for building social pedagogues’ research capacity within the framework of an online educational course can provide effective training of pedagogues in a digitalized educational environment and ensure a high level of pedagogical excellence. The model was tested on bachelor students majoring in Social Pedagogy and Self-Knowledge and taking an online Scientific Research Organization and Planning course at Pavlodar Pedagogical University (Kazakhstan). The course was held from September to December 2020 on the Moodle learning platform, one of the most popular platforms for e-learning. The communication part of the course was supported by the social networking features of Facebook. In total, the educational experiment encompassed 52 students. A thorough examination of their results supported the conclusion that research capacity yields strong results in creative, innovative, communication, leadership, pedagogical, and digital activities, and thus remains an important component of pedagogical excellence. It can therefore be stated that the integration of digital tools and platforms, along with the creation of multimedia learning content and the introduction of research methods and structured research practice into students’ education, allows for an effective research learning practice. The results of the students’ end-of-course survey identified factors that had a significant influence on the effectiveness of research activities in online learning, namely the quality of the digital platform, structured research practice, active learning opportunities, pedagogical excellence, effective communication, and personalization.


1 Introduction

The education of the 21st century is witnessing a gradual shift of learning activities from a traditional teacher-centered approach to a more constructivist one oriented toward students. Under such conditions, knowledge is actively constructed through meaningful learning experiences gained in self-directed and personalized learning practices enabled by digital technologies and pedagogical innovations (Sailin & Mahmor, 2018 ). Modern digital technologies open up excellent opportunities for ensuring constructive social interaction (Greenhow & Askari, 2017 ), improving collective learning, offering more learning flexibility and personalization, and making it more student-centered (Zidoun et al., 2019 ). Correspondingly, their application may become a useful aid for individuals designing the educational process of the 21st century.

In this day and age, students’ learning and digital pedagogy are inextricably linked and driven by such characteristics as efficiency, quality, intensity, personalization, and adaptation. The purpose of the digital transformation of education, and of digital pedagogy in particular, is to create non-standard algorithms for solving traditional pedagogical tasks as well as forming and developing an innovative learning process based on digital intelligence, big data, and distributed computing. These innovations effectively contribute to the development of individual learning trajectories, the implementation of adaptive learning systems and algorithms, the maintenance of digital records of students’ progress, and the development of dedicated systems that assess and control students’ progress (with detailing and specialization at different levels) and measure the degree of development of students’ necessary competencies (Toktarova & Semenova, 2020). Modern information and communication technologies have enormous potential. Two-way communication allows each learning process participant to actively work, interact, and cooperate with others (Velichová et al., 2020). The integration of digital technologies into the learning process can support active and meaningful learning by presenting opportunities for authentic learning through exploratory experimentation or experiential learning (Sailin & Mahmor, 2017).

In recent times, research capabilities have attracted keen interest among academics and practitioners. At its core, research capability refers to an ability to conduct good-quality research in a professional field (Caingcoy, 2020). Research is an important tool for national and global progress (Tamban & Maningas, 2020), and the level of education of the people involved in such intellectual work, as well as their skills and general qualifications, plays a critical role here. The public knowledge involved in technological, information, and communication systems forms the national intellectual capacity, conditionally divided into realized and unrealized. As time passes, the realized capacity is converted into intellectual capital through a transfer to an intellectual product created by professionals and is recognized as part of intellectual property. The capacity that exists in an unrealized form is the body of knowledge and skills of professionals not involved in the production process. The development of this capacity is influenced by the educational system (Miethlich et al., 2020). Developments in science and technology are posing new socio-economic challenges that are becoming increasingly complex and diverse. Consequently, consciously or unconsciously, everyone is required to be more creative in dealing with the various rapidly evolving problems of life. Higher education should foster the comprehensive development of scientific knowledge, skills, and creativity of students during learning so as to develop the ability to critically analyze real-world problems and find possible creative and innovative solutions (Zainuddin et al., 2020).

Research is widely recognized as a key element of professional training for both future social pedagogues and their instructors. Yet, despite its importance, research in pedagogical education is often criticized and challenged (Murray & Vanassche, 2019). Pedagogical training usually represents an articulated discourse about the profession of an educator with a relatively weak and adapted research base. However, training cannot be regarded as research-based merely because future social pedagogues write an evidence-based thesis. Instead, their whole training needs to be developed within the concept of research-based education, taking into account the course content, its research base, and the current learning models. This approach will allow future pedagogues to reflect on and be thoughtful about the demands of an ever-changing environment and adapt easily to innovative shifts in the educational system (Alvunger & Wahlström, 2018).

An essential part of pedagogical excellence is research capacity, which is the ability to successfully achieve the goals of research activities in the learning process. Such capacity can be developed through the accumulation of research experience (Manongsong & Panopio, 2018), which requires applying acquired knowledge and skills to produce high-quality results. Research capacity-building actions should be seen as a directed form of professional development (or learning) for pedagogues as independent practitioners and professional community members (Murray & Vanassche, 2019). Increasing the research capacity of educators implies a change in their professional profiles, including a change in the working repertoire of knowledge that enables a person to perform prior professional tasks (Griffioen, 2020).

Improving the capacity of educators to meet the demands of educational innovation is both a goal and a motivation and is referred to as one of the most crucial success factors in the renewal of the educational system. Therefore, it is necessary to focus on educating social pedagogues in the following areas: professional ethics; learning capacity; program development capacity; research capacity; social engagement capacity; ability to provide education, science, and technology transfer services; and capacity for international cooperation in higher education, with an emphasis on the ability to elaborate teacher training and general education programs directed at the development of students’ integrated pedagogical competence while combining theory and practice (Phunga et al., 2020).

This study aimed to theoretically and practically justify the need for the development of the research capacity of a future social pedagogue in the context of digitalization.

To achieve this goal, the following objectives were set:

determine the theoretical foundations for the development of the research capacity of a future social pedagogue in the context of e-learning;

work out a model for developing the research capacity of a future social pedagogue within the structure of an online course;

test the model for developing the research capacity of a social pedagogue in an online environment;

define the factors influencing the effectiveness of research activities of future social pedagogues in the process of e-learning.

2 Materials and methods

2.1 Theoretical framework

2.1.1 The structure of the social pedagogue’s capacity

The nature of knowing and learning is the ground for understanding how learning communities behave and create new knowledge. To date, numerous theoretical foundations of knowledge construction and research capacity building have been created, including social learning, problem-based learning, situated learning, and knowledge management (Robinson et al., 2020 ). By and large, learning is a collaborative effort directed toward solving problems related to the actual design, assigning independent repetitive tasks, and creating a social environment by means of group interaction under the constructivist approach. Constructivism, in turn, is a student-oriented way of education that underlines that students construct their knowledge by linking new information with the existing one. According to constructivist learning theory, learning is defined as the creation of relationships between new knowledge and prior experience. Students meet their learning needs, achieve their learning goals, solve problems under the guidance of an educator using different resources and tools, support each other as a group (Korucu & Atun, 2017 ), and hence, accumulate personal pedagogical capacity.

The problem of teacher education and training primarily concerns the relationship between professional and pedagogical competence. One of the important elements on which educators build their professional identity is their pedagogical role in the learning process. The teacher’s role is never unambiguously defined but is influenced by many internal and external factors. Internal factors shape the social pedagogue’s perception of themselves as a participant in the learning process. External factors, in parallel, incorporate attitudes and expectations about the pedagogue’s role that arise from students, parents, colleagues, academic administration, and the public. Both types of factors carry weight in the teacher’s professional identity. The internal factors that influence the understanding of the social pedagogue’s role are created by pedagogues themselves and can be divided into two categories: the educator’s beliefs about what role is important and the educator’s expectations about their role (Makovec, 2018). In addition to the aforementioned factors, there are also individual psychological characteristics such as research orientation or preference for research, extrinsic and intrinsic motivation, research confidence or self-efficacy, and a desire for achievement and recognition, which have a substantial impact on teaching quality (Heng et al., 2020).

Pedagogical capacity is a type of competence summarizing the elements which have an integral relationship with each other, including knowledge, skills, personal qualities, methods, manners, and qualities of a pedagogue, and constituting internal capabilities and conditions which ensure a high level of pedagogical work (Phunga et al., 2020 ). The category ‘capacity’ should be considered as a set of tools, foundations, and sources used for certain purposes. It is related to the level of the ability to perform a certain action or function. In humanities and social pedagogy, the term ‘capacity’ typically refers to human resources, reserves, or capabilities. Figure  1 presents the structure of the pedagogical capacity of a future social pedagogue.

Figure 1. The structure of a social pedagogue’s capacity (Source: developed by the author).

The complex capacity of a social pedagogue consists of seven interrelated elements that determine the pedagogical excellence of an educator, including innovative, intellectual, creative, communicative, emotional-moral, research, and professional-pedagogical capacity. The elements of pedagogical capacity are formed and developed in synergy and interdependence.

2.1.2 The role of digital technologies in the system of development of professional competencies of future social pedagogues

The modern stage of socio-economic development of the world is characterized by the movement toward a digital society through the development of the digital economy, which implies the transformation of the entire system of production and service provision with the help of information and communication technologies (Toktarova & Semenova, 2020 ). Digital technologies have changed the content of many common concepts, such as government, commerce, democracy, and learning, thereby providing the public with new concepts created through the prefix ‘e’ (Sivalingam et al., 2018 ). Special attention should also be paid to the problems of developing digital literacy in technical, vocational, and higher education (Narikbaeva & Savenkov, 2016 ).

At the same time, the increasing penetration of internet services, and especially social media, has dramatically improved the digital literacy of people. Cloud computing, the Internet of Things, virtual reality, interactive touch screens, 3D printing, e-learning solutions, open-source mobile applications, and high-speed communications have made digital technologies interactive and user-friendly (Ahmad et al., 2020). As a result, in the 21st century, digital technology has become a major factor in economic development. In particular, this stimulated the Republic of Kazakhstan to introduce the Digital Kazakhstan program on 12 December 2017, the central aims of which were to accelerate the development of the Republic’s economy, improve the quality of life of the population in the short and long terms, and set the economy of Kazakhstan on a trajectory of further digital development.

The digital transformation of education systems at all levels has allowed incorporating a new teaching-learning ecosystem called e-learning. The concept of e-learning is a technology-mediated learning approach of great potential from the educational perspective (Valverde-Berrocoso et al., 2020 ). It is widely used by educational institutions to support the learning process and provide students with access to educational materials at any time (Selviandro & Hasibuan, 2013 ). The integration of technologies into the learning process has led to the need to disseminate digital learning methods in teacher training environments. E-learning uses different technological tools to access and provide information over the internet, making teaching and learning more effective by improving the quality of interaction and helping educators and students achieve educational goals. As of today, there is a wide range of e-learning tools providing students with the opportunity to create their digital environment and learn according to their personal needs (Gupta, 2019 ).

The development of e-learning has led to the emergence of innovative features aimed at helping educators and students in learning. The cloud-based online services segment has become extremely appealing for various educational institutions due to such useful features as geographical distribution, cost-effectiveness of automated systems, and associated open-source software (Hussein & Hilmi, 2020 ). The educator in e-learning becomes a leader of collective knowledge creation supported by a learning portal based on cloud computing and artificial intelligence. This means that the educator’s position is to lead a forum for collaborative learning and sharing, using the cloud as a resource provider (Hendradi et al., 2019 ). By strengthening the connection between students and educators, cloud-based education services make it possible to create live and interactive learning platforms, work in groups, and implement more effective and innovative teaching methods (Naveed & Ahmad, 2019 ). At present, cloud computing technologies are mainly represented by money-saving services allowing high-quality training. Besides, knowledge management systems that rely on technological infrastructure are viable for continuous knowledge accumulation, processing, and transfer (Liu et al., 2020 ).

Currently, the main objective of pedagogical education is to train techno-educators who are capable of developing and implementing digital pedagogy. Educators should be able to integrate technology into instruction by understanding their role in technology-oriented classrooms and developing skills to use web-based technologies in practice (Dangwal & Srivastava, 2016). Therefore, it is predicted that a new generation of pedagogical methodologies and teaching and learning approaches will focus predominantly on the individuality of students, and social media will be a central means of the personalization process. Integrating social media into the institutional structures of education is expected to create individualized and easily adaptable academic pathways toward meeting the needs of each student. Maintaining a clear structure of curriculum-based learning goals and combining them with new forms of digital control over the educational process and learning experience will improve the quality and fruitfulness of education and, at the same time, positively influence students’ motivation (Cunha et al., 2020).

2.2 Research design and sampling

Given the relevance of research activities in the process of training future social pedagogues, a model for building social pedagogues’ research capacity within the framework of an online educational course was developed (Fig.  2 ).

Figure 2. A model for building research capacity of a future social pedagogue within the framework of an online course.

Organizing high-quality and effective research activities for students in an e-learning format presupposed four steps:

integration of digital tools and platforms into the structure of learning activities;

development of multimedia learning content;

development of a structured research practice;

introduction of research methods into the student’s educational activities.

In turn, students’ research activities in an online environment were carried out in the following sequence of steps:

setting personal goals and objectives of the research practice;

studying the theoretical educational material;

gradual implementation of the research part of the learning activity with clear guidance and instruction;

development of research projects and their testing in the online environment;

The development of future social pedagogues’ research capacity takes place in parallel with the acquisition of research experience, which is formed by high-quality results in creative, innovation, communication, leadership, pedagogical, and digital activities. Practical implementation of the model for building social pedagogues’ research capacity within an online educational course is believed to ensure high-level preparation of future pedagogues in the digitalized educational environment and contribute to the development of their pedagogical excellence.

The model was tested with bachelor students majoring in Social Pedagogy and Self-Knowledge and taking the Scientific Research Organization and Planning course at Pavlodar Pedagogical University (Kazakhstan). Overall, the experimental training enrolled 52 individuals, all of whom consented to the processing of their personal data and the publication of survey results. The structure of the study group is presented in Table 1.

The Scientific Research Organization and Planning course took place from September to December 2020 on Moodle, one of the most popular platforms for e-learning. The communication part of the course was carried out by means of the social networking features of Facebook. In order to identify the factors that exert the most significant influence on the effectiveness of research activities of future social pedagogues in e-learning, it was decided to end the course with a joint online discussion.

Participant surveys were developed and administered using the Survio online platform (2021). This choice can be explained by the fact that the platform provides an opportunity to create questionnaires and share them with other people via Facebook. Survey forms were created using available templates from the Survio database.

2.3 Research limitations

Since the educational experiment was carried out within the Scientific Research Organization and Planning course, the thematic purpose of which was to instill research skills in future social pedagogues, the proposed model was limited to the course concept. In view of this, the effectiveness of the model for developing students’ research capacity in online learning should be further explored within the framework of a comprehensive teacher development program.

3 Results

At the end of the educational course, students were asked to complete a specially prepared questionnaire to assess their individual performance in research activity and the factors that affect the quality of a course incorporating a comprehensive model for building research capacity in online settings (Table 2).

Through an online discussion on Facebook and student feedback on the e-learning experience, the course moderators identified the factors that influence the effectiveness of research activities in the digital environment. It was found that the most important determinants in developing future social pedagogues’ research capacity and acquiring pedagogical excellence in an e-learning environment are the quality of the digital platform, structured research practice, active learning opportunities, pedagogical excellence, effective communication, and personalization (Fig. 3).

Figure 3. Factors influencing the effectiveness of research activities of future social pedagogues in the e-learning process.

The quality of a digital platform is determined by the ease of its use, the effectiveness of educational content delivery, the permanence and availability of support, and the flexibility to connect students to mixed/synchronous modalities. Following the experiment results, most students (82%) confirmed that the digital learning interactions on Moodle increased their engagement, motivation, and persistence toward their learning goals, and 87% reported an influence on the development and retention of their digital literacy skills. The ease of use of the online platform and the set of proposed digital tools was shown to positively affect respondents’ perception of learning activities (73%) and their level of intrinsic satisfaction and learning motivation (85%). Moreover, it was revealed that learning activities in a digital learning environment allowed students to reduce cognitive load and focus on real learning tasks, which was confirmed by 82% of the surveyed.

Structured research practice is defined by the step-by-step implementation of learning activities with clear guidance and instruction embedded in the learning content (video, audio, textual instructions, etc.). Professional pedagogical education should have direct linkages to the everyday practice of teaching, a fundamental element of which is research. In this respect, 85% of students indicated that clear models of research concepts as well as strategies and ideas presented in the form of media guidelines and instructions allow for a comprehensive understanding of research work in the practice of the modern social pedagogue.

Active learning is based on the mechanisms of active interaction with content and student-teacher cooperation. It opens broad opportunities for demonstrating key research competencies in the student’s practice. Furthermore, as evidenced by 94% of respondents, active interaction strategies increase research persistence and productivity. The Scientific Research Organization and Planning course provided students with opportunities to apply theoretical knowledge in practice, making the learning process more meaningful. In addition, students had the possibility to practice teamwork skills as researchers and organizers.

Pedagogical excellence is determined by the capacity for sustained research and educational activities under the curriculum, work with feedback, and objective assessment of results. Thus, for learning to be productive, it should be grounded in the pursuit of professional excellence through research practice, fair assessment, and the provision of comments. For 79% of students, opportunities for conscious practice or individual training outlined personal vectors for pedagogical excellence. For 73% of them, the instructors’ innovative and creative approach had a significant influence on the uptake of pedagogical skills and engaged them in the search for effective tools for learning interaction. Finally, 89% of respondents indicated that quality and prompt feedback, which supports student reflection and provides an objective assessment of their mastery of pedagogical skills, is an important characteristic of productive learning.

Effective communication is crucial for the achievement of quality research outcomes. It is the connection with an expert/educator, collaboration with colleagues, and digital opportunities that sustain social presence. Notably, the investigation showed that professional education in the field of social pedagogy becomes even more successful in a digital environment. Online conditions that promote peer learning and interaction are associated with higher levels of student comfort (95%), learning flexibility (79%), and connectivity (98%). Further, 93% of students confirmed that instructor-led learning is an essential and effective e-learning component.

As for the personalization in a learning practice, it allows students to activate their experience and knowledge, implement research activities according to personally set goals and objectives, and rely on individualized support from educators. The present study unveiled that personalization through e-learning platforms offers substantial opportunities for professional development consistent with the needs and preferences of a contemporary student (91%). In addition, personalization incorporated into the concept of an educational course was confirmed to focus on students’ personal aims, allowing for a personalized pedagogical development and research skills improvement plan (89%).

Table  3 below explicates the results of students’ research activity progress self-assessment that was proposed for them to perform after the experiment ended. For this, respondents were asked to rank their digital, creative, innovation, communication, leadership, and pedagogical experiences.

4 Discussion

Research capacity developed through the accumulation of research experience is of great value for pedagogical excellence. As evidenced by the present study, investigation-directed activities incorporated into the educational course have a significant effect on the integrated capacity of a social pedagogue, which consists of seven interrelated elements: innovative, intellectual, creative, communicative, emotional-moral, research, and professional-pedagogical. All of these elements form and develop in synergy and interrelation. Indeed, the research capacity building of an educator occurs in parallel with the acquisition of research experience, which in turn acts upon creative, innovative, communication, leadership, pedagogical, and digital activities.

There are many approaches to pedagogical education, one of which is based on the notion that the knowledge base of the curriculum is dynamic and that student-teachers are active knowledge processors. Research-based thinking is seen as a linking factor in this process. Identifying pedagogical elements and asking pedagogically relevant questions in learning situations are only a few of the paramount skills a future practitioner needs. Key factors determining research-based pedagogical education incorporate the following points. First of all, the curriculum is structured according to the systemic educational structure. Secondly, all learning and teaching are research-based. Thirdly, activities are organized in such a way as to give students the opportunity to practice argumentation, decision making, and justification when inquiring into and solving pedagogical problems. Lastly, students learn different research skills during their studies (Toom et al., 2008).

Concurrently, research-based learning is a multi-faceted concept that tightly links research and teaching. Accordingly, a research-based teaching practice should ensure that the obtained research findings influence the curriculum, teaching and learning methods are grounded on the research, and educators benefit from research elements while teaching (Yulhendri et al., 2018 ). Research-based education deals with analysis, synthesis, and evaluation activities and enables students and educators to improve the absorption and application of knowledge. More and more scholars indicate that research is an essential means of enhancing learning quality, and research-based learning is conducted under constructivism covering four aspects: learning which constructs students’ understanding, learning through developing prior knowledge, learning which involves social interaction process, and meaningful learning, which is achieved through real-world experience (Susiani et al., 2017 ).

Current generations of students have grown up with information technology, and this influences learning strategies dramatically as technology is more and more often perceived as a fundamental element of academic success. Students note that technologies such as smartphones, social media, and educational platforms are critical social pedagogical tools for solving research problems (Ma & Au, 2014 ).

Pirozhkova (2021) argues that research-based learning is the process of acquiring knowledge through analysis, experiments, and data interpretation using scientific analysis methods. The results of her study on the impact of a research-oriented educational approach at the Ural State University of Economics confirm that the integration of research into teaching various disciplines can bring the following advantages:

for students—development of critical professional and general competencies and a better understanding of science and professional subjects, motivating them for successful learning;

for faculty—better student performance and involvement in classroom work;

for the university—publication of students’ works in journals and conference proceedings (Pirozhkova, 2021).

A research-based approach to teaching may be more successful than the traditional approach when it comes to linking theory and practice using real-life case studies and contemporary social issues. The development of students’ research capacity occurs in the process of active educational activity, together with the mastery of reflective knowledge and critical thinking, enabling students to build their personal vision of research problems. Research-based teaching should address several aspects such as research design, data collection, practical research, and interpretation of results. It should focus on real-world challenges to put students in a situation where they not only apply scientific methods but also increase their commitment and participation in the process (Espinoza-Figueroa et al., 2021).

Affiliation of students with a project research team can increase student motivation to participate in online learning activities by providing frequent, meaningful, and motivating interactions to achieve shared results in a virtual space. A study conducted among students of a Danish problem-based learning university during the adaptation to the educational conditions caused by COVID-19 argues that effective online work in project research teams is a skill to be practiced during the learning process. Students described group work on digital research projects as the most positive aspect of interactive learning. According to the researchers, universities should inspire students to use social collaboration tools such as MS Teams and Zoom because they provide good platforms for effective collaboration in small groups, helping to counter isolation and increase motivation to interact with online material through mutual accountability (Haslam et al., 2021).

It is widely recognized that educators represent one of the primary levers for improving an educational institution’s research capacity (Shehzad et al., 2014), while university policies, practices, and resources significantly affect researchers’ productivity (Huenneke et al., 2017). Since learning plays a major role in skill development, teacher education curricula should cover the latest topics to meet the requirements of the new generation of educators. Correspondingly, this requires developing new guidelines that incorporate innovative approaches such as blended learning, flipped learning, and e-learning. Although many of today’s teacher education institutions lack the necessary infrastructure and technical support, management should not give up on finding ways to encourage the integration of new technological tools into the curriculum to meet the demands of 21st-century students (Gupta, 2019). At the same time, incorporating digital pedagogy into teacher education programs should be done after ensuring that technological solutions are adequate to the new approaches toward learning and instill a positive attitude toward their adoption. Such a move would help future social pedagogues learn innovative teaching strategies and thus build their confidence in digital pedagogy as part of their future pedagogical practice (Sailin & Mahmor, 2017). Effective implementation of digital pedagogy depends on students’ acceptance of e-learning tools, which is in turn influenced by technology (ease of use, speed, availability, and service delivery), organization (learning support), environment (user attitudes), and impact-related factors (learning experience, skill development, performance, degree of engagement) (Eze et al., 2020). The dominant factor that requires attention in understanding the effectiveness of e-learning implementation in higher education is the organizational aspect, which manifests itself in creating a working culture and establishing policies mandatory for the academic community when conducting e-learning (Priatna et al., 2020). These factors are also worth considering when designing programs for the integration of technological tools and platforms into the research capacity building strategy of social pedagogues.

The results of this study confirmed that high-quality and productive research activity by future social pedagogues in an e-learning format is achieved through the integration of digital tools and platforms into learning, the development of multimedia learning content, the implementation of structured research practices, and the provision of active learning opportunities. Together, these elements define the steps of building the research capacity of future social pedagogues in the online environment.

5 Conclusions

Implementing the proposed model for social pedagogues’ research capacity building within the framework of an e-learning course can ensure effective preparation of future teachers in a digital setting and result in a high level of pedagogical excellence. Most of the students confirmed that the digital opportunities for interaction provided by Moodle e-learning increased their engagement, motivation, and persistence in achieving educational goals, and supported the development and consolidation of digital and media literacy skills. The ease of using the online platform and digital tools had a positive influence on students’ perception of learning, their intrinsic satisfaction, and their learning motivation. By reducing cognitive load, learning activities in a digital environment allowed students to focus on the actual learning tasks. Clear research concepts, strategies, and ideas, presented in the form of media guidelines and instructions, allowed students to form a comprehensive understanding of research work in the practice of a modern social pedagogue. Active interaction strategies fostered research persistence and productivity. The instructors’ innovative and creative approach had a favorable effect on the uptake of pedagogical skills and engaged participants in the search for effective tools for learning interactions.

Students’ assessment of the factors affecting the quality of the comprehensive research capacity building model supports the inference that the most influential of them are learning with modern digital tools and platforms (75%), e-learning platform quality (83%), clear guidance and instructions on achieving research goals (89%), social presence and collaboration (89%), active learning (74%), effective communication (91%), and personalization (96%).
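For readability, here is a minimal Python sketch that tabulates and ranks the factor percentages reported above. The numbers come from the survey results in the text; the short factor labels are abbreviations of the original wording.

```python
# Ranking the reported factors by the share of students rating them
# as influential. Percentages are those given in the text above.
factors = {
    "personalization": 96,
    "effective communication": 91,
    "clear guidance and instructions": 89,
    "social presence and collaboration": 89,
    "e-learning platform quality": 83,
    "modern digital tools and platforms": 75,
    "active learning": 74,
}

for name, share in sorted(factors.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{share:3d}%  {name}")
```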

In general, students’ perception of the learning practices based on the integration of digital tools and research approaches was positive. At the end of the educational experiment, most respondents confirmed the educational value of the course. This implies that the proposed model for research capacity building in a digital educational environment is worth implementing within teacher education. The prospects for future research lie in studying the effectiveness of the proposed model within comprehensive programs aimed at preparing future pedagogues.

Availability of data and material

Data will be available on request.

Code availability

Not applicable.

Ahmad, N., Hoda, N., & Alahmari, F. (2020). Developing a cloud-based mobile learning adoption model to promote sustainable education. Sustainability , 12, 3126. https://doi.org/10.3390/su12083126


Alvunger, D., & Wahlström, N. (2018). Research-based teacher education? Exploring the meaning potentials of Swedish teacher education. Teachers and Teaching , 24(4), 332–349. https://doi.org/10.1080/13540602.2017.1403315

Caingcoy, M. (2020). Research capability of teachers: Its correlates, determinants and implications for continuing professional development. Journal of World Englishes and Educational Practices , 2(5), 1–11. https://doi.org/10.2139/ssrn.3631867

Cunha, M. N., Chuchu, T., & Maziriri, E. T. (2020). Threats, challenges, and opportunities for open universities and massive online open courses in the digital revolution. International Journal of Emerging Technologies in Learning , 15(12), 191–204. https://doi.org/10.3991/ijet.v15i12.13435

Dangwal, K., & Srivastava, S. (2016). Digital pedagogy in teacher education. International Journal of Information Science and Computing , 3(2), 67–72. https://doi.org/10.5958/2454-9533.2016.00008.9

Espinoza-Figueroa, F., Vanneste, D., Alvarado-Vanegas, B., Farfán-Pacheco, K., & Rodriguez-Giron, S. (2021). Research-based learning (RBL): Added-value in tourism education. Journal of Hospitality, Leisure, Sport & Tourism Education , 28, 100312. https://doi.org/10.1016/j.jhlste.2021.100312

Eze, S., Chinedu-Eze, V., Okike, C., & Bello, A. (2020). Factors influencing the use of e-learning facilities by students in a private Higher Education Institution (HEI) in a developing economy. Humanities and Social Sciences Communications , 7, 133. https://doi.org/10.1057/s41599-020-00624-6

Greenhow, C., & Askari, E. (2017). Learning and teaching with social network sites: A decade of research in K-12 related education. Education and Information Technologies , 22(2), 623–645. https://doi.org/10.1007/s10639-015-9446-9

Griffioen, D. (2020). Building research capacity in new universities during times of academic drift: lecturers professional profiles. Higher Education Policy , 33(2), 347–366. https://doi.org/10.1057/s41307-018-0091-y

Gupta, D. (2019). Capacity building of teacher educators for e-learning tools: An experimental study. Indian Journal of Educational Technology , 1(2), 1–13. https://doi.org/10.13140/RG.2.2.26525.87524

Haslam, C. R., Madsen, S., & Nielsen, J. A. (2021). Problem based learning during the COVID 19 pandemic. Can project groups save the day? Communications of the Association for Information Systems , 48, 161–168. https://doi.org/10.17705/1CAIS.04821

Hendradi, P., Khanapi, M., & Mahfuzah, S. M. (2019). Cloud computing-based e-learning system architecture in education 4.0. Journal of Physics: Conference Series , 1196 (1), 012038. https://doi.org/10.1088/1742-6596/1196/1/012038

Heng, K., Hamid, M. O., & Khan, A. (2020). Factors influencing academics’ research engagement and productivity: A developing countries perspective. Issues in Educational Research , 30(3), 965–987


Huenneke, L. F., Stearns, D. M., Martinez, J. D., & Laurila, K. (2017). Key strategies for building research capacity of university faculty members. Innovative Higher Education , 42(5), 421–435. https://doi.org/10.1007/s10755-017-9394-y

Hussein, L. A., & Hilmi, M. F. (2020). Cloud computing based e-learning in Malaysian universities. International Journal of Emerging Technologies in Learning , 15(8), 4–21. https://doi.org/10.3991/ijet.v15i08.11798

Korucu, A. T., & Atun, H. (2017). Use of social media in online learning. In Handbook of Research on Innovative Pedagogies and Technologies for Online Learning in Higher Education (pp. 1–18). IGI Global. https://doi.org/10.4018/978-1-5225-1851-8.ch001

Liu, Z. Q., Dorozhkin, E., Davydova, N., & Sadovnikova, N. (2020). Effectiveness of the partial implementation of a cloud-based knowledge management system. International Journal of Emerging Technologies in Learning , 15(13), 155–171. https://doi.org/10.3991/ijet.v15i13.14919

Ma, C., & Au, N. (2014). Social media and learning enhancement among Chinese hospitality and tourism students: A case study on the utilization of Tencent QQ. Journal of Teaching in Travel & Tourism , 14(3), 217–239. https://doi.org/10.1080/15313220.2014.932483

Makovec, D. (2018). The teacher’s role and professional development. International Journal of Cognitive Research in Science, Engineering and Education , 6(2), 33. https://doi.org/10.5937/ijcrsee1802033M

Manongsong, M. J. G., & Panopio, E. (2018). Dentistry faculty members’ research competencies and attitude towards research engagement. Asia Pacific Journal of Education, Arts and Sciences , 5(3), 13–19

Miethlich, B., Kvitka, S., Ermakova, M., Bozhko, L., Dvoryankin, O., Shemshurina, S., & Kalyakina, I. (2020). Correlation of educational level, labor potential and digital economy development in Slovakian, Ukrainian and Russian experience. TEM Journal , 9(4), 1597. https://doi.org/10.18421/TEM94-35

Murray, J., & Vanassche, E. (2019). Research capacity building in and on teacher education: developing practice and learning. Nordisk Tidsskrift for Utdanning Og Praksis , 13(2), 114–129. https://doi.org/10.23865/up.v13.1975

Narikbaeva, L. M., & Savenkov, A. I. (2016). Pedagogical system of students vocational ability development. International Journal of Environmental and Science Education , 11(9), 3013–3024. https://doi.org/10.12973/ijese.2016.732a

Naveed, Q. N., & Ahmad, N. (2019). Critical success factors (CSFs) for cloud-based e-learning. International Journal of Emerging Technologies in Learning , 14(1), 140–149. https://doi.org/10.3991/ijet.v14i01.9170

Phung, T. L., Le, Q. T., & Bui, T. T. (2020). Improving teaching capacity for teachers of social sciences before requesting the innovation of the general education program. International Journal of Innovation, Creativity and Change , 11(3), 527–541

Pirozhkova, I. (2021). Higher education for sustainable development: Research-based learning (the case of the Ural State University of Economics). In E3S Web of Conferences (Vol. 296, p. 08028). EDP Sciences. https://doi.org/10.1051/e3sconf/202129608028

Priatna, T., Maylawati, D. S., Sugilar, H., & Ramdhani, M. A. (2020). Key success factors of e-learning implementation in higher education. International Journal of Emerging Technologies in Learning , 15(17), 101–114. https://doi.org/10.3991/ijet.v15i17.14293

Robinson, H., Kilgore, W., & Bozkurt, A. (2020). Learning communities: Theory and practice of leveraging social media for learning. In G. Durak & S. Çankaya (Eds.), Managing and Designing Online Courses in Ubiquitous Learning Environments (pp. 72–91). IGI Global. https://doi.org/10.4018/978-1-5225-9779-7.ch004

Sailin, S. N., & Mahmor, N. A. (2017). Create-share-collaborate: An instructional strategy for developing student teacher’s critical thinking. In 1st Inspirational Scholar Symposium Proceedings (pp. 66-81). Universiti Utara Malaysia

Sailin, S., & Mahmor, N. (2018). Improving student teachers’ digital pedagogy through meaningful learning activities. Malaysian Journal of Learning and Instruction , 15(2), 143–173. https://doi.org/10.32890/mjli2018.15.2.6

Selviandro, N., & Hasibuan, Z. A. (2013). Cloud-based e-learning: A proposed model and benefits by using e-learning based on cloud computing for educational institution. In Information and Communication Technology-EurAsia Conference (pp. 192–201). Springer. https://doi.org/10.1007/978-3-642-36818-9_20

Shehzad, U., Fareed, Z., Zulfiqar, B., Shahzad, F., & Latif, H. S. (2014). The impact of intellectual capital on the performance of universities. European Journal of Contemporary Education , 10, 273–280. https://doi.org/10.13187/ejced.2014.10.273

Sivalingam, D. R., Balachandar, R., & Ajith, P. (2018). E-Learning approach in teacher education. Journal of Applied and Advanced Research , 3(1), 14–16. https://doi.org/10.21839/jaar.2018.v3is1.159

Susiani, T., Salimi, M., & Hidayah, R. (2017). Research Based Learning (RBL): How to improve critical thinking skills? SHS Web of Conferences , 42 , 00042. https://doi.org/10.1051/shsconf/20184200042

Survio online platform (2021). Official web site. https://www.survio.com/

Tamban, V. E., & Maningas, O. B. (2020). Research capability of public school teachers: A basis for research capability enhancement program. PEOPLE: International Journal of Social Sciences , 6(1), 222–235. https://doi.org/10.20319/pijss.2020.61.222235

Toktarova, V., & Semenova, D. (2020). Digital pedagogy: analysis, requirements and experience of implementation. Journal of Physics: Conference Series , 1691 , 012112. https://doi.org/10.1088/1742-6596/1691/1/012112

Toom, A., Krokfors, L., Kynäslahti, H., Stenberg, K., Maaranen, K., Jyrhämä, R. … Kansanen, P. (2008). Exploring the essential characteristics of research-based teacher education from the viewpoint of teacher educators. In B. Åstrand, E. Eisenschmidt, B. Hudson, M. Lampere, & P. Zgaga (Eds.), Proceedings of Second Annual Teacher Education Policy in Europe Network (TEPE) Conference: Mapping the Landscape and Looking to the Future (pp. 166–179)

Valverde-Berrocoso, J., Garrido-Arroyo, M. D. C., Burgos-Videla, C., & Morales-Cevallos, M. B. (2020). Trends in educational research about e-learning: A systematic literature review (2009–2018). Sustainability , 12(12), 5153. https://doi.org/10.3390/su12125153

Velichová, L., Orbánová, D., & Kúbeková, A. (2020). The COVID-19 pandemic: Unique opportunity to develop online learning. TEM Journal , 9(4), 1633–1639. https://doi.org/10.18421/TEM94-40

Yulhendri, Y., Syofyan, E., & Afridona, S. (2018). The development of research-based learning model and journal as for graduate students’ scientific publication of M.Pd.E on economic. International Journal of Scientific and Research Publications , 8(5), 500–505. https://doi.org/10.29322/IJSRP.8.5.2018.p7764

Zainuddin, S., Dewantara, D., Mahtari, S., Nur, M., Yuanita, L., & Sunarti, T. (2020). The correlation of scientific knowledge-science process skills and scientific creativity in creative responsibility based learning. International Journal of Instruction , 13(3), 307–316. https://doi.org/10.29333/iji.2020.13321a

Zidoun, Y., Dehbi, R., Talea, M., & Arroum, F. Z. A. (2019). Designing a theoretical integration framework for mobile learning. International Journal of Interactive Mobile Technologies , 13(12), 152–170. https://doi.org/10.3991/ijim.v13i12.10841


Acknowledgements

This research did not receive any specific grant from funding agencies in the public, commercial, or not-for-profit sectors.

Author information

Authors and affiliations

Department of Social Pedagogy and Self-knowledge, L.N. Gumilyov Eurasian National University, Nur-Sultan, Kazakhstan

Rysbek Maussumbayev & Rymshash Toleubekova

Department of Psychology and Special Education, Kh. Dosmukhamedov Atyrau University, Atyrau, Kazakhstan

Karas Kaziyev

Department of Computer Science and Information Technology, Aktobe Regional University named after K. Zhubanov, Aktobe, Kazakhstan

Axaule Baibaktina & Altynshash Bekbauova


Contributions

All authors contributed to the study conception and design. Material preparation, data collection and analysis were performed by Rysbek Maussumbayev, Rymshash Toleubekova, Karas Kaziyev. The first draft of the manuscript was written by Axaule Baibaktina, Altynshash Bekbauova and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Rysbek Maussumbayev .

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethics approval

The authors declare that the work is written with due consideration of ethical standards. The study was conducted in accordance with the ethical principles approved by the Human Experiments Ethics Committee of L.N. Gumilyov Eurasian National University (Protocol No 4 of 12.08.2020).

Consent to participate

All the participants gave their written informed consent to the participation in the experiment.

Consent for publication

All the participants gave their consent to the publication of the experiment results.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Maussumbayev, R., Toleubekova, R., Kaziyev, K. et al. Development of research capacity of a future social pedagogue in the face of digital technologies. Educ Inf Technol 27 , 6947–6966 (2022). https://doi.org/10.1007/s10639-022-10901-3


Received : 12 May 2021

Accepted : 17 January 2022

Published : 29 January 2022

Issue Date : June 2022

DOI : https://doi.org/10.1007/s10639-022-10901-3


Keywords

  • Digital learning platforms
  • Online course
  • Pedagogical excellence
  • Research capacity

Pasig Catholic College


Teachers’ Research Capability and Productivity: Basis in Developing a Capability Building Program

Antonio L. Cruz, Ed.D. La Consolacion University Philippines (LCUP)

The study rests on the premise that a high extent of research capability translates into high research output. Employing a descriptive evaluation research design, the researcher selected teachers from the 11 PaDSS member-schools randomly and proportionately. Using surveys and questionnaires, the study provided a profile of the schools’ current state of research. The results showed that only two schools have research offices and a formal research budget, which is used to send teachers to seminars and trainings. Only eight schools regularly review and evaluate their programs once a year. Only one school has an existing research manual to guide the implementation of its research policies and procedures, and only one regularly publishes a research journal. Only three institutions have active researchers on their roster of teachers, and only three have teachers who present research in a forum. Across the whole population of teachers in the PaDSS member-schools, it is important to note that fewer than 20 percent are active researchers. Their individual research capability has a “high extent” self-rating, but this does not translate into high research productivity. While there is unequivocal support for research from the school administrations, several issues still need to be addressed: lack of time, absence of research training, and the need for cash rewards or incentives. Ways must also be found to motivate teachers to do research, one of which is to allocate funding for research and research-related activities.

The results also showed that much remains to be desired in research capability, and more so in translating it into research productivity.


Ascendens Asia Journal of Multidisciplinary Research Conference Proceedings   |   ISSN: 2529-7902

Research Capability of College Education Students and Their Academic Achievement Towards the Development of a Research Framework

  • Ricardo C. Faldas, Ed.D., LPT Philippine Christian University
  • Anabelle C. Faldas, LPT Iloilo State University of Fisheries Science and Technology

This study assessed the research capability of College of Education students and determined its relationship with their academic achievement. The results are expected to inform the development of a research framework.

The study focused on 47 respondents drawn from the third-year and fourth-year students of the College of Education of the Iloilo University of Fisheries Science & Technology, San Matias, Dingle, Iloilo City.

The researcher used purposive sampling to determine the College of Education students who served as respondents.

The findings revealed that the extent of the research capabilities of College of Education students is "To a Moderate Extent," with an overall mean of 3.10 and a standard deviation of 0.674. Research Method had the lowest mean, 3.03, also interpreted as "To a Moderate Extent," while The Nature of Inquiry had the highest mean, 3.16, with the same interpretation. This implies that the participants placed more value and importance on the Nature of Inquiry than on the three other factors of research capability.
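As a brief illustration of how such Likert-scale means are conventionally mapped to verbal interpretations, the Python sketch below assumes the common equal-width five-band convention for a 1–5 scale. The study reports only the "To a Moderate Extent" band, so the other labels and cut-offs here are assumptions, not values from the study.

```python
# Hedged sketch: mapping a 5-point Likert mean to a verbal interpretation.
# The equal-width bands below are a common convention assumed here; the
# study itself reports only the "To a Moderate Extent" band.
BANDS = [
    (1.80, "To a Very Low Extent"),
    (2.60, "To a Low Extent"),
    (3.40, "To a Moderate Extent"),
    (4.20, "To a High Extent"),
    (5.00, "To a Very High Extent"),
]

def interpret(mean):
    """Return the verbal band for a mean on the 1-5 scale."""
    if not 1.0 <= mean <= 5.0:
        raise ValueError("mean must lie on the 1-5 scale")
    for upper_bound, label in BANDS:
        if mean <= upper_bound:
            return label

print(interpret(3.10))  # overall mean     -> "To a Moderate Extent"
print(interpret(3.03))  # Research Method  -> "To a Moderate Extent"
```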

It can be concluded that there is a significant difference in the research capabilities of College of Education students in terms of the Nature of Inquiry, Understanding of Literature and Studies, and Research Method; based on the study's decision matrix, the null hypothesis is therefore rejected for these factors. On the other hand, there is no significant difference in terms of Interpreting Results, and no significant relationship between the research capabilities and academic achievement of College of Education students; for these, the null hypothesis is accepted.

Given these findings, the researcher highly recommends a research framework to serve as a benchmark in utilizing research methods, which could help future researchers identify research areas that need strengthening.


UND Capabilities

The NSI builds upon previous work to position UND as a leader in relevant research and education, including the following entities/units:

RIAS (Research Institute for Autonomous Systems)

A global leader in unmanned and autonomous systems research, application and policy development, we provide solutions to present and future challenges.


College of Engineering & Mines

Here, at UND Engineering, we are producing world-class leaders in Computer Science, Engineering, and Geology who contribute to our state, nation, and the world.


John D. Odegard School of Aerospace Sciences

The second-largest degree-granting college at UND, the John D. Odegard School of Aerospace Sciences operates one of the largest fleets of civilian aircraft in North America.


Grand Challenges

UND is pioneering discoveries in unmanned flight, harnessing the power of big data and changing the game in energy exploration and environmental sustainability. We’re fighting the nation’s raging opioid epidemic and waging war against deadly cancers and debilitating deep-brain diseases such as Parkinson’s and Alzheimer's.

Center for Cyber Security Research

The center within the College of Engineering & Mines brings together faculty, researchers, and students working in the field of cybersecurity across departments on campus.

With emerging opportunities, the hope is that researchers and students from across campus can join, and be supported on, projects that address both known and yet-unknown problems with multidisciplinary solutions.




Market Research as a Service

A value-added service at no additional cost.

GSA's Market Research As a Service is just a click away.

MRAS delivers meaningful market data to federal, state, and local agencies. Through FAR Part 10 compliant requests for information, sources sought, industry days, and advanced product research, MRAS collects data to help customers understand where their need fits within the GSA government-wide marketplace.

Watch a video to learn more about Market Research As a Service.

Explore our offerings

MRAS provides automated RFIs and Sources Sought for services and advanced GSA Advantage product searches.

  • Request for Information (RFI) with Market Research Report  — Request an RFI to understand GSA contracts and industry capabilities with a market report.
  • Product Research Request  — Search up to 20,000 products on GSA Advantage and receive a market report.
  • MRAS Report Archives - COMING SOON! Search thousands of previous market research reports to research your requirement (OMB Max Login required).

Request for information with market research report

Develop customized RFIs that strategically target GSA contract holders on GSA eBuy, resulting in a robust market research analysis report showing all responses received, in as little as 1-2 weeks, depending on the requirement. Within 24 hours of receiving your request, we work directly with agency POCs to review a draft RFI developed from information provided, and work side-by-side with customers to further refine the RFI until it is ready to be posted on GSA eBuy.

After the RFI closes, a Market Research Report is designed specifically to see how the GSA market can meet technical requirements and mission needs provided. Market Research Reports are designed to facilitate completing Small Business Office review forms, developing Acquisition Plans, and assisting in formulating an overall acquisition strategy.

Begin your RFI or product market research request.

Workflow: understand the requirements, organize the research questions, develop the RFI, engage industry, and deliver the report.

What to expect in the MRAS process

  • Customer submits a request for an RFI.
  • Collaborate with us and develop RFI.
  • Customer approves and engages industry.
  • RFI closes and you get a market report.
  • Consult with us.

The total estimated time is one day to two weeks, depending on the requirement.

MRAS provides each customer with a unique, comprehensive, and easy-to-understand report. The report analyzes industry partner data related to the customer's needs, giving an understanding of the results to be expected under the markets researched. The Market Research Report includes socio-economic, technical, capability, and comprehensive business information so agencies can complete their Acquisition Planning and Small Business strategies, requirements, and estimate documents.

Training and events

MRAS customer training on effective market research (for federal, military, state, and local)

Attention all public sector employees in the contracting and acquisition career field! In GSA's Market Research As a Service Customer Training: Effective Market Research (Fed/Military), you'll learn the importance of market research, how and when to conduct it, and how to get the best results by making your data collection methods more efficient.

You'll also delve into the regulatory nature of FAR Part 10 and how it ties into other decisions such as acquisition planning, small business set-asides, commercial items, and contract type selection.

This course includes real-life scenarios, expert instructors, and tools from GSA's market research as a service. Take this opportunity to enhance your market research skills and advance your career. This is a one-CLP credit course.

Register Today

MRAS industry training

As an industry partner, you can provide valuable market research data to inform the purchasing decisions of GSA customers. Participating in our research allows you to showcase your products and services to potential buyers and play a key role in helping government agencies make informed and efficient purchasing decisions.

But why choose MRAS over traditional market research methods? Our research is 100 times easier and more efficient, making it a great business development activity for your company. Plus, by participating in MRAS, you can reach a wider audience of government buyers and increase your visibility in the market.

We offer monthly industry training seminars to help you get started and make the most of your participation in MRAS. CLPs are not issued for these webinars.

MRAS in the news

  • Listen to the FAS Focus: Deep Dive podcast episode as we discuss GSA’s Market Research As a Service with Kevan Kivlan, Director, Customer & Stakeholder Engagements New England & FAS MRAS Program Office: https://twitter.com/i/status/150-165-6591321518081
  • January 2022: MRAS Pushes Billions in Business to the GSA Schedule
  • June 2021: MRAS Webinars Show Industry How To Work Smarter Not Harder

What our customers say

“Utilizing this resource is a great way to leverage the hard work put into establishing these contracts and provide a great benefit to the Government as a whole by simplifying the already complex process of federal acquisition. The benefit of conducting this market research gives a PCO the ability to see if there's a more expedient or cost-effective way to support their mission partner's needs. Often contracting is known as a slow and drawn-out process, and while it can be, this free tool is fast and gives a great snapshot as to what alternatives are available. MRAS is a great tool in a PCO's toolkit to have and one that should not be overlooked.”

— Capt. Jeremy A. Deorsey, Massachusetts Army National Guard

MRAS background

What MRAS is: MRAS helps customers by collecting and providing industry partner information specific to Agency needs. MRAS conducts Requests for Information, Sources Sought, and advanced product searches to help Agencies compile business, socio-economic, and technical data. Additionally, MRAS provides feedback from industry about Agency requirements documents, industry trends, best practices, and the best GSA contract and NAICS codes for their acquisition.

Cost: MRAS is a market research service GSA provides to all federal, state, and local agencies at no cost.

How to submit your market research request: Please reach out to your agency POC  so they can walk you through the MRAS process and determine the best MRAS service for your requirement. You can then fill out the MRAS Service Request Form .

For industry partners that need help with a survey

Please fill out the Industry Help Request Form .

Still have questions?

Contact your agency POC or email the MRAS team at [email protected] .


New DOD Advisory Group to Oversee Capability Delivery Processes; Cyrus Jabbari Quoted

The Department of Defense has established a new advisory group to supervise the delivery of capabilities to the field.

Stood up by Under Secretary of Defense for Research and Engineering Heidi Shyu, a 2024 Wash100 Award winner, the Transition Tracking Action Group will use data analytics to evaluate and enhance DOD business processes, the department said Wednesday.

DOD R&E Chief Data Officer Cyrus Jabbari, who will serve as chair of the unit, noted that limited visibility "impedes efficiency and delivery of the best capabilities for troops and the Joint Force."

"Our goal is to help the Department of Defense connect data systems to create a common operating picture, which will help deliver the right capability to the Joint Force," Jabbari said.

The TTAG will use data analytics tools infused with artificial intelligence and machine learning to carry out its function. By harnessing these technologies, the group will get a detailed overview of DOD business practices and identify areas in which modifications are necessary.

More specifically, TTAG will determine policy updates and data source revisions for tracking technology transitions; conduct research and analysis to map data relationships across the R&E lifecycle; and deal with priority tracking issues as mandated by members or higher authority.

Three major objectives — identifying opportunities and barriers in tracking technology transitions, employing data to monitor these transitions and developing requisite processes and policies — will guide TTAG’s work.

“This initiative will not only facilitate the delivery of the right capabilities to the Joint Force but also ensure efficient stewardship of taxpayer funds,” Jabbari added.

  • Cyrus Jabbari
  • Department of Defense
  • Transition Tracking Action Group

